
[llm unify 5c/n] jinjify extract properties #1169

Merged: 21 commits into main from hml-jinjaprompting on Feb 10, 2025

Conversation

@HenryL27 (Collaborator) opened this pull request. No description provided.

Signed-off-by: Henry Lindeman <[email protected]>

@dhruvkaliraman7 (Contributor) left a comment:

few nits, but lgtm!

)

PropertiesFromSchemaJinjaPrompt = JinjaPrompt(
    system="You are given text contents from a document.",

@dhruvkaliraman7 (Contributor):

Is this the correct system prompt?

@HenryL27 (Collaborator, author):

afaict, yep.
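
For readers outside the thread, the construct under discussion pairs a fixed system string with a Jinja-templated user message. Below is a minimal stand-in sketch of that pattern; TinyJinjaPrompt, its user field, and render() are illustrative assumptions, not the JinjaPrompt API added in this PR.

# Illustrative sketch only: a stand-in for the Jinja-prompt pattern discussed
# above. The real JinjaPrompt in this PR has a different interface; the
# `user` field and `render()` helper here are assumptions for the example.
from jinja2 import Environment

class TinyJinjaPrompt:
    def __init__(self, system: str, user: str):
        self.system = system
        self.user_template = Environment().from_string(user)

    def render(self, **kwargs) -> list[dict]:
        # The system message stays fixed; the user message is rendered from
        # the Jinja template with whatever variables the caller supplies.
        return [
            {"role": "system", "content": self.system},
            {"role": "user", "content": self.user_template.render(**kwargs)},
        ]

prompt = TinyJinjaPrompt(
    system="You are given text contents from a document.",
    user="Extract properties matching this schema: {{ schema }}\n{{ text }}",
)
messages = prompt.render(schema='{"title": "string"}', text="Quarterly report ...")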

class ExtractPropertiesFromSchemaPrompt(SimplePrompt):

@@ -15,6 +15,7 @@ def _infer_prompts(
     if llm_mode == LLMMode.SYNC:
         res = []
         for p in prompts:
+            print(p)

@dhruvkaliraman7 (Contributor):

remove
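
(For context, a rough sketch of what the synchronous path above presumably does once the debug print is dropped: call the model once per prompt and collect responses in order. generate_fn and infer_prompts_sync are placeholder names, not the repository's actual code.)

# Hypothetical sketch of the synchronous inference path, with no stray debug
# print. Only the shape of the loop mirrors the hunk above.
from typing import Callable, Sequence

def infer_prompts_sync(prompts: Sequence[str], generate_fn: Callable[[str], str]) -> list[str]:
    res = []
    for p in prompts:
        res.append(generate_fn(p))
    return res

# Example: responses = infer_prompts_sync(["prompt one", "prompt two"], my_llm_call)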

        if self._schema_name is not None:
            prompt = prompt.set(entity=self._schema_name)
        prompt = prompt.set(num_elements=self._num_of_elements)
        if self._prompt_formatter is not element_list_formatter:
            prompt = prompt.set(prompt_formatter=self._prompt_formatter)

@dhruvkaliraman7 (Contributor):

Not setting for element_list_formatter because you're letting jinja fragments do the work? I'm assuming you would just remove element_list_formatter once you refactor LLM_schema_extractor?

@HenryL27 (Collaborator, author):

yep. will aim to drop it when it's no longer used
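
To spell out the pattern the thread is describing: the formatter is only forwarded when a caller overrides the default, because the Jinja template already lays out the element list on its own. The sketch below is an assumption-laden illustration; default_formatter, PromptLike, and bind_extraction_params are made-up names, not the repository's API.

# Minimal sketch of the "only override when non-default" idea from the hunk
# above. All names here are illustrative; only the shape of the logic mirrors
# the diff.
def default_formatter(elements: list[str]) -> str:
    return "\n".join(elements)

class PromptLike:
    def __init__(self, **params):
        self.params = params

    def set(self, **kwargs) -> "PromptLike":
        # Return a new prompt with extra template variables bound, leaving the
        # original untouched (mirroring the immutable .set() in the diff).
        return PromptLike(**{**self.params, **kwargs})

def bind_extraction_params(prompt, schema_name, num_elements, formatter=default_formatter):
    if schema_name is not None:
        prompt = prompt.set(entity=schema_name)
    prompt = prompt.set(num_elements=num_elements)
    if formatter is not default_formatter:
        # Only a user-supplied formatter is passed through; the Jinja template
        # handles the default element layout itself.
        prompt = prompt.set(prompt_formatter=formatter)
    return prompt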

Signed-off-by: Henry Lindeman <[email protected]>
@HenryL27 HenryL27 enabled auto-merge (squash) February 10, 2025 23:37
@HenryL27 HenryL27 merged commit aa0d7a3 into main Feb 10, 2025
11 of 15 checks passed
@HenryL27 HenryL27 deleted the hml-jinjaprompting branch February 10, 2025 23:51