Merged
Changes from 1 commit
Commits
29 commits
f49e442
feat: update OpenAI model parameters handling for reasoning models
ogabrielluiz Jun 9, 2025
914fc91
feat: extend input_value type in LCModelComponent to support AsyncIte…
ogabrielluiz Jun 9, 2025
3d7e76c
refactor: remove assert_streaming_sequence method and related checks …
ogabrielluiz Jun 9, 2025
9dd9205
feat: add consume_iterator method to Message class for handling itera…
ogabrielluiz Jun 9, 2025
0623d6b
test: add unit tests for OpenAIModelComponent functionality and integ…
ogabrielluiz Jun 9, 2025
996a3bf
feat: update OpenAIModelComponent to include temperature and seed par…
ogabrielluiz Jun 9, 2025
600ece6
feat: rename consume_iterator method to consume_iterator_in_text and …
ogabrielluiz Jun 9, 2025
b029115
feat: add is_connected_to_chat_output method to Component class for i…
ogabrielluiz Jun 11, 2025
d576d96
feat: refactor LCModelComponent methods to support asynchronous messa…
ogabrielluiz Jun 11, 2025
abc3891
refactor: remove consume_iterator_in_text method from Message class a…
ogabrielluiz Jun 11, 2025
aa24bee
fix: update import paths for input components in multiple starter pro…
ogabrielluiz Jun 11, 2025
452dcb2
fix: enhance error message formatting in ErrorMessage class to handle…
ogabrielluiz Jun 11, 2025
1398034
refactor: remove validate_stream calls from generate_flow_events and …
ogabrielluiz Jun 11, 2025
987f912
fix: handle asyncio.CancelledError in aadd_messagetables to ensure pr…
ogabrielluiz Jun 11, 2025
929fbc2
refactor: streamline message handling in LCModelComponent by replacin…
ogabrielluiz Jun 11, 2025
3714612
refactor: enhance message handling in LCModelComponent by introducing…
ogabrielluiz Jun 12, 2025
cb9ba56
feat: add _build_source method to Component class for enhanced source…
ogabrielluiz Jun 12, 2025
037e553
feat: enhance LCModelComponent by adding _handle_stream method for im…
ogabrielluiz Jun 12, 2025
614fac4
feat: update MemoryComponent to enhance message retrieval and storage…
ogabrielluiz Jun 12, 2025
a1d8421
Merge branch 'main' into improve-streaming
ogabrielluiz Jun 12, 2025
d031c91
test: refactor LanguageModelComponent tests to use ComponentTestBaseW…
ogabrielluiz Jun 12, 2025
1547e9f
test: add fixtures for API keys and implement live API tests for Open…
ogabrielluiz Jun 12, 2025
fdb1ad5
Merge branch 'main' into improve-streaming
ogabrielluiz Jun 23, 2025
0aa9519
fix: reorder JSON properties for consistency in starter projects
ogabrielluiz Jun 23, 2025
663e47e
refactor: simplify input_value type in LCModelComponent
ogabrielluiz Jun 25, 2025
1fd3656
fix: clarify comment for handling source in Component class
ogabrielluiz Jun 25, 2025
2755422
Merge branch 'main' into improve-streaming
ogabrielluiz Jun 25, 2025
c1b8d5f
refactor: remove unnecessary mocking in OpenAI model integration tests
ogabrielluiz Jun 25, 2025
dafb3c0
Merge branch 'main' into improve-streaming
ogabrielluiz Jun 25, 2025
fix: update import paths for input components in multiple starter project JSON files
ogabrielluiz committed Jun 12, 2025
commit aa24bee5fc157bf23e3ad499883921cbf38dcd53
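This commit touches many starter-project JSON files whose embedded component source still imported input classes from the package root (`from langflow.inputs import ...`) rather than the concrete module (`from langflow.inputs.inputs import ...`). A change like this is typically applied in bulk; the following is a minimal sketch of such a rewrite, not the actual script used for this commit — the names `OLD`, `NEW`, `rewrite`, and `rewrite_file` are hypothetical, and it assumes the old import string appears verbatim inside the JSON string values:

```python
import json
from pathlib import Path

# Hypothetical constants: the import prefix being replaced and its replacement.
OLD = "from langflow.inputs import "
NEW = "from langflow.inputs.inputs import "


def rewrite(value):
    """Recursively replace the old import path in every string of a JSON tree."""
    if isinstance(value, str):
        return value.replace(OLD, NEW)
    if isinstance(value, list):
        return [rewrite(v) for v in value]
    if isinstance(value, dict):
        return {k: rewrite(v) for k, v in value.items()}
    return value


def rewrite_file(path: Path) -> None:
    """Load one starter-project JSON file, rewrite its strings, and save it back."""
    data = json.loads(path.read_text())
    path.write_text(json.dumps(rewrite(data), indent=2))
```

Recursing over the whole JSON tree (rather than targeting one known key) covers every component whose `"value"` field embeds Python source, which is why a single pass can update all starter projects consistently.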

@@ -981,7 +981,7 @@
"show": true,
"title_case": false,
"type": "code",
"value": "from typing import Any\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import (\n OPENAI_MODEL_NAMES,\n OPENAI_REASONING_MODEL_NAMES,\n)\nfrom langflow.field_typing import LanguageModel\nfrom langflow.field_typing.range_spec import RangeSpec\nfrom langflow.inputs import BoolInput, DictInput, DropdownInput, IntInput, SecretStrInput, SliderInput, StrInput\nfrom langflow.logging import logger\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n name = \"OpenAIModel\"\n\n inputs = [\n *LCModelComponent._base_inputs,\n IntInput(\n name=\"max_tokens\",\n display_name=\"Max Tokens\",\n advanced=True,\n info=\"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n range_spec=RangeSpec(min=0, max=128000),\n ),\n DictInput(\n name=\"model_kwargs\",\n display_name=\"Model Kwargs\",\n advanced=True,\n info=\"Additional keyword arguments to pass to the model.\",\n ),\n BoolInput(\n name=\"json_mode\",\n display_name=\"JSON Mode\",\n advanced=True,\n info=\"If True, it will output JSON regardless of passing a schema.\",\n ),\n DropdownInput(\n name=\"model_name\",\n display_name=\"Model Name\",\n advanced=False,\n options=OPENAI_MODEL_NAMES + OPENAI_REASONING_MODEL_NAMES,\n value=OPENAI_MODEL_NAMES[1],\n combobox=True,\n real_time_refresh=True,\n ),\n StrInput(\n name=\"openai_api_base\",\n display_name=\"OpenAI API Base\",\n advanced=True,\n info=\"The base URL of the OpenAI API. \"\n \"Defaults to https://api.openai.com/v1. \"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\",\n ),\n SecretStrInput(\n name=\"api_key\",\n display_name=\"OpenAI API Key\",\n info=\"The OpenAI API Key to use for the OpenAI model.\",\n advanced=False,\n value=\"OPENAI_API_KEY\",\n required=True,\n ),\n SliderInput(\n name=\"temperature\",\n display_name=\"Temperature\",\n value=0.1,\n range_spec=RangeSpec(min=0, max=1, step=0.01),\n show=True,\n ),\n IntInput(\n name=\"seed\",\n display_name=\"Seed\",\n info=\"The seed controls the reproducibility of the job.\",\n advanced=True,\n value=1,\n ),\n IntInput(\n name=\"max_retries\",\n display_name=\"Max Retries\",\n info=\"The maximum number of retries to make when generating.\",\n advanced=True,\n value=5,\n ),\n IntInput(\n name=\"timeout\",\n display_name=\"Timeout\",\n info=\"The timeout for requests to OpenAI completion API.\",\n advanced=True,\n value=700,\n ),\n ]\n\n def build_model(self) -> LanguageModel: # type: ignore[type-var]\n parameters = {\n \"api_key\": SecretStr(self.api_key).get_secret_value() if self.api_key else None,\n \"model_name\": self.model_name,\n \"max_tokens\": self.max_tokens or None,\n \"model_kwargs\": self.model_kwargs or {},\n \"base_url\": self.openai_api_base or \"https://api.openai.com/v1\",\n \"max_retries\": self.max_retries,\n \"timeout\": self.timeout,\n }\n\n logger.info(f\"Model name: {self.model_name}\")\n if self.model_name not in OPENAI_REASONING_MODEL_NAMES:\n parameters[\"temperature\"] = self.temperature if self.temperature is not None else 0.1\n parameters[\"seed\"] = self.seed\n\n output = ChatOpenAI(**parameters)\n if self.json_mode:\n output = output.bind(response_format={\"type\": \"json_object\"})\n\n return output\n\n def _get_exception_message(self, e: Exception):\n \"\"\"Get a message from an OpenAI exception.\n\n Args:\n e (Exception): The exception to get the message from.\n\n Returns:\n str: The message from the exception.\n \"\"\"\n try:\n from openai import BadRequestError\n except ImportError:\n return None\n if isinstance(e, BadRequestError):\n message = e.body.get(\"message\")\n if message:\n return message\n return None\n\n def update_build_config(self, build_config: dict, field_value: Any, field_name: str | None = None) -> dict:\n if field_name in {\"base_url\", \"model_name\", \"api_key\"} and field_value in OPENAI_REASONING_MODEL_NAMES:\n build_config[\"temperature\"][\"show\"] = False\n build_config[\"seed\"][\"show\"] = False\n if field_name in {\"base_url\", \"model_name\", \"api_key\"} and field_value in OPENAI_MODEL_NAMES:\n build_config[\"temperature\"][\"show\"] = True\n build_config[\"seed\"][\"show\"] = True\n return build_config\n"
"value": "from typing import Any\n\nfrom langchain_openai import ChatOpenAI\nfrom pydantic.v1 import SecretStr\n\nfrom langflow.base.models.model import LCModelComponent\nfrom langflow.base.models.openai_constants import (\n OPENAI_MODEL_NAMES,\n OPENAI_REASONING_MODEL_NAMES,\n)\nfrom langflow.field_typing import LanguageModel\nfrom langflow.field_typing.range_spec import RangeSpec\nfrom langflow.inputs.inputs import BoolInput, DictInput, DropdownInput, IntInput, SecretStrInput, SliderInput, StrInput\nfrom langflow.logging import logger\n\n\nclass OpenAIModelComponent(LCModelComponent):\n display_name = \"OpenAI\"\n description = \"Generates text using OpenAI LLMs.\"\n icon = \"OpenAI\"\n name = \"OpenAIModel\"\n\n inputs = [\n *LCModelComponent._base_inputs,\n IntInput(\n name=\"max_tokens\",\n display_name=\"Max Tokens\",\n advanced=True,\n info=\"The maximum number of tokens to generate. Set to 0 for unlimited tokens.\",\n range_spec=RangeSpec(min=0, max=128000),\n ),\n DictInput(\n name=\"model_kwargs\",\n display_name=\"Model Kwargs\",\n advanced=True,\n info=\"Additional keyword arguments to pass to the model.\",\n ),\n BoolInput(\n name=\"json_mode\",\n display_name=\"JSON Mode\",\n advanced=True,\n info=\"If True, it will output JSON regardless of passing a schema.\",\n ),\n DropdownInput(\n name=\"model_name\",\n display_name=\"Model Name\",\n advanced=False,\n options=OPENAI_MODEL_NAMES + OPENAI_REASONING_MODEL_NAMES,\n value=OPENAI_MODEL_NAMES[1],\n combobox=True,\n real_time_refresh=True,\n ),\n StrInput(\n name=\"openai_api_base\",\n display_name=\"OpenAI API Base\",\n advanced=True,\n info=\"The base URL of the OpenAI API. \"\n \"Defaults to https://api.openai.com/v1. \"\n \"You can change this to use other APIs like JinaChat, LocalAI and Prem.\",\n ),\n SecretStrInput(\n name=\"api_key\",\n display_name=\"OpenAI API Key\",\n info=\"The OpenAI API Key to use for the OpenAI model.\",\n advanced=False,\n value=\"OPENAI_API_KEY\",\n required=True,\n ),\n SliderInput(\n name=\"temperature\",\n display_name=\"Temperature\",\n value=0.1,\n range_spec=RangeSpec(min=0, max=1, step=0.01),\n show=True,\n ),\n IntInput(\n name=\"seed\",\n display_name=\"Seed\",\n info=\"The seed controls the reproducibility of the job.\",\n advanced=True,\n value=1,\n ),\n IntInput(\n name=\"max_retries\",\n display_name=\"Max Retries\",\n info=\"The maximum number of retries to make when generating.\",\n advanced=True,\n value=5,\n ),\n IntInput(\n name=\"timeout\",\n display_name=\"Timeout\",\n info=\"The timeout for requests to OpenAI completion API.\",\n advanced=True,\n value=700,\n ),\n ]\n\n def build_model(self) -> LanguageModel: # type: ignore[type-var]\n parameters = {\n \"api_key\": SecretStr(self.api_key).get_secret_value() if self.api_key else None,\n \"model_name\": self.model_name,\n \"max_tokens\": self.max_tokens or None,\n \"model_kwargs\": self.model_kwargs or {},\n \"base_url\": self.openai_api_base or \"https://api.openai.com/v1\",\n \"max_retries\": self.max_retries,\n \"timeout\": self.timeout,\n }\n\n logger.info(f\"Model name: {self.model_name}\")\n if self.model_name not in OPENAI_REASONING_MODEL_NAMES:\n parameters[\"temperature\"] = self.temperature if self.temperature is not None else 0.1\n parameters[\"seed\"] = self.seed\n\n output = ChatOpenAI(**parameters)\n if self.json_mode:\n output = output.bind(response_format={\"type\": \"json_object\"})\n\n return output\n\n def _get_exception_message(self, e: Exception):\n \"\"\"Get a message from an OpenAI exception.\n\n Args:\n e (Exception): The exception to get the message from.\n\n Returns:\n str: The message from the exception.\n \"\"\"\n try:\n from openai import BadRequestError\n except ImportError:\n return None\n if isinstance(e, BadRequestError):\n message = e.body.get(\"message\")\n if message:\n return message\n return None\n\n def update_build_config(self, build_config: dict, field_value: Any, field_name: str | None = None) -> dict:\n if field_name in {\"base_url\", \"model_name\", \"api_key\"} and field_value in OPENAI_REASONING_MODEL_NAMES:\n build_config[\"temperature\"][\"show\"] = False\n build_config[\"seed\"][\"show\"] = False\n if field_name in {\"base_url\", \"model_name\", \"api_key\"} and field_value in OPENAI_MODEL_NAMES:\n build_config[\"temperature\"][\"show\"] = True\n build_config[\"seed\"][\"show\"] = True\n return build_config\n"
},
"input_value": {
"_input_type": "MessageInput",