Right now, you can theoretically use the `stagehand.llmClient` object to access the underlying LLM for Stagehand.
This can be helpful when you want to ask the LLM custom questions that guide your next step -- for example, in a Stagehand script you might want to ask the LLM what word to input next. This is possible right now, but it's really cumbersome:
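A minimal sketch of the cumbersome path today. The `createChatCompletion` method name, its options shape, and the response shape below are assumptions about the current `LLMClient` interface, made for illustration only; a stub client stands in for the real one so the sketch is self-contained:

```typescript
// Hypothetical shape of Stagehand's LLMClient -- the method name and
// request/response shapes here are assumptions, not the actual API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface LLMClient {
  createChatCompletion(options: {
    messages: ChatMessage[];
  }): Promise<{ choices: { message: ChatMessage }[] }>;
}

// Asking the LLM a one-off question today means hand-building the full
// chat-completion request and digging the answer back out of the response.
async function askNextWord(
  llmClient: LLMClient,
  context: string
): Promise<string> {
  const response = await llmClient.createChatCompletion({
    messages: [
      { role: "system", content: "Reply with a single word." },
      {
        role: "user",
        content: `What word should be typed next? Context: ${context}`,
      },
    ],
  });
  return response.choices[0].message.content.trim();
}

// Stub client so the sketch runs without a real model behind it.
const stubClient: LLMClient = {
  async createChatCompletion() {
    return {
      choices: [{ message: { role: "assistant" as const, content: "hello" } }],
    };
  },
};

askNextWord(stubClient, "a greeting field").then((word) => console.log(word));
```

All of that boilerplate exists just to get one string back out of the model.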
It would be awesome to have `stagehand.llmClient.generateText()` and `stagehand.llmClient.generateObject()`, like the Vercel AI SDK. Bonus points if you add `streamText` :)
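A sketch of what the proposed call shape could look like, modeled on the Vercel AI SDK's `generateText`. The wrapper below is the feature being requested, not an existing Stagehand API; it's built over a plain prompt-to-string function so the shape can be exercised standalone:

```typescript
// Proposed convenience surface (an assumption, mirroring the Vercel AI
// SDK): one call in, one awaited result out.
type TextResult = { text: string };

interface LLMClientWithHelpers {
  generateText(options: { prompt: string }): Promise<TextResult>;
}

// Minimal stand-in implementation over any prompt -> string completion
// function, just to demonstrate the ergonomics.
function withGenerateText(
  complete: (prompt: string) => Promise<string>
): LLMClientWithHelpers {
  return {
    async generateText({ prompt }) {
      return { text: await complete(prompt) };
    },
  };
}

// Echoing completer stands in for a real model.
const llmClient = withGenerateText(async (p) => `echo: ${p}`);

// No hand-built message arrays, no digging through choices[0].message.
llmClient
  .generateText({ prompt: "next word?" })
  .then(({ text }) => console.log(text));
```

Compared with the current approach, the request construction and response unwrapping collapse into a single awaited call.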
Yeah @kamath, I think we can wrap our new functionality around the Vercel AI SDK's `generateText()`, `generateObject()`, `streamText()`, and `streamObject()` methods. Can I work on this issue?