
Browser-use: Connecting AI Agents with the Browser

🌐 Browser-use is the easiest way to connect your AI agents with the browser. Install it with pip (Python >= 3.11):

pip install browser-use

Install Playwright:

playwright install chromium
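
The examples below talk to a local Ollama server, so the model has to be available there. Pull it first (gemma3:4b is the model used throughout; substitute whichever model you run):

ollama pull gemma3:4b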

Getting Started

This project demonstrates a simple example of using browser-use to interact with an AI model.

Requirements

  • Python 3.11 or higher
  • browser-use
  • Playwright (Chromium)

Example Usage

The following code demonstrates a basic example:

browser_agent.py

from langchain_ollama import ChatOllama
from browser_use import Agent
import asyncio
from dotenv import load_dotenv
load_dotenv()

# Initialize the model
llm = ChatOllama(model="gemma3:4b", num_ctx=32000, base_url="http://localhost:11434")

async def main():
    # Create agent with the model
    agent = Agent(
        task="Compare the price of gpt-4o and DeepSeek-V3",
        llm=llm
    )
    await agent.run()

asyncio.run(main())

This script initializes a ChatOllama model and creates an Agent configured to compare the prices of gpt-4o and DeepSeek-V3. When run, the agent uses the model to decide which browser actions to take and carries out the comparison in a real browser session.
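
If you want the comparison as a value you can use in your own code rather than only in the agent's logs, agent.run() returns the agent's step history. The sketch below assumes the final_result() helper on that history object, which recent browser-use releases expose; the exact API may differ in your installed version.

from langchain_ollama import ChatOllama
from browser_use import Agent
import asyncio

llm = ChatOllama(model="gemma3:4b", num_ctx=32000, base_url="http://localhost:11434")

async def main():
    agent = Agent(
        task="Compare the price of gpt-4o and DeepSeek-V3",
        llm=llm,
    )
    # run() returns the agent's step history; final_result() (assumed to be
    # available in recent browser-use versions) returns the last extracted text
    history = await agent.run()
    print(history.final_result())

asyncio.run(main())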

langchain_ollama_sample.py

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="gemma3:4b",
    temperature=1,
    base_url="http://127.0.0.1:11434",
    # other args (e.g. num_ctx) can be passed here
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)
print(ai_msg.content)  # print just the translated text rather than the full message object

This script demonstrates a simple translation task using ChatOllama. It sets up a system message to define the assistant's role, provides a user sentence, and then invokes the model to translate the sentence. The translated message is then printed to the console.
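
For longer replies you may prefer to stream tokens as they are generated instead of waiting for the full response. The sketch below uses the standard LangChain stream() method, which ChatOllama inherits from the base chat model interface:

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="gemma3:4b",
    temperature=1,
    base_url="http://127.0.0.1:11434",
)

messages = [
    ("system", "You are a helpful assistant that translates English to French. Translate the user sentence."),
    ("human", "I love programming."),
]

# stream() yields message chunks; each chunk carries a partial piece of the
# reply in its .content attribute
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)
print()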
