LaoZhang API fully supports OpenAI's Responses API, the next-generation agent-building interface introduced in March 2025. The Responses API combines the simplicity of Chat Completions with the built-in tool usage and state management of the Assistants API, giving developers a more flexible and powerful way to build AI applications.
Next-Gen API: The Responses API is a superset of Chat Completions: it provides every Chat Completions feature plus advanced capabilities such as built-in tools and state management. However, it only supports select newer OpenAI models; see the details below.
curl https://api.laozhang.ai/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gpt-4.1",
    "input": "Hello! How can you help me today?",
    "instructions": "You are a helpful assistant."
  }'
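The same request in Python, shown as a minimal sketch assuming the official openai SDK with its base_url pointed at the LaoZhang endpoint; the later examples assume a client constructed like this:

from openai import OpenAI

# Assumption: the standard openai Python SDK works against the LaoZhang endpoint
# because it is OpenAI-compatible; replace YOUR_API_KEY with your own key.
client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api.laozhang.ai/v1",
)

response = client.responses.create(
    model="gpt-4.1",
    input="Hello! How can you help me today?",
    instructions="You are a helpful assistant.",
)

# output_text is the SDK's convenience accessor for the concatenated text output
print(response.output_text)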
response = client.responses.create( model="gpt-4.1", input="Create a chart showing sales data: Jan:100, Feb:150, Mar:120", instructions="You are a data analyst. Use code interpreter to create visualizations.", tools=[{"type": "code_interpreter"}])
response = client.responses.create( model="gpt-4.1", input="Search for information about quarterly reports", instructions="You are a document analyst.", tools=[{"type": "file_search"}])
# First conversation round
response1 = client.responses.create(
    model="gpt-4.1",
    input="My name is Alice. Please remember this.",
    instructions="You are a helpful assistant with good memory.",
)

# Second round - use previous_response_id to maintain context
response2 = client.responses.create(
    model="gpt-4.1",
    input="What's my name?",
    instructions="You are a helpful assistant with good memory.",
    previous_response_id=response1.id,
)

print(response2.output[0].content[0].text)  # Should answer "Alice"
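Because responses are stored server-side by default, an earlier turn can also be fetched again later by ID; a small sketch assuming the SDK's responses.retrieve method, with store=False shown as the opt-out:

# Retrieve a previously stored response by ID (responses are stored by default)
earlier = client.responses.retrieve(response1.id)
print(earlier.output_text)

# Opt out of server-side storage for a single call
ephemeral = client.responses.create(
    model="gpt-4.1",
    input="Do not store this turn.",
    store=False,
)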
def multi_turn_conversation():
    response_id = None
    for user_input in ["What's 2+2?", "Now multiply that by 3", "And divide by 2"]:
        response = client.responses.create(
            model="o3",
            input=user_input,
            instructions="You are a math tutor. Show your reasoning.",
            previous_response_id=response_id,
            tools=[{"type": "code_interpreter", "container": {"type": "auto"}}],
        )
        print(f"User: {user_input}")
        # output_text skips reasoning items and returns the assistant's text
        print(f"Assistant: {response.output_text}")
        response_id = response.id  # Maintain context
Reasoning models have particular advantages in the Responses API:
# Use o3 for complex reasoning
response = client.responses.create(
    model="o3",
    input="Solve this step by step: If a train travels 120km in 2 hours, then speeds up 20% for the next hour, how far did it travel in total?",
    instructions="Think through this problem step by step, showing all reasoning.",
)

# View the reasoning process
reasoning_tokens = response.usage.output_tokens_details.reasoning_tokens
print(f"Reasoning tokens used: {reasoning_tokens}")

# Continue the conversation; reasoning context persists
follow_up = client.responses.create(
    model="o3",
    input="Now what if the train slowed down 10% in the fourth hour?",
    previous_response_id=response.id,
)
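The Responses API also exposes a reasoning parameter for o-series models. A hedged sketch of requesting higher reasoning effort plus a reasoning summary; the exact option values supported can vary by model:

# Ask for more thorough reasoning and a summary of it (o-series models only)
response = client.responses.create(
    model="o3",
    input="Prove that the sum of two even numbers is even.",
    reasoning={"effort": "high", "summary": "auto"},
)
print(f"Reasoning tokens: {response.usage.output_tokens_details.reasoning_tokens}")
print(response.output_text)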
response = client.responses.create( model="gpt-4.1", input="Get weather for New York and Los Angeles, then calculate travel time between them", instructions="You are a travel assistant.", parallel_tool_calls=True, tools=[ {"type": "function", "function": {"name": "get_weather", ...}}, {"type": "function", "function": {"name": "calculate_distance", ...}} ])
{ "error": { "type": "invalid_request_error", "code": "model_not_supported", "message": "The model 'gpt-3.5-turbo' is not supported for the responses endpoint.", "param": "model" }}
def smart_tool_calling(user_input):
    # Intelligently select tools based on input
    available_tools = []
    if "weather" in user_input.lower():
        available_tools.append(weather_tool)
    if "calculate" in user_input.lower():
        available_tools.append(calculator_tool)
    if "search" in user_input.lower():
        available_tools.append(search_tool)

    response = client.responses.create(
        model="gpt-4.1",
        input=user_input,
        instructions="Use the appropriate tools to help the user.",
        tools=available_tools,
        tool_choice="auto",
    )
    return response
Development Recommendation: New projects should use the Responses API directly; existing projects can migrate gradually. LaoZhang API will continue to track OpenAI's updates to ensure feature completeness.
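For teams migrating, the mapping from Chat Completions to Responses is mostly mechanical; a hedged side-by-side sketch:

# Before: Chat Completions
chat = client.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize our Q1 results."},
    ],
)
print(chat.choices[0].message.content)

# After: Responses API - the system prompt moves to instructions, the user message becomes input
response = client.responses.create(
    model="gpt-4.1",
    instructions="You are a helpful assistant.",
    input="Summarize our Q1 results.",
)
print(response.output_text)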