URL Query Parameters
MyLinks supports dynamic configuration of chat conversations through URL query parameters. This feature allows you to initiate conversations with specific settings, models, and endpoints directly from the URL.
Chat Paths
Query parameters must follow a valid chat path:
- For new conversations: /c/new
- For existing conversations: /c/[conversation-id] (where conversation-id is an existing one)
Examples:
https://your-domain.com/c/new?endpoint=ollama&model=llama3%3Alatest
https://your-domain.com/c/03debefd-6a50-438a-904d-1a806f82aad4?endpoint=openAI&model=o1-mini
Basic Usage
The most common parameters to use are endpoint and model. Using both is recommended for the most predictable behavior:
https://your-domain.com/c/new?endpoint=azureOpenAI&model=o1-mini
URL Encoding
Special characters in query params must be properly URL-encoded to work correctly. Common characters that need encoding:
- : → %3A
- / → %2F
- ? → %3F
- # → %23
- & → %26
- = → %3D
- + → %2B
- Space → %20 (or +)
Example with special characters:
Original: `Write a function: def hello()`
Encoded: `/c/new?prompt=Write%20a%20function%3A%20def%20hello()`
You can use JavaScript’s built-in encodeURIComponent() function to properly encode prompts:
const prompt = "Write a function: def hello()";
const encodedPrompt = encodeURIComponent(prompt);
const url = `/c/new?prompt=${encodedPrompt}`;
console.log(url);
Try running the code in your browser console to see the encoded URL (browser shortcut: Ctrl+Shift+I).
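When a link carries several parameters, the browser’s built-in URLSearchParams can do the encoding for you. The sketch below assumes the parameter names documented on this page; the `buildChatUrl` helper is purely illustrative, not part of MyLinks:

```javascript
// Illustrative helper (not part of MyLinks): build a chat URL with
// URLSearchParams, which percent-encodes each value automatically.
function buildChatUrl(params, conversationId = "new") {
  const query = new URLSearchParams(params).toString();
  return `/c/${conversationId}?${query}`;
}

const url = buildChatUrl({
  endpoint: "openAI",
  model: "gpt-4o",
  prompt: "Write a function: def hello()",
});
// Note: URLSearchParams serializes spaces as "+" and ":" as "%3A",
// both of which are valid in a query string.
```

This avoids calling encodeURIComponent once per value by hand, which is easy to forget for one parameter in a longer URL.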
Endpoint Selection
The endpoint parameter can be used alone:
https://your-domain.com/c/new?endpoint=google
When only endpoint is specified:
- It will use the last selected model from localStorage
- If no previous model exists, it will use the first available model in the endpoint’s model list
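The fallback above can be sketched in plain JavaScript. This is assumed behavior, not MyLinks source code; `store` stands in for localStorage and `modelsByEndpoint` for the app’s model lists:

```javascript
// Sketch of the endpoint-only fallback (assumed behavior, not MyLinks source).
// `store` stands in for localStorage; `modelsByEndpoint` for the model lists.
function resolveModel(endpoint, store, modelsByEndpoint) {
  const last = store[`lastModel:${endpoint}`]; // last selected model, if any
  if (last) return last;
  const models = modelsByEndpoint[endpoint] || [];
  return models[0]; // first available model (undefined if the list is empty)
}

const modelsByEndpoint = { google: ["gemini-2.0-flash-exp", "gemini-1.5-pro"] };
const fresh = resolveModel("google", {}, modelsByEndpoint);
const remembered = resolveModel(
  "google",
  { "lastModel:google": "gemini-1.5-pro" },
  modelsByEndpoint
);
```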
Notes
- The endpoint value must be one of the following: openAI, azureOpenAI, google, anthropic, assistants, azureAssistants, bedrock, agents
- If using a custom endpoint, you can use its name as the value (case-insensitive)
# using `endpoint=perplexity` for a custom endpoint named `Perplexity`
https://your-domain.com/c/new?endpoint=perplexity&model=llama-3.1-sonar-small-128k-online
Model Selection
The model parameter can be used alone:
https://your-domain.com/c/new?model=gpt-4o
When only model is specified:
- It will only select the model if it’s available in the current endpoint
- The current endpoint is either the default endpoint or the last selected endpoint
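These two rules amount to a simple guard: the model is applied only if the current endpoint’s list contains it. The function and map below are illustrative, not MyLinks internals:

```javascript
// Sketch (assumed behavior): the `model` param is applied only when the
// current endpoint's model list actually contains it; otherwise it is ignored.
function applyModelParam(model, currentEndpoint, modelsByEndpoint) {
  const available = modelsByEndpoint[currentEndpoint] || [];
  return available.includes(model) ? model : null; // null = parameter ignored
}

const lists = { openAI: ["gpt-4o", "o1-mini"], google: ["gemini-2.0-flash-exp"] };
const accepted = applyModelParam("gpt-4o", "openAI", lists);
const ignored = applyModelParam("gpt-4o", "google", lists);
```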
Prompt Parameter
The prompt parameter allows you to pre-populate the chat input field:
https://your-domain.com/c/new?prompt=Explain quantum computing
You can combine this with other parameters:
https://your-domain.com/c/new?endpoint=anthropic&model=claude-3-5-sonnet-20241022&prompt=Explain quantum computing
Special Endpoints
Agents
You can directly load an agent using its ID without specifying the endpoint:
https://your-domain.com/c/new?agent_id=your-agent-id
This will automatically set the endpoint to agents.
Assistants
Similarly, you can load an assistant directly:
https://your-domain.com/c/new?assistant_id=your-assistant-id
This will automatically set the endpoint to assistants.
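Because the endpoint is implied by the ID parameter, a link builder only needs the ID itself. These helpers are illustrative, not part of MyLinks:

```javascript
// Illustrative helpers (not part of MyLinks): agent_id / assistant_id
// imply the `agents` / `assistants` endpoints, so only the ID is needed.
function buildAgentUrl(agentId) {
  return `/c/new?agent_id=${encodeURIComponent(agentId)}`;
}

function buildAssistantUrl(assistantId) {
  return `/c/new?assistant_id=${encodeURIComponent(assistantId)}`;
}
```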
Supported Parameters
MyLinks supports a wide range of parameters for fine-tuning your conversation settings:
MyLinks Settings
- maxContextTokens: Override the system-defined context window
- resendFiles: Control file resubmission in subsequent messages
- promptPrefix: Set custom instructions/system message
- imageDetail: ‘low’, ‘auto’, or ‘high’ for image quality
  - Note: while this is a MyLinks-specific parameter, it only affects the following endpoints: OpenAI, Custom Endpoints (which are OpenAI-like), and Azure OpenAI, for which this defaults to ‘auto’
Model Parameters
Different endpoints support various parameters:
OpenAI, Custom, Azure OpenAI:
# Note: these should be valid numbers according to the provider's API
temperature, presence_penalty, frequency_penalty, stop, top_p, max_tokens
Google, Anthropic:
# Note: these should be valid numbers according to the provider's API
topP, topK, maxOutputTokens
Anthropic Specific:
Set this to true or false to toggle prompt caching:
promptCache
More info: https://www.anthropic.com/news/prompt-caching
Bedrock:
# Bedrock region
region=us-west-2
# Bedrock equivalent of `max_tokens`
maxTokens=200
Assistants/Azure Assistants:
# overrides existing assistant instructions for current run
instructions=your+instructions
# Adds the current date and time to `additional_instructions` for each run.
append_current_datetime=true
More Info
For more information on any of the above, refer to Model Spec Preset Fields, which shares most parameters.
Example with multiple parameters:
https://your-domain.com/c/new?endpoint=google&model=gemini-2.0-flash-exp&temperature=0.7&prompt=Oh hi mark
⚠️ Warning
Exercise caution when using query parameters:
- Misuse or exceeding provider limits may result in API errors
- If you encounter bad request errors, reset the conversation by clicking “New Chat”
- Some parameters may have no effect if they’re not supported by the selected endpoint
Best Practices
- Always use both endpoint and model when possible
- Verify parameter support for your chosen endpoint
- Use reasonable values within provider limits
- Test your parameter combinations before sharing URLs
Parameter Validation
All parameters are validated against MyLinks’s schema before being applied. Invalid parameters or values will be ignored, and valid settings will be applied to the conversation.
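Whitelist-style validation of this kind can be sketched as follows. The schema here is a toy subset invented for illustration, not MyLinks’s actual schema:

```javascript
// Toy whitelist validation (illustrative schema, not MyLinks's real one):
// unknown keys are dropped; known keys must pass their check to be applied.
const schema = {
  endpoint: (v) => typeof v === "string" && v.length > 0,
  model: (v) => typeof v === "string" && v.length > 0,
  temperature: (v) => !Number.isNaN(parseFloat(v)),
};

function validateParams(params) {
  const valid = {};
  for (const [key, value] of Object.entries(params)) {
    if (schema[key] && schema[key](value)) valid[key] = value;
  }
  return valid;
}

const applied = validateParams({ endpoint: "google", temperature: "0.7", bogus: "x" });
```

The invalid key (`bogus`) is silently dropped while the valid settings pass through, matching the behavior described above.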
This feature enables powerful use cases like:
- Sharing specific conversation configurations
- Creating bookmarks for different chat settings
- Automating chat setup through URL parameters
#MyLinks #ChatConfiguration #AIParameters #OpenSource #URLQueryParameters