Dataset schema (20 columns; ranges are the observed min–max over the split):

column     dtype          observed values
title      string         1–300 chars
score      int64          0–8.54k
selftext   string         0–40k chars
created    timestamp[ns]
url        string         0–780 chars
author     string         3–20 chars
domain     string         0–82 chars
edited     timestamp[ns]
gilded     int64          0–2
gildings   string         7 classes
id         string         7 chars
locked     bool           2 classes
media      string         646–1.8k chars
name       string         10 chars
permalink  string         33–82 chars
spoiler    bool           2 classes
stickied   bool           2 classes
thumbnail  string         4–213 chars
ups        int64          0–8.54k
preview    string         301–5.01k chars
title:     Not promoting, but any nerds around here?
score:     1
selftext:  [removed]
created:   2025-02-08T23:11:30
url:       https://www.reddit.com/r/LocalLLaMA/comments/1il0hfe/not_promoting_but_any_nerds_around_here/
author:    Competitive-Pain7947
domain:    self.LocalLLaMA
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il0hfe
locked:    false
media:     null
name:      t3_1il0hfe
permalink: /r/LocalLLaMA/comments/1il0hfe/not_promoting_but_any_nerds_around_here/
spoiler:   false
stickied:  false
thumbnail: self
ups:       1
preview:   null
title:     What model to use for monthly expense sorting?
score:     1
selftext:  I have this mind-numbing task that I'd like an AI to do for me. With Claude and ChatGPT, though, I hit their rate limits almost immediately. Is this a use case that a local LLM would be better suited to? The task is to review all my monthly invoices from Amazon and other online vendors and assign them to expense categories I have defined. I have about a dozen such invoices every month, and each has between 1 and 10 items in it. By the way, Claude Haiku seemed too dumb for this task, but Opus and Sonnet did well. If this seems like something I should try with a local model, which one would you recommend?
created:   2025-02-08T23:18:59
url:       https://www.reddit.com/r/LocalLLaMA/comments/1il0ndn/what_model_to_use_for_monthly_expense_sorting/
author:    otto_delmar
domain:    self.LocalLLaMA
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il0ndn
locked:    false
media:     null
name:      t3_1il0ndn
permalink: /r/LocalLLaMA/comments/1il0ndn/what_model_to_use_for_monthly_expense_sorting/
spoiler:   false
stickied:  false
thumbnail: self
ups:       1
preview:   null
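One way the task in this post could be tried against a local model: send the line items to an OpenAI-compatible chat endpoint (as served by llama.cpp's server or Ollama) and ask for a JSON mapping of item to category. This is a minimal sketch, not a tested pipeline; the base URL, model name, and category list are illustrative assumptions, since the post does not give them.

```python
import json
import urllib.request

# Hypothetical expense categories -- the poster's real list is not given.
CATEGORIES = ["Office supplies", "Software", "Travel", "Groceries", "Other"]

def build_prompt(items):
    """Build a classification prompt for a list of invoice line items."""
    return (
        "Assign each invoice line item to exactly one of these categories: "
        + ", ".join(CATEGORIES)
        + ". Reply with a JSON object mapping each item to its category.\n"
        + "\n".join(f"- {it}" for it in items)
    )

def categorize(items, base_url="http://localhost:8080/v1"):
    """Ask a local OpenAI-compatible server to categorize invoice items.
    Returns the model's reply parsed as a dict {item: category}."""
    body = json.dumps({
        "model": "local-model",  # assumed model name on the local server
        "messages": [{"role": "user", "content": build_prompt(items)}],
        "temperature": 0,        # deterministic output for a sorting task
    }).encode()
    req = urllib.request.Request(
        base_url + "/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)["choices"][0]["message"]["content"]
    return json.loads(reply)
```

In practice a constrained-output feature (e.g. a JSON grammar) on the local server would make the parse step more reliable than prompting alone.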
title:     Not promoting, but any nerds around here?
score:     1
selftext:  [removed]
created:   2025-02-08T23:20:07
url:       https://www.reddit.com/r/LocalLLaMA/comments/1il0obd/not_promoting_but_any_nerds_around_here/
author:    Justimrandy
domain:    self.LocalLLaMA
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il0obd
locked:    false
media:     null
name:      t3_1il0obd
permalink: /r/LocalLLaMA/comments/1il0obd/not_promoting_but_any_nerds_around_here/
spoiler:   false
stickied:  false
thumbnail: self
ups:       1
preview:   null
title:     Is there a good solution connecting local LLM to MCP?
score:     1
selftext:  I’m trying to figure out whether there is any client or other open-source solution for connecting a local LLM to MCP servers seamlessly. If I understand correctly, Claude uses something like prompt injection with the tool call and its result. But other models don't seem to support this, since function calling is normally a one-shot request, not multiple calls in one generation. Has anyone figured out how to make this work with other models? I really tried to search, but other than LibreChat nothing popped up. I would appreciate any guidance / keywords to search further.
created:   2025-02-08T23:25:34
url:       https://www.reddit.com/r/LocalLLaMA/comments/1il0skj/is_there_a_good_solution_connecting_local_llm_to/
author:    Ok_Maize_3709
domain:    self.LocalLLaMA
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il0skj
locked:    false
media:     null
name:      t3_1il0skj
permalink: /r/LocalLLaMA/comments/1il0skj/is_there_a_good_solution_connecting_local_llm_to/
spoiler:   false
stickied:  false
thumbnail: self
ups:       1
preview:   null
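The call/result pattern this post describes can be driven from the client side with any model: call the model, and whenever it emits a tool call, execute the tool, append the result as a new message, and call the model again until it answers in plain text. Below is a minimal sketch of that loop with the model abstracted as a function; the message shapes are invented for illustration, and real MCP servers speak JSON-RPC, which is not implemented here.

```python
import json

def run_tool_loop(model_fn, tools, user_msg, max_turns=5):
    """Generic call/result loop. `model_fn(messages)` must return either
    {"tool": name, "args": {...}} (a tool-call request) or
    {"text": "..."} (a final answer). `tools` maps name -> callable.
    Each tool result is appended to the transcript and the model is
    re-invoked, so one user turn can span several generations."""
    messages = [{"role": "user", "content": user_msg}]
    for _ in range(max_turns):
        reply = model_fn(messages)
        if "text" in reply:  # model produced a final answer
            return reply["text"]
        # Model requested a tool: run it, record the call and its
        # result, then loop so the model can continue the same task.
        result = tools[reply["tool"]](**reply["args"])
        messages.append({"role": "assistant", "content": json.dumps(reply)})
        messages.append({"role": "tool", "content": json.dumps(result)})
    raise RuntimeError("tool loop did not converge")
```

A client built this way works even with models whose servers only do one-shot function calling, since the multi-call behavior lives in the loop rather than in a single generation.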
title:     How Mistral, ChatGPT and DeepSeek handle sensitive topics
score:     1
selftext:
created:   2025-02-08T23:46:11
url:       https://v.redd.it/qkdra4wd50ie1
author:    Touch105
domain:    v.redd.it
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il188r
locked:    false
media:     {'reddit_video': {'bitrate_kbps': 5000, 'fallback_url': 'https://v.redd.it/qkdra4wd50ie1/DASH_1080.mp4?source=fallback', 'has_audio': True, 'height': 1080, 'width': 1080, 'scrubber_media_url': 'https://v.redd.it/qkdra4wd50ie1/DASH_96.mp4', 'dash_url': 'https://v.redd.it/qkdra4wd50ie1/DASHPlaylist.mpd?a=1741650383%2CYTU3ZTY2NzQ3YjRlMGY0NGQxMGE0MjU5MGQ2ZTU3Zjc3OTMyZmQ4MjlmNzk4ZTdjN2EyMGY5M2JmNTNmODExOA%3D%3D&v=1&f=sd', 'duration': 27, 'hls_url': 'https://v.redd.it/qkdra4wd50ie1/HLSPlaylist.m3u8?a=1741650383%2CNzcyNjFkODlmZWY4OWI3ZjI0ZTc1MjE2NmVmYTRlYWJkYWYwOTkyMmQxZmQxNWQzMzQzZTk1MmRlNWM5MmRmZA%3D%3D&v=1&f=sd', 'is_gif': False, 'transcoding_status': 'completed'}}
name:      t3_1il188r
permalink: /r/LocalLLaMA/comments/1il188r/how_mistral_chatgpt_and_deepseek_handle_sensitive/
spoiler:   false
stickied:  false
thumbnail: https://external-preview…faab0862f9aca413
ups:       1
preview:   {'images': [{'source': {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?format=pjpg&auto=webp&s=99798aa5e2623523f7f1710a57f0b66ff2fc9c4e', 'width': 1080, 'height': 1080}, 'resolutions': [{'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=108&crop=smart&format=pjpg&auto=webp&s=e584590d14bccd818267baba366123b162665dba', 'width': 108, 'height': 108}, {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=216&crop=smart&format=pjpg&auto=webp&s=1cb4f0d93b981f4034d744818e6cb1953c354f66', 'width': 216, 'height': 216}, {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=320&crop=smart&format=pjpg&auto=webp&s=160c6fefe50d64a199773fd8b06f66abd66a7151', 'width': 320, 'height': 320}, {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=640&crop=smart&format=pjpg&auto=webp&s=d3d17a6d706e20d1752980f46dd8590785ebc064', 'width': 640, 'height': 640}, {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=960&crop=smart&format=pjpg&auto=webp&s=001be312bffea8f59c1efaa28d996d757cecd043', 'width': 960, 'height': 960}, {'url': 'https://external-preview.redd.it/Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT.png?width=1080&crop=smart&format=pjpg&auto=webp&s=37cc787b1bd2d91f980d129fc36f7c698a056aef', 'width': 1080, 'height': 1080}], 'variants': {}, 'id': 'Mzl3Zm8xc2Q1MGllMa1zjpINaNlAwSPaIvApdB3LTGQP8LNtAbklarHk8uDT'}], 'enabled': False}
title:     $150 for RTX 2070 XC Ultra
score:     1
selftext:  [removed]
created:   2025-02-09T00:02:11
url:       https://www.reddit.com/r/LocalLLaMA/comments/1il1kad/150_for_rtx_2070_xc_ultra/
author:    TarunRaviYT
domain:    self.LocalLLaMA
edited:    1970-01-01T00:00:00
gilded:    0
gildings:  {}
id:        1il1kad
locked:    false
media:     null
name:      t3_1il1kad
permalink: /r/LocalLLaMA/comments/1il1kad/150_for_rtx_2070_xc_ultra/
spoiler:   false
stickied:  false
thumbnail: self
ups:       1
preview:   null