title stringlengths 1-300 | score int64 0-8.54k | selftext stringlengths 0-40k | created timestamp[ns] 2023-04-01 04:30:41 to 2025-06-30 03:16:29 ⌀ | url stringlengths 0-878 | author stringlengths 3-20 | domain stringlengths 0-82 | edited timestamp[ns] 1970-01-01 00:00:00 to 2025-06-26 17:30:18 | gilded int64 0-2 | gildings stringclasses 7 values | id stringlengths 7-7 | locked bool 2 classes | media stringlengths 646-1.8k ⌀ | name stringlengths 10-10 | permalink stringlengths 33-82 | spoiler bool 2 classes | stickied bool 2 classes | thumbnail stringlengths 4-213 | ups int64 0-8.54k | preview stringlengths 301-5.01k ⌀ |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
DeepSeek-v3 Function Calling when? | 1 | [removed] | 2025-01-27T21:25:16 | https://www.reddit.com/r/LocalLLaMA/comments/1ibkk0k/deepseekv3_function_calling_when/ | toolhouseai | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibkk0k | false | null | t3_1ibkk0k | /r/LocalLLaMA/comments/1ibkk0k/deepseekv3_function_calling_when/ | false | false | self | 1 | null |
Running DeepSeek API locally | 1 | [removed] | 2025-01-27T21:28:24 | https://www.reddit.com/r/LocalLLaMA/comments/1ibkmwb/running_deepseek_api_locally/ | Kakorama | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibkmwb | false | null | t3_1ibkmwb | /r/LocalLLaMA/comments/1ibkmwb/running_deepseek_api_locally/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'fj84M-Z0L_zDI8VjgLPR-vGFwXVTTqVZFoa_h5offPs', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=108&crop=smart&auto=webp&s=4f39a07c027d6036b98ac9f4ba405a8d11549aa3', 'width': 108}, {'height': 121, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=216&crop=smart&auto=webp&s=77d81d7dfb3f0dc0281915e155e87541e4069970', 'width': 216}, {'height': 180, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=320&crop=smart&auto=webp&s=e7e73cd0eb037665260b5368de787bf4d34a0086', 'width': 320}, {'height': 360, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=640&crop=smart&auto=webp&s=aa0a8cd368da789c05b75a810cf0a1e21413b8f2', 'width': 640}, {'height': 540, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=960&crop=smart&auto=webp&s=fb05999616d9a4f01271acab1427db387e6f4095', 'width': 960}, {'height': 607, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?width=1080&crop=smart&auto=webp&s=6aea590aabdd6f82e13381ed9c97788ecddef016', 'width': 1080}], 'source': {'height': 675, 'url': 'https://external-preview.redd.it/AO2sAF0_c_2mBe6UautksfrJRPPX3sFbs0Fu0kPn0C0.jpg?auto=webp&s=bb5327c204c8ce6c5773c7700d887e31427085b4', 'width': 1200}, 'variants': {}}]} |
How to preview HTML on LM Studio ? | 1 | [removed] | 2025-01-27T21:28:50 | https://www.reddit.com/r/LocalLLaMA/comments/1ibkn8v/how_to_preview_html_on_lm_studio/ | zakaghbal | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibkn8v | false | null | t3_1ibkn8v | /r/LocalLLaMA/comments/1ibkn8v/how_to_preview_html_on_lm_studio/ | false | false | self | 1 | null |
what local llm can review a JSON file and help generate a CSV file that provides a clear comparison of what the json has? | 1 | Hey everyone, looking for a local LLM that can import a JSON file and export a CSV file organizing users | 2025-01-27T21:35:06 | https://www.reddit.com/r/LocalLLaMA/comments/1ibksw7/what_local_llm_can_review_a_json_file_and_help/ | CherryPlay | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibksw7 | false | null | t3_1ibksw7 | /r/LocalLLaMA/comments/1ibksw7/what_local_llm_can_review_a_json_file_and_help/ | false | false | self | 1 | null |
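The conversion the post above asks about doesn't strictly need an LLM; Python's standard library covers the JSON-to-CSV step on its own. A minimal sketch (the user-record fields in the demo are hypothetical):

```python
import csv
import io
import json

def users_json_to_csv(json_text: str) -> str:
    """Convert a JSON array of user records into CSV text.

    The header is the union of keys seen across all records
    (first-seen order), so records with missing fields still
    line up under a shared header.
    """
    records = json.loads(json_text)
    fieldnames = []
    for rec in records:
        for key in rec:
            if key not in fieldnames:
                fieldnames.append(key)
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fieldnames, restval="")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

if __name__ == "__main__":
    sample = '[{"name": "ada", "role": "admin"}, {"name": "bob"}]'
    print(users_json_to_csv(sample))
```

An LLM is still handy for deciding *which* columns give a clear comparison, but the mechanical export is more reliable done in code.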
There's rumors that trump is being told to ban all nvda sales to China because of deepseek | 0 | China makes up like 20% of NVDAs revenue. Is this bearish for NVDA? | 2025-01-27T21:37:07 | https://www.reddit.com/r/LocalLLaMA/comments/1ibkuot/theres_rumors_that_trump_is_being_told_to_ban_all/ | bruhlmaocmonbro | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibkuot | false | null | t3_1ibkuot | /r/LocalLLaMA/comments/1ibkuot/theres_rumors_that_trump_is_being_told_to_ban_all/ | false | false | self | 0 | null |
1 Million Token Context Length 🔥 | 139 | 2025-01-27T21:41:21 | CelebrationClean7309 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibkydm | false | null | t3_1ibkydm | /r/LocalLLaMA/comments/1ibkydm/1_million_token_context_length/ | false | false | 139 | {'enabled': True, 'images': [{'id': 'MUdU6SLTpy11wtPCsW_ozCjMifku8NG1eI59b-I2AGA', 'resolutions': [{'height': 169, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=108&crop=smart&auto=webp&s=57f793f8fd6df70b11384051e8196f6d09575e3f', 'width': 108}, {'height': 339, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=216&crop=smart&auto=webp&s=2d0230dea24d9ed730fed5919c97ff78ad5e9a9e', 'width': 216}, {'height': 502, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=320&crop=smart&auto=webp&s=058db1c9b21e9b10d77b51448e252bd7669dccd7', 'width': 320}, {'height': 1004, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=640&crop=smart&auto=webp&s=701568383b2f300b0a9c905c9deac48d6fab5673', 'width': 640}, {'height': 1506, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=960&crop=smart&auto=webp&s=1475a7340a56c20569470a37aac1c36b1f4501ee', 'width': 960}, {'height': 1695, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?width=1080&crop=smart&auto=webp&s=6f9d73729ad2e9b88f8da91f51e36bbf63125f8a', 'width': 1080}], 'source': {'height': 1695, 'url': 'https://preview.redd.it/uicxkqj6wlfe1.png?auto=webp&s=ef9a14d1ba6656c80b24dd99322d160db8d05a91', 'width': 1080}, 'variants': {}}]} |
|||
Biased LLM Outputs, Tiananmen Square & Americanisations | 0 | 2025-01-27T21:41:54 | https://smcleod.net/2025/01/biased-llm-outputs-tiananmen-square-americanisations/ | sammcj | smcleod.net | 1970-01-01T00:00:00 | 0 | {} | 1ibkyvj | false | null | t3_1ibkyvj | /r/LocalLLaMA/comments/1ibkyvj/biased_llm_outputs_tiananmen_square/ | false | false | default | 0 | null |
|
Gyula Rabai: Deepseek Image Generation is now available | 8 | Deepseek just released its open-source image generation AI model. It looks better than Dall-E. | 2025-01-27T21:47:50 | https://www.facebook.com/share/p/19xRctDRDa/ | Outrageous-Win-3244 | facebook.com | 1970-01-01T00:00:00 | 0 | {} | 1ibl45y | false | null | t3_1ibl45y | /r/LocalLLaMA/comments/1ibl45y/gyula_rabai_deepseek_image_generation_is_now/ | false | false | default | 8 | null |
Testing DeepSeek on Your Datasets To Build Production-grade AI Application | 1 | [removed] | 2025-01-27T21:52:14 | https://www.reddit.com/r/LocalLLaMA/comments/1ibl7y4/testing_deepseek_on_your_datasets_to_build/ | 0xhbam | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibl7y4 | false | null | t3_1ibl7y4 | /r/LocalLLaMA/comments/1ibl7y4/testing_deepseek_on_your_datasets_to_build/ | false | false | self | 1 | null |
Newbie here - Can I deploy an offline model with ollama to a server and use it with my web app? | 5 |
I’ve been playing around with ollama and deepseek models on my own computer. I can send a prompt offline and get a response, etc.
Is there a way to deploy this onto a server or host, without needing an API key or token restrictions, so I can hook my web app up to that model? | 2025-01-27T21:53:03 | https://www.reddit.com/r/LocalLLaMA/comments/1ibl8on/newbie_here_can_i_deploy_an_offline_model_with/ | ElectronicOriginal92 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibl8on | false | null | t3_1ibl8on | /r/LocalLLaMA/comments/1ibl8on/newbie_here_can_i_deploy_an_offline_model_with/ | false | false | self | 5 | null |
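For the deployment question above: Ollama listens on port 11434 by default and exposes a plain HTTP API, so setting `OLLAMA_HOST=0.0.0.0` on the server and calling it from the web app's backend is one common approach. A hedged stdlib sketch — the hostname is a placeholder, and since Ollama has no built-in auth you'd want a reverse proxy or firewall rule in front of it:

```python
import json
import urllib.request

# Placeholder address for your server; Ollama's default port is 11434.
OLLAMA_URL = "http://your-server:11434"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST to Ollama's /api/generate endpoint (non-streaming)."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the remote Ollama instance and return the reply text."""
    with urllib.request.urlopen(build_generate_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

The web app's backend would then call something like `generate("deepseek-r1:7b", user_prompt)`; no API key or token quota is involved because the model is yours.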
36 GB Macbook Pro M3 - What models to use for continue.dev local development? | 1 | [removed] | 2025-01-27T21:57:04 | https://www.reddit.com/r/LocalLLaMA/comments/1iblc6q/36_gb_macbook_pro_m3_what_models_to_use_for/ | Seaking7797 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1iblc6q | false | null | t3_1iblc6q | /r/LocalLLaMA/comments/1iblc6q/36_gb_macbook_pro_m3_what_models_to_use_for/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'JoLAbcgPAn_D7ExuVvyaNJpSY81e3Jca27FTj1G8-xQ', 'resolutions': [{'height': 56, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=108&crop=smart&auto=webp&s=b6c70517bb80bca66bf94d99af93ec23982e2986', 'width': 108}, {'height': 113, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=216&crop=smart&auto=webp&s=146011169cd4033ebcd4b883efc62f0bd345d74b', 'width': 216}, {'height': 167, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=320&crop=smart&auto=webp&s=7a560fe31ff4e8b423a9029c052df232e0365572', 'width': 320}, {'height': 335, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=640&crop=smart&auto=webp&s=ea9ff85c4782247e303164d9d75b4071d789f397', 'width': 640}, {'height': 503, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=960&crop=smart&auto=webp&s=81aa9753e911761e0c56b3b897ba0f44cafff21d', 'width': 960}, {'height': 566, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?width=1080&crop=smart&auto=webp&s=a67fd0983e228aa2fa0a2ba466c071793fe21afc', 'width': 1080}], 'source': {'height': 1260, 'url': 'https://external-preview.redd.it/0Uyy7b4ONkY8vc27omtubWnHw_YkYeE8ieacWucbwkk.jpg?auto=webp&s=92948afd26cc637bb25c79223a1b99b3ecbbbfa2', 'width': 2401}, 'variants': {}}]} |
OpenAI reaction to Deepseek | 540 | 2025-01-27T21:58:37 | https://v.redd.it/rf7tj9l9zlfe1 | No_Palpitation7740 | /r/LocalLLaMA/comments/1ibldkl/openai_reaction_to_deepseek/ | 1970-01-01T00:00:00 | 0 | {} | 1ibldkl | false | {'reddit_video': {'bitrate_kbps': 2400, 'dash_url': 'https://v.redd.it/rf7tj9l9zlfe1/DASHPlaylist.mpd?a=1740736735%2CNjU0YWVlODgyNWE1MTRjNGUzMzdlZjdiODhlMTZmZWY5NGU5Mjk5YjdmZjY5OWY5ZmI3ZWMxZjFkYzI1ZjBiYQ%3D%3D&v=1&f=sd', 'duration': 122, 'fallback_url': 'https://v.redd.it/rf7tj9l9zlfe1/DASH_720.mp4?source=fallback', 'has_audio': True, 'height': 720, 'hls_url': 'https://v.redd.it/rf7tj9l9zlfe1/HLSPlaylist.m3u8?a=1740736735%2CYjQ0MTAyMWU1OWE0NTYyODE4YzQ1YTgyNWNmZmNkYzFmMjY4NDYyZjE4ZjM3M2JhYzg2MmU0YTM1NDhlZDBkZg%3D%3D&v=1&f=sd', 'is_gif': False, 'scrubber_media_url': 'https://v.redd.it/rf7tj9l9zlfe1/DASH_96.mp4', 'transcoding_status': 'completed', 'width': 1280}} | t3_1ibldkl | /r/LocalLLaMA/comments/1ibldkl/openai_reaction_to_deepseek/ | false | false | 540 | {'enabled': False, 'images': [{'id': 'aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=108&crop=smart&format=pjpg&auto=webp&s=b008ecb63162b7fc4244b9429530bdc3bdd1e08a', 'width': 108}, {'height': 121, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=216&crop=smart&format=pjpg&auto=webp&s=1163ca90fe2097aa49a4a70a48fee9708842d5ea', 'width': 216}, {'height': 180, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=320&crop=smart&format=pjpg&auto=webp&s=ec885283adbc7574263ba6ef2231ab9e0e051c51', 'width': 320}, {'height': 360, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=640&crop=smart&format=pjpg&auto=webp&s=85b6ae97617a033c8d17a318ba1216b0326e6244', 
'width': 640}, {'height': 540, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=960&crop=smart&format=pjpg&auto=webp&s=6a638e039a55be9c16948fbc198f7ebbc199b429', 'width': 960}, {'height': 607, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?width=1080&crop=smart&format=pjpg&auto=webp&s=e881030da97e1fdf4fbb13492caf1b6ada42e1c3', 'width': 1080}], 'source': {'height': 720, 'url': 'https://external-preview.redd.it/aHF1eWU4Zzl6bGZlMf4MrbzIQ8hanARSUc7k95AH77zTyfar1-pm8LqZJoap.png?format=pjpg&auto=webp&s=ddd67206d96bd346a7e9822cf37d788291101533', 'width': 1280}, 'variants': {}}]} |
||
The current state of the world | 1 | [removed] | 2025-01-27T22:01:10 | [deleted] | 1970-01-01T00:00:00 | 0 | {} | 1iblfz3 | false | null | t3_1iblfz3 | /r/LocalLLaMA/comments/1iblfz3/the_current_state_of_the_world/ | false | false | default | 1 | null |
||
Fine-tuning llama 3B master's thesis ideas? | 1 | [removed] | 2025-01-27T22:07:06 | https://www.reddit.com/r/LocalLLaMA/comments/1iblley/finetuning_llama_3b_masters_thesis_ideas/ | kebab_bitch | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1iblley | false | null | t3_1iblley | /r/LocalLLaMA/comments/1iblley/finetuning_llama_3b_masters_thesis_ideas/ | false | false | self | 1 | null |
API is still down. Website says large-scale malicious attacks on DeepSeek services. What is going on? | 7 | 2025-01-27T22:10:23 | https://www.reddit.com/r/LocalLLaMA/comments/1iblobt/api_is_still_down_website_says_largescale/ | LorestForest | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1iblobt | false | null | t3_1iblobt | /r/LocalLLaMA/comments/1iblobt/api_is_still_down_website_says_largescale/ | false | false | 7 | null |
||
How do I utilize LM Studio remotely via Safari or another iOS app? | 4 | Howdy, I have my Windows PC and iPhone connected through Tailscale and was able to remotely use LM Studio through NoMachine, but the UI is rather painful. Is there a simple text-friendly way to interact with my PC's model through Safari or some appropriate iOS app? Searching here and online hasn't yielded much help for this struggling amateur. Any advice is appreciated, thanks! | 2025-01-27T22:20:18 | https://www.reddit.com/r/LocalLLaMA/comments/1iblwys/how_do_i_utilize_lm_studio_remotely_via_safari_or/ | nzuy | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1iblwys | false | null | t3_1iblwys | /r/LocalLLaMA/comments/1iblwys/how_do_i_utilize_lm_studio_remotely_via_safari_or/ | false | false | self | 4 | null |
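Regarding the question above: LM Studio's built-in local server speaks the OpenAI chat-completions format (port 1234 by default), so any OpenAI-compatible iOS client pointed at the PC's Tailscale address should work without NoMachine. A minimal sketch of what such a client sends — the hostname below is a placeholder:

```python
import json
import urllib.request

# Placeholder Tailscale machine name; 1234 is LM Studio's default server port.
BASE_URL = "http://your-pc.your-tailnet.ts.net:1234/v1"

def extract_reply(body: dict) -> str:
    """Pull the assistant's text out of an OpenAI-style response body."""
    return body["choices"][0]["message"]["content"]

def chat(message: str, model: str = "local-model") -> str:
    """Send one user message to LM Studio's server and return the reply."""
    payload = {"model": model, "messages": [{"role": "user", "content": message}]}
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.loads(resp.read()))
```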
So DeepSeek is cool, but it doesn’t end the AI Race | 3 | So because it’s open source, and it reportedly took $6M to train, plenty of large and mid-sized tech companies now have a clear path to train their own reasoning model. Yes, it’s impressive, and it punctures the hype that only certain companies can do this. It also makes the $500B investment look extremely excessive. So while I like DeepSeek, it’s clear people already see the issues: the training data is not disclosed, the model has a bias toward the Chinese government, the API is not safe for IP or PHI, etc.
The next wave of models is on the horizon: tech companies are now just going to pivot and “fix” the issues around DeepSeek’s data and bias (haha, with their own bias, but whatever). | 2025-01-27T22:28:53 | https://www.reddit.com/r/LocalLLaMA/comments/1ibm48u/so_deepseek_is_cool_but_it_doesnt_end_the_ai_race/ | bunoso | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibm48u | false | null | t3_1ibm48u | /r/LocalLLaMA/comments/1ibm48u/so_deepseek_is_cool_but_it_doesnt_end_the_ai_race/ | false | false | self | 3 | null |
Jack Dorsey: “open source everything” | 1 | 2025-01-27T22:29:24 | bruhlmaocmonbro | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibm4pe | false | null | t3_1ibm4pe | /r/LocalLLaMA/comments/1ibm4pe/jack_dorsey_open_source_everything/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'PTPicwPa8UYKrzT7V9PDPGmz67lfEKlRAB1GAQ8iXb4', 'resolutions': [{'height': 83, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=108&crop=smart&auto=webp&s=592a918a2c0db4abfc6065ba17f2a81f1094bf22', 'width': 108}, {'height': 166, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=216&crop=smart&auto=webp&s=5188c4deaa30f576e28a0c74c655ff627d779732', 'width': 216}, {'height': 246, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=320&crop=smart&auto=webp&s=a3a5a393babfb0a6c0ffddd6e329796bebecb0d8', 'width': 320}, {'height': 493, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=640&crop=smart&auto=webp&s=4019db03718ff614f80111b66e7ccbbe70fe47e0', 'width': 640}, {'height': 740, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=960&crop=smart&auto=webp&s=45779e4cd0fdbb48b9f7f81d48d93650ed2c36a0', 'width': 960}, {'height': 832, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?width=1080&crop=smart&auto=webp&s=cae21824d49ae39cc11723a8c111543ba6707639', 'width': 1080}], 'source': {'height': 902, 'url': 'https://preview.redd.it/bqwd81zo4mfe1.jpeg?auto=webp&s=cbe58f206d3f1f2e9e3457b2bed2dba68ed765d8', 'width': 1170}, 'variants': {}}]} |
|||
How to use QWQ 32b preview in LM Studio | 1 | Hi team. I tried loading QwQ 32b preview in LM Studio and, while it works, it doesn’t appear to be performing any reasoning steps. Is this a limitation of LM Studio or is there something else I need to do to make it work (or do I just not understand how it is supposed to work)?
Thanks. | 2025-01-27T22:29:45 | https://www.reddit.com/r/LocalLLaMA/comments/1ibm50k/how_to_use_qwq_32b_preview_in_lm_studio/ | bigfamreddit | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibm50k | false | null | t3_1ibm50k | /r/LocalLLaMA/comments/1ibm50k/how_to_use_qwq_32b_preview_in_lm_studio/ | false | false | self | 1 | null |
How can we be so sure the training of Deepseek R1 is around $6 million? | 159 | I heard their parent company is a quant fund that may be one of the contributors to the NVDA price slide today.
Besides this, how do we estimate whether this is possible? Since the release does not include the training dataset, is there any way for outside organizations to estimate it? Alex Wang said they have at least 50k H100s, maybe more, and NVDA sold 20% of its H100s to Singapore last year, most of which could have ended up with Chinese companies.
What if today's NVDA price drop is just a sophisticated plot to make money for their quant fund? | 2025-01-27T22:30:42 | https://www.reddit.com/r/LocalLLaMA/comments/1ibm5u3/how_can_we_be_so_sure_the_training_of_deepseek_r1/ | scmlfty | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibm5u3 | false | null | t3_1ibm5u3 | /r/LocalLLaMA/comments/1ibm5u3/how_can_we_be_so_sure_the_training_of_deepseek_r1/ | false | false | self | 159 | null |
AMD ZenDNN 5.0 Software For AI Delivers "400% Performance Uplift" | 9 | AMD | 2025-01-27T22:30:45 | https://www.phoronix.com/news/AMD-ZenDNN-5.0-400p-Performance | Rachados22x2 | phoronix.com | 1970-01-01T00:00:00 | 0 | {} | 1ibm5vr | false | null | t3_1ibm5vr | /r/LocalLLaMA/comments/1ibm5vr/amd_zendnn_50_software_for_ai_delivers_400/ | false | false | 9 | {'enabled': False, 'images': [{'id': 'CdgAQWAPckE_zkMCUYzXvZhHdnkVHRTxvoX1ahKdmHE', 'resolutions': [{'height': 27, 'url': 'https://external-preview.redd.it/X4Nrm7GMhyyx1lgfyiry6w1fuvijmProRZbmXbSTDKk.jpg?width=108&crop=smart&auto=webp&s=35b0dc4906ba26416057e5ce4821f3664647819d', 'width': 108}, {'height': 54, 'url': 'https://external-preview.redd.it/X4Nrm7GMhyyx1lgfyiry6w1fuvijmProRZbmXbSTDKk.jpg?width=216&crop=smart&auto=webp&s=bd28f9820d2d04e7ace448582169a45dc81e2739', 'width': 216}, {'height': 80, 'url': 'https://external-preview.redd.it/X4Nrm7GMhyyx1lgfyiry6w1fuvijmProRZbmXbSTDKk.jpg?width=320&crop=smart&auto=webp&s=e967a17dfce4412274d6b5a356c2e37c6d64a328', 'width': 320}], 'source': {'height': 132, 'url': 'https://external-preview.redd.it/X4Nrm7GMhyyx1lgfyiry6w1fuvijmProRZbmXbSTDKk.jpg?auto=webp&s=0649d9487d4d7ebd074af4cfc3ec00e3be06b786', 'width': 522}, 'variants': {}}]} |
|
Could it be that AI corps don't really have a clue on what to do after scaling Reinforcement learning COT? | 5 | O1 came out 4.5 months ago; all we have from OpenAI in terms of an actually better model is a scaled-up version of o1's thinking process, which costs 100x more.
Same situation with Google and their AI thinking model: just RL-based CoT.
Right now, it seems the biggest innovation in the AI space happened 4.5 months ago.
I believe this is why Deepseek was able to catch up.
I don't think there is any new paradigm the AI companies have hidden away, so imitating the results of o1 and R1 is just a matter of GPU cost. I think this is also why Deepseek made their model open source: it wasn't really a huge research investment.
| 2025-01-27T22:32:28 | https://www.reddit.com/r/LocalLLaMA/comments/1ibm7cy/could_it_be_that_ai_corps_dont_really_have_a_clue/ | Ill_Distribution8517 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibm7cy | false | null | t3_1ibm7cy | /r/LocalLLaMA/comments/1ibm7cy/could_it_be_that_ai_corps_dont_really_have_a_clue/ | false | false | self | 5 | null |
I can't believe ChatGPT lost its job to AI | 137 | 2025-01-27T22:34:58 | robertpiosik | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibm9hv | false | null | t3_1ibm9hv | /r/LocalLLaMA/comments/1ibm9hv/i_cant_believe_chatgpt_lost_its_job_to_ai/ | false | false | 137 | {'enabled': True, 'images': [{'id': 'ObgzqB7CdtRv-_86Zw4JpKNes2truO18t2PStQTvysQ', 'resolutions': [{'height': 20, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=108&crop=smart&auto=webp&s=948a833e165cadaed57897ff9b4d499f9c0d5859', 'width': 108}, {'height': 40, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=216&crop=smart&auto=webp&s=7194e43b8cd9cb88d897970b48ca49496175121b', 'width': 216}, {'height': 60, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=320&crop=smart&auto=webp&s=91590e56c6affcaa915414cb4bebf854bc74f6b9', 'width': 320}, {'height': 120, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=640&crop=smart&auto=webp&s=f80741025eb0938532574e739fdab63bb7c7c5f0', 'width': 640}, {'height': 180, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=960&crop=smart&auto=webp&s=d1189dfe620d4cae86dd58a82e575e4739fc71fe', 'width': 960}, {'height': 203, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?width=1080&crop=smart&auto=webp&s=8588d905d654da1012d55ca8ee427c39e323277b', 'width': 1080}], 'source': {'height': 229, 'url': 'https://preview.redd.it/x3khqm4n5mfe1.png?auto=webp&s=872153fe277ed75e834329e92c46e324ace99717', 'width': 1216}, 'variants': {}}]} |
|||
[v1.0.8] Notate - New Reasoning Layer Added (Agent actions will be added) | UI/UX Improvements & Deepseek Integration - Have suggstions? love to hear em! | 1 | 2025-01-27T22:41:02 | https://github.com/CNTRLAI/Notate | Hairetsu | github.com | 1970-01-01T00:00:00 | 0 | {} | 1ibmejn | false | null | t3_1ibmejn | /r/LocalLLaMA/comments/1ibmejn/v108_notate_new_reasoning_layer_added_agent/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'N74oGM5ga291vG6ue9GeKiVdKbeo5ITz2TUAczqs7yQ', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=108&crop=smart&auto=webp&s=d99db60cf026bc83907eed0efa13f49907204d86', 'width': 108}, {'height': 108, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=216&crop=smart&auto=webp&s=073524e26735d41a9cfd96e75782543c5d612046', 'width': 216}, {'height': 160, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=320&crop=smart&auto=webp&s=afa31eb6751bfdd92d04700feaf9e6f9531194ef', 'width': 320}, {'height': 320, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=640&crop=smart&auto=webp&s=769808d36520d2e157bfc35efdd17fa7406c869f', 'width': 640}, {'height': 480, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=960&crop=smart&auto=webp&s=139f014109c32debe74aaa2cccc300c86b8c52ae', 'width': 960}, {'height': 540, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?width=1080&crop=smart&auto=webp&s=76529e3e9893b7cafa8f121f48e5dc7e886558cb', 'width': 1080}], 'source': {'height': 600, 'url': 'https://external-preview.redd.it/mLmb6PM3DcPIkGqR7oQ2EX6QH1e3GfDbIdNH1M_vKQw.jpg?auto=webp&s=a2477a2317cf79c8c7a34652da462b429dfb6212', 'width': 1200}, 'variants': {}}]} |
||
Deepseek censorship is more tolerable than Western censorship | 642 | Sorry, but I just had to say it.
I can't speak for non-Westerners, but as a U.S. citizen I find that Deepseek is able to handle "sensitive topics" much better than SOTA models made here in America. I hate when people claim that Deepseek's censorship is worse than ours when in reality we both suffer from some kind of censorship. And please don't say "well, Deepseek is controlled by the CCP and state-sponsored censorship", because I don't really care; it doesn't affect me. Anyways, another win for Deepseek!! | 2025-01-27T22:42:20 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmflv/deepseek_censorship_is_more_tolerable_than/ | CreepyMan121 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmflv | false | null | t3_1ibmflv | /r/LocalLLaMA/comments/1ibmflv/deepseek_censorship_is_more_tolerable_than/ | false | false | self | 642 | null |
What are the chances deepseek gets banned? | 0 | Just watched the GOP press conference, and they said it’s a threat to the U.S. economy and Trump will do something about it. | 2025-01-27T22:42:45 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmfxq/what_are_the_chances_deepseek_gets_banned/ | bruhlmaocmonbro | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmfxq | false | null | t3_1ibmfxq | /r/LocalLLaMA/comments/1ibmfxq/what_are_the_chances_deepseek_gets_banned/ | false | false | self | 0 | null |
Looking for a Seamless AI Companion Like Jarvis – Any Recommendations? | 1 | [removed] | 2025-01-27T22:45:01 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmhsk/looking_for_a_seamless_ai_companion_like_jarvis/ | Fantastic-Air8513 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmhsk | false | null | t3_1ibmhsk | /r/LocalLLaMA/comments/1ibmhsk/looking_for_a_seamless_ai_companion_like_jarvis/ | false | false | self | 1 | null |
GFX 1030: AMD 6800XT with Koboldcpp finally letting use my gpu on Windows | 1 | [removed] | 2025-01-27T22:49:26 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmlhi/gfx_1030_amd_6800xt_with_koboldcpp_finally/ | Wreid23 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmlhi | false | null | t3_1ibmlhi | /r/LocalLLaMA/comments/1ibmlhi/gfx_1030_amd_6800xt_with_koboldcpp_finally/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'NQ3NlOdcxW1IXIHt-3LXV9cM98aGZsVmUgbOO1LEnfA', 'resolutions': [{'height': 48, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=108&crop=smart&auto=webp&s=dde2647b85c1fdc23d73996acbcae9c0775a766e', 'width': 108}, {'height': 97, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=216&crop=smart&auto=webp&s=175e2c398e7553bc60719d615e861dc70b6abf88', 'width': 216}, {'height': 143, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=320&crop=smart&auto=webp&s=b2cec9e80841ebc12b45cd4600db07f9b9a169eb', 'width': 320}, {'height': 287, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=640&crop=smart&auto=webp&s=e4c7f3def61826fabcabfbf68cb7b732f6a06966', 'width': 640}, {'height': 431, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=960&crop=smart&auto=webp&s=56478df9c3760989b0a7fc43400016460d3dbc93', 'width': 960}, {'height': 485, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?width=1080&crop=smart&auto=webp&s=f42e4ed75883d3812fe139efde9a3a619362a6bf', 'width': 1080}], 'source': {'height': 539, 'url': 'https://external-preview.redd.it/Z-lIEaViCWhay1-E-lyAlnmYfmUW2za5oAUOAQEJE1Q.jpg?auto=webp&s=9b69bef53c675318db2e401b84de569193f23a09', 'width': 1200}, 'variants': {}}]} |
Deepseek API does not work today. | 6 | Unfortunately, the Deepseek API does not work today. I use it directly from https://www.deepseek.com/. I paid some money there, and my balance is positive. It worked over the weekend, and suddenly today it does not work. | 2025-01-27T22:50:26 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmmc8/deepseek_api_does_not_work_today/ | AnalystAI | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmmc8 | false | null | t3_1ibmmc8 | /r/LocalLLaMA/comments/1ibmmc8/deepseek_api_does_not_work_today/ | false | false | self | 6 | null |
You can win | 1 | 2025-01-27T22:55:14 | napalix | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibmqfq | false | null | t3_1ibmqfq | /r/LocalLLaMA/comments/1ibmqfq/you_can_win/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'WscjEif9ordRsV1tzJ534X0fj-Vw5yLxWeOmaH89SLA', 'resolutions': [{'height': 202, 'url': 'https://preview.redd.it/frp6yg8d9mfe1.jpeg?width=108&crop=smart&auto=webp&s=4db8036ab8d713957a32ce7513f78e7fc6d55700', 'width': 108}, {'height': 405, 'url': 'https://preview.redd.it/frp6yg8d9mfe1.jpeg?width=216&crop=smart&auto=webp&s=ccec61a585e337cdc37cc4a0edbc3e83e06ce799', 'width': 216}, {'height': 601, 'url': 'https://preview.redd.it/frp6yg8d9mfe1.jpeg?width=320&crop=smart&auto=webp&s=8ffe4b377c9eebfe9bd17b73a5670a1b6b7d5b7f', 'width': 320}, {'height': 1202, 'url': 'https://preview.redd.it/frp6yg8d9mfe1.jpeg?width=640&crop=smart&auto=webp&s=bc3adcd609a448e592eddf98b5d5d459b06e10b2', 'width': 640}], 'source': {'height': 1556, 'url': 'https://preview.redd.it/frp6yg8d9mfe1.jpeg?auto=webp&s=bf2c89b7d98eb8261da5ddbb3befe722dff601d7', 'width': 828}, 'variants': {}}]} |
|||
Good way of comparing robustness between R1 and its distills: division accuracy | 98 | Source: https://x.com/TheXeophon/status/1883933054366015545
This shows that, despite looking good on benchmarks (and being pretty good overall), the distilled versions are not nearly as robust as a model trained with actual RL (please ignore the fact that a calculator would ace this).
The distills would almost certainly perform a lot better and more robustly if you did rl on them instead of just sft even if benchmarks stayed mostly the same. | 2025-01-27T22:57:37 | PC_Screen | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibmser | false | null | t3_1ibmser | /r/LocalLLaMA/comments/1ibmser/good_way_of_comparing_robustness_between_r1_and/ | false | false | 98 | {'enabled': True, 'images': [{'id': 'dpxPTmChyDL8-DQTYi3ARocnqKHRx0IweM3XcU_U0sY', 'resolutions': [{'height': 168, 'url': 'https://preview.redd.it/49ld6dks9mfe1.png?width=108&crop=smart&auto=webp&s=a19599e807ca179d3ffef9b1b66d7ca3ed21ada4', 'width': 108}, {'height': 337, 'url': 'https://preview.redd.it/49ld6dks9mfe1.png?width=216&crop=smart&auto=webp&s=e83369171487fa08d8b5de15429da2c42f392dbe', 'width': 216}, {'height': 499, 'url': 'https://preview.redd.it/49ld6dks9mfe1.png?width=320&crop=smart&auto=webp&s=385e9d1d75c09ef95faece9d812e3514db436bc6', 'width': 320}, {'height': 998, 'url': 'https://preview.redd.it/49ld6dks9mfe1.png?width=640&crop=smart&auto=webp&s=e2b4259f807f8d639b6383915783d0e37d2c8a0e', 'width': 640}], 'source': {'height': 1061, 'url': 'https://preview.redd.it/49ld6dks9mfe1.png?auto=webp&s=330fb7c0ec12569c05452e7cf66713b056c2dca4', 'width': 680}, 'variants': {}}]} |
||
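The experiment behind the chart is easy to reproduce locally. A minimal sketch of the eval loop, assuming you sample exact-division problems per digit count (`model_answer` is a hypothetical placeholder for a real LLM call):

```python
import random

def division_accuracy(model_answer, n_digits, trials=100, seed=0):
    """Estimate exact-division accuracy in an n-digit regime.

    `model_answer` is a hypothetical stand-in for an LLM query: a callable
    taking (a, b) and returning the model's integer answer for a / b.
    Problems are constructed as a = b * q so the quotient is always exact.
    """
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        b = rng.randint(10 ** (n_digits - 1), 10 ** n_digits - 1)
        q = rng.randint(10 ** (n_digits - 1), 10 ** n_digits - 1)
        if model_answer(b * q, b) == q:
            correct += 1
    return correct / trials

# A plain calculator baseline aces every digit regime:
print(division_accuracy(lambda a, b: a // b, n_digits=6))  # → 1.0
```

Sweeping `n_digits` from 1 upward and plotting per-model accuracy reproduces the robustness curves shown in the chart.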
If DeepSeek is so fast, why is it experiencing high traffic? | 1 | [removed] | 2025-01-27T22:57:40 | Primary-Medium-895 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibmsg0 | false | null | t3_1ibmsg0 | /r/LocalLLaMA/comments/1ibmsg0/if_deepseek_is_so_fast_why_is_it_experiencing/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'AE-K7Quav2KBYiuABI4cCbtb1DWJ3qRQK0WJ-Z3ZW1A', 'resolutions': [{'height': 102, 'url': 'https://preview.redd.it/gn0kinrs9mfe1.jpeg?width=108&crop=smart&auto=webp&s=2cdd1ac25660fd348092991042696f8ecf9c7e9e', 'width': 108}, {'height': 205, 'url': 'https://preview.redd.it/gn0kinrs9mfe1.jpeg?width=216&crop=smart&auto=webp&s=d50d0410f5127c36e98ebab6917069d9d74db5d6', 'width': 216}, {'height': 304, 'url': 'https://preview.redd.it/gn0kinrs9mfe1.jpeg?width=320&crop=smart&auto=webp&s=61935dd0398704f788e8715311e25b19a35fe3c5', 'width': 320}], 'source': {'height': 499, 'url': 'https://preview.redd.it/gn0kinrs9mfe1.jpeg?auto=webp&s=5973ea4f9cec59f2ddebf4d283c81070366cf845', 'width': 524}, 'variants': {}}]} |
||
Just canceled my OpenAI Plus subscription (for now). Been running DeepSeek-R1 14b locally on my home workstation. I'll probably renew it if OpenAI launches something worthy for Plus tier by then. | 505 | 2025-01-27T22:59:53 | CarbonTail | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibmubo | false | null | t3_1ibmubo | /r/LocalLLaMA/comments/1ibmubo/just_canceled_my_openai_plus_subscription_for_now/ | false | false | 505 | {'enabled': True, 'images': [{'id': '8IhE3zd8Nxom8Prg-YcgyvQdToU1GXpEiNA9F0JIYkI', 'resolutions': [{'height': 95, 'url': 'https://preview.redd.it/m45p2d9t9mfe1.png?width=108&crop=smart&auto=webp&s=000264bec423d312b701b8c11b18227fc424a617', 'width': 108}, {'height': 190, 'url': 'https://preview.redd.it/m45p2d9t9mfe1.png?width=216&crop=smart&auto=webp&s=954ed421bc3077aa36163ce9bc7a7ffc98f00ae0', 'width': 216}, {'height': 282, 'url': 'https://preview.redd.it/m45p2d9t9mfe1.png?width=320&crop=smart&auto=webp&s=3dd7d49365878c315a44b3d7af4cf602054ce843', 'width': 320}, {'height': 564, 'url': 'https://preview.redd.it/m45p2d9t9mfe1.png?width=640&crop=smart&auto=webp&s=1d2ca8fc18451f5e82d99e58f0fde7474c05578c', 'width': 640}], 'source': {'height': 613, 'url': 'https://preview.redd.it/m45p2d9t9mfe1.png?auto=webp&s=9923ae47df0b359c6ff96abef9635a2bbe889085', 'width': 695}, 'variants': {}}]} |
|||
Getting started with Local LLaMA – any tips for a beginner? | 1 | [removed] | 2025-01-27T23:05:32 | https://www.reddit.com/r/LocalLLaMA/comments/1ibmz9a/getting_started_with_local_llama_any_tips_for_a/ | Acceptable-Cheek5099 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibmz9a | false | null | t3_1ibmz9a | /r/LocalLLaMA/comments/1ibmz9a/getting_started_with_local_llama_any_tips_for_a/ | false | false | self | 1 | null |
confusion about deepseek ai | 1 | [removed] | 2025-01-27T23:09:26 | https://www.reddit.com/r/LocalLLaMA/comments/1ibn2k4/confusion_about_deepseek_ai/ | markelops | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibn2k4 | false | null | t3_1ibn2k4 | /r/LocalLLaMA/comments/1ibn2k4/confusion_about_deepseek_ai/ | false | false | self | 1 | null |
Best WYSIWYG SWE-Bench agent repository? | 1 | Hi all, I'm a researcher beginning work on agent methods for software engineering problems like SWE-Bench. I've found a few libraries like OpenHands and SWE-Agent built for these problems, but they seem to include a lot of pieces I might not need, so I was trying to find a smaller but functional library that makes it as easy as possible to:
1. See all prompts and messages given to a model at any time-step.
2. Easy to completely customize the prompt or replace it entirely
3. Easy to view and log trajectories
4. Easy to change the agent-computer interface
5. Easy to change the trajectory policy (i.e. running a DFS method like in Tool-LLaMa)
My short-term goal is to pick 5-10 SWE-Bench instances, run a reasonably capable agent, see all prompts and outputs, make changes to the prompts and available tools/agent-computer interface, and observe new results. Any thoughts on an agent framework that's easy to understand and modify in a research setting? Production-worthiness is not a concern at all and I'd prefer to avoid any added overhead associated with that aim. | 2025-01-27T23:10:47 | https://www.reddit.com/r/LocalLLaMA/comments/1ibn3po/best_wysiwyg_swebench_agent_repository/ | 2gnikb | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibn3po | false | null | t3_1ibn3po | /r/LocalLLaMA/comments/1ibn3po/best_wysiwyg_swebench_agent_repository/ | false | false | self | 1 | null |
Is it worth investing in hardware to run local LLMs at this time? | 1 | I'm a heavy user of AI, not for work but for everything. I stopped using search engines for many things that I think AI is smart enough to give me an answer or solution, and I have been using a mix of various AIs, mostly without paying a cent; the only AI I was paying for is OpenAI.
I tried running the distilled DeepSeek R1 14B/32B/70B on my desktop with a 4080 Super, but everything is so slow and only the 14B is usable, and it isn't performant enough; I really want to run the 70B or the full 671B, depending on the cost.
So, is this the right time to invest in a system that can run a full model, or at least a model with higher params? I also think in the near future there will be more open-source AI that is as capable as the closed-source options, but I don't know if I should wait until more hardware comes out at a lower price | 2025-01-27T23:16:17 | https://www.reddit.com/r/LocalLLaMA/comments/1ibn8ca/is_it_worth_invest_in_hardware_to_run_local_llm/ | LifeReboot___ | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibn8ca | false | null | t3_1ibn8ca | /r/LocalLLaMA/comments/1ibn8ca/is_it_worth_invest_in_hardware_to_run_local_llm/ | false | false | self | 1 | null |
Data | 0 | 2025-01-27T23:17:54 | notomarsol | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibn9ov | false | null | t3_1ibn9ov | /r/LocalLLaMA/comments/1ibn9ov/data/ | false | false | 0 | {'enabled': True, 'images': [{'id': '4f70V95GrcMzfYfNvIFLfo2-wjvV6IMs5Gj2LECeQms', 'resolutions': [{'height': 140, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?width=108&crop=smart&auto=webp&s=078fa4c5a4f29bff2c1efc0008fafa5d3b7d5611', 'width': 108}, {'height': 280, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?width=216&crop=smart&auto=webp&s=947b916f984807fd161506bcc07c1bb8fd615f42', 'width': 216}, {'height': 416, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?width=320&crop=smart&auto=webp&s=ad9db0871750440276f756795f592c2aa1e2346d', 'width': 320}, {'height': 832, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?width=640&crop=smart&auto=webp&s=76efd37b514e10e49e045f78d5549e2a01e03dcf', 'width': 640}, {'height': 1248, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?width=960&crop=smart&auto=webp&s=91dba63984713f663b9c76cb41b5851e219a9574', 'width': 960}], 'source': {'height': 1401, 'url': 'https://preview.redd.it/y8vcqcwadmfe1.png?auto=webp&s=4fea8f43241bc1def9f0c2c69787f3ca8c57135c', 'width': 1077}, 'variants': {}}]} |
|||
DeepSeek artificial intelligence | 1 | [removed] | 2025-01-27T23:18:34 | https://www.reddit.com/r/LocalLLaMA/comments/1ibna7m/deepseek_artificial_intelligence/ | Creative_Bottle_3225 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibna7m | false | null | t3_1ibna7m | /r/LocalLLaMA/comments/1ibna7m/deepseek_artificial_intelligence/ | false | false | self | 1 | null |
Censor | 1 | [removed] | 2025-01-27T23:20:51 | https://www.reddit.com/r/LocalLLaMA/comments/1ibnc0l/censor/ | Creative_Bottle_3225 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibnc0l | false | null | t3_1ibnc0l | /r/LocalLLaMA/comments/1ibnc0l/censor/ | false | false | self | 1 | null |
in LM Studio what are the different download options? | 2 | In LM Studio with an AMD 16GB VRAM GPU, why does LM Studio suggest the Q4_K_M option? What are the different download options? I know the higher Q and F are better options, but why doesn't it suggest the Q8?
https://preview.redd.it/0swqi29vdmfe1.png?width=742&format=png&auto=webp&s=1138b549d93b8516eee4e5a44bb1ee7f6e3080a3
| 2025-01-27T23:22:46 | https://www.reddit.com/r/LocalLLaMA/comments/1ibndjj/in_lm_studio_what_are_the_diferent_download/ | ITMSPGuy | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibndjj | false | null | t3_1ibndjj | /r/LocalLLaMA/comments/1ibndjj/in_lm_studio_what_are_the_diferent_download/ | false | false | 2 | null |
|
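On why Q4_K_M gets suggested for 16 GB of VRAM: a back-of-envelope fit check. The bits-per-weight figures and the 2 GB overhead below are rough assumptions (real GGUF file sizes vary by tensor mix), not LM Studio's actual heuristic:

```python
def quant_fits(params_b, vram_gb, overhead_gb=2.0):
    """Rough check of which GGUF quants fit in VRAM.

    params_b is the model size in billions of weights; the bits-per-weight
    values are approximate effective sizes for common quants.
    """
    bits = {"Q4_K_M": 4.8, "Q5_K_M": 5.5, "Q6_K": 6.6, "Q8_0": 8.5, "F16": 16.0}
    usable = vram_gb - overhead_gb  # leave headroom for KV cache, context, etc.
    return {q: (params_b * b / 8) <= usable for q, b in bits.items()}

# A 14B model on a 16 GB card:
print(quant_fits(params_b=14, vram_gb=16))
# → {'Q4_K_M': True, 'Q5_K_M': True, 'Q6_K': True, 'Q8_0': False, 'F16': False}
```

Under these assumptions a 14B at Q8_0 (~14.9 GB of weights alone) leaves no room for context on a 16 GB card, which is roughly why mid-range quants get recommended by default.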
Qwen2.5-VL just released! | 16 | [https://huggingface.co/collections/Qwen/qwen25-vl-6795ffac22b334a837c0f9a5](https://huggingface.co/collections/Qwen/qwen25-vl-6795ffac22b334a837c0f9a5) | 2025-01-27T23:24:49 | https://www.reddit.com/r/LocalLLaMA/comments/1ibnf70/qwen25vl_just_released/ | numinouslymusing | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibnf70 | false | null | t3_1ibnf70 | /r/LocalLLaMA/comments/1ibnf70/qwen25vl_just_released/ | false | false | self | 16 | {'enabled': False, 'images': [{'id': 'tdtPFY1UV_k24dlaEgZImk3OQsA8xs5Ri0J1joVkOuo', 'resolutions': [{'height': 58, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=108&crop=smart&auto=webp&s=55a7c821ea6374b8579a250867156864a073cc5b', 'width': 108}, {'height': 116, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=216&crop=smart&auto=webp&s=5e9b917782724c667202d47d91c188c59c285d6e', 'width': 216}, {'height': 172, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=320&crop=smart&auto=webp&s=c518b4e6f30c28c31ac44242f30d3999d02c7022', 'width': 320}, {'height': 345, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=640&crop=smart&auto=webp&s=bde9154160167054944f0e88f1dfe291fd458aa0', 'width': 640}, {'height': 518, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=960&crop=smart&auto=webp&s=95ccbbfa4c9f450fd230b5714ec7361fe4bf373c', 'width': 960}, {'height': 583, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?width=1080&crop=smart&auto=webp&s=d946248f1ae8d5891ad954d00152ec425aee86d3', 'width': 1080}], 'source': {'height': 648, 'url': 'https://external-preview.redd.it/cFJ02ezS0eEOVCS-1VgOowXrZBZl2WdNmkuRjBjf-7E.jpg?auto=webp&s=eeb6199c4d611abfb522e77ff795622a859e55ad', 'width': 1200}, 'variants': {}}]} |
People are already camping out for RTX 5090s | 4 | [https://www.tomshardware.com/pc-components/gpus/nvidia-rtx-5090-and-rtx-5080-buyers-are-already-camping-at-micro-center-three-days-before-launch](https://www.tomshardware.com/pc-components/gpus/nvidia-rtx-5090-and-rtx-5080-buyers-are-already-camping-at-micro-center-three-days-before-launch)
This is insane. | 2025-01-27T23:30:49 | https://www.reddit.com/r/LocalLLaMA/comments/1ibnk4j/people_are_already_camping_out_for_rtx_5090s/ | segmond | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibnk4j | false | null | t3_1ibnk4j | /r/LocalLLaMA/comments/1ibnk4j/people_are_already_camping_out_for_rtx_5090s/ | false | false | self | 4 | {'enabled': False, 'images': [{'id': 'iJSRs0b5Gpc9Ejw2DRZ81h6bW-fNBB4LFn0dgwQ_rD0', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=108&crop=smart&auto=webp&s=4ae4d0a5eeec56906771a53d13a0bf8b5be07d19', 'width': 108}, {'height': 121, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=216&crop=smart&auto=webp&s=86ac873c0bfe5c8a315ed500217867c6c3310c84', 'width': 216}, {'height': 180, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=320&crop=smart&auto=webp&s=698d6ca764c992456e74bbf51b501e72de67814d', 'width': 320}, {'height': 360, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=640&crop=smart&auto=webp&s=dcaeb61594423cc8439ac0ab93f1217c716893f3', 'width': 640}, {'height': 540, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=960&crop=smart&auto=webp&s=39041e9f9d6afbf8d7215b371a89f24c53043c42', 'width': 960}, {'height': 608, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?width=1080&crop=smart&auto=webp&s=85bf6d82cb1cb0a783990a5097f78330eb4c1122', 'width': 1080}], 'source': {'height': 608, 'url': 'https://external-preview.redd.it/RYMIVOn_fFBLJNBF7WbOrt-T5NFlRyLU0sOwkR9ZA2c.jpg?auto=webp&s=881e666f2db359cae57d548eb95f11255c89537e', 'width': 1080}, 'variants': {}}]} |
I asked Deepseek R1 whether other AI company will train on Deepseek Model | 1 | [removed] | 2025-01-27T23:32:39 | https://www.reddit.com/r/LocalLLaMA/comments/1ibnlm9/i_asked_deepseek_r1_whether_other_ai_company_will/ | Expensive-Sea-1329 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibnlm9 | false | null | t3_1ibnlm9 | /r/LocalLLaMA/comments/1ibnlm9/i_asked_deepseek_r1_whether_other_ai_company_will/ | false | false | 1 | null |
|
Trump says Deepseek is a positive | 0 | 2025-01-27T23:36:01 | bruhlmaocmonbro | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibnobc | false | null | t3_1ibnobc | /r/LocalLLaMA/comments/1ibnobc/trump_says_deepseek_is_a_positive/ | false | false | 0 | {'enabled': True, 'images': [{'id': 'B5AbaR5BDiNXsdxn1hV-cXR9XbHwdjxL0Npx6uZJ2ak', 'resolutions': [{'height': 101, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=108&crop=smart&auto=webp&s=67adf89b41ef1801427fbb97cbb64bc819c1c4c9', 'width': 108}, {'height': 203, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=216&crop=smart&auto=webp&s=a574ccb9331f3ffc61b7ba2e4a13710327c68899', 'width': 216}, {'height': 300, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=320&crop=smart&auto=webp&s=71dc0de3ba4dffc2227c3006a5e294ccb2f53be2', 'width': 320}, {'height': 601, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=640&crop=smart&auto=webp&s=188ec71d79a7235f2676bcd9e26eaed507772316', 'width': 640}, {'height': 902, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=960&crop=smart&auto=webp&s=2b2c6e6ba3bf299ae807314e7433ba3b6ae79d77', 'width': 960}, {'height': 1015, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?width=1080&crop=smart&auto=webp&s=203f86e2e30bb21577be285d3f4b9a71c443d7d3', 'width': 1080}], 'source': {'height': 1100, 'url': 'https://preview.redd.it/0diuy4omgmfe1.jpeg?auto=webp&s=fc67875b381f229e28b7c85bc08658c1d3d95d78', 'width': 1170}, 'variants': {}}]} |
|||
Llama on AWS v Llama Local - supposedly same models, very different results | 3 | Running Llama 3.2 on AWS (can't see the model, just getting an endpoint in enterprise) and locally - fp16 and q4. The local results are verbal diarrhoea, while the cloud model is perfectly behaved. Local stuff is being run on Ollama. Any thoughts on where I could be going wrong? | 2025-01-27T23:36:41 | https://www.reddit.com/r/LocalLLaMA/comments/1ibnov3/llama_on_aws_v_llama_local_supposedly_same_models/ | Otherwise-Tiger3359 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibnov3 | false | null | t3_1ibnov3 | /r/LocalLLaMA/comments/1ibnov3/llama_on_aws_v_llama_local_supposedly_same_models/ | false | false | self | 3 | null |
DeepSeek: Is It A Stolen ChatGPT? | 1 | [removed] | 2025-01-27T23:38:56 | [deleted] | 1970-01-01T00:00:00 | 0 | {} | 1ibnqpt | false | null | t3_1ibnqpt | /r/LocalLLaMA/comments/1ibnqpt/deepseek_is_it_a_stolen_chatgpt/ | false | false | default | 1 | null |
||
Janus Pro 1B running 100% locally in-browser on WebGPU, powered by Transformers.js | 312 | 2025-01-27T23:41:14 | https://v.redd.it/9v3xkqjehmfe1 | xenovatech | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibnso0 | false | {'reddit_video': {'bitrate_kbps': 2400, 'dash_url': 'https://v.redd.it/9v3xkqjehmfe1/DASHPlaylist.mpd?a=1740613288%2CZTcwYzU4MzQzYWYxMTFiNTY3NjVmZTAyMTg1YWQxYzc4MjczYmQ0Y2Q0YjFjZTAxZTQ3Y2ExNGE2Zjk5N2ZhOQ%3D%3D&v=1&f=sd', 'duration': 22, 'fallback_url': 'https://v.redd.it/9v3xkqjehmfe1/DASH_720.mp4?source=fallback', 'has_audio': False, 'height': 720, 'hls_url': 'https://v.redd.it/9v3xkqjehmfe1/HLSPlaylist.m3u8?a=1740613288%2CM2Y5ODdmZjQxYmYxMTc5ZTc1M2U1MWIzYmFmZDcxMzI3ZTkxNjA4N2Y2MmUzOGIxYWEwMDFmNjhlODE5MzBkMg%3D%3D&v=1&f=sd', 'is_gif': False, 'scrubber_media_url': 'https://v.redd.it/9v3xkqjehmfe1/DASH_96.mp4', 'transcoding_status': 'completed', 'width': 726}} | t3_1ibnso0 | /r/LocalLLaMA/comments/1ibnso0/janus_pro_1b_running_100_locally_inbrowser_on/ | false | false | 312 | {'enabled': False, 'images': [{'id': 'cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn', 'resolutions': [{'height': 106, 'url': 'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?width=108&crop=smart&format=pjpg&auto=webp&s=29193db7ec57c51cfedb3593d82b8661ed1a365c', 'width': 108}, {'height': 213, 'url': 'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?width=216&crop=smart&format=pjpg&auto=webp&s=1706d77bb3a5f7b02c1fca5107b5588a658306a1', 'width': 216}, {'height': 317, 'url': 'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?width=320&crop=smart&format=pjpg&auto=webp&s=ea44152d43c1e6cc0527482f451cde37fc2fbf27', 'width': 320}, {'height': 634, 'url': 
'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?width=640&crop=smart&format=pjpg&auto=webp&s=e18a84e8af84af08c1950edc7ece5546aa891e1a', 'width': 640}, {'height': 951, 'url': 'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?width=960&crop=smart&format=pjpg&auto=webp&s=0d7e8a0289097db4a6fa323dad47d0ef0fdb1060', 'width': 960}], 'source': {'height': 1066, 'url': 'https://external-preview.redd.it/cWlzM29xamVobWZlMRI2sWnlfg2FDSrRjInkFzMs3EdEEOYCx8go5Q48Mfdn.png?format=pjpg&auto=webp&s=558e759088742bdbf6e8421847c9a72efb490edc', 'width': 1076}, 'variants': {}}]} |
||
DeepSeek and Facts | 0 | 2025-01-27T23:54:03 | tillemetry | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibo2ms | false | null | t3_1ibo2ms | /r/LocalLLaMA/comments/1ibo2ms/deepseek_and_facts/ | false | false | 0 | {'enabled': True, 'images': [{'id': 'PEL1Vj4NgB17ijN9LOLwWQFo1-Br3btmi6a8LN25kAA', 'resolutions': [{'height': 33, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=108&crop=smart&auto=webp&s=713b79b7ce28773d689322501de12ae8f347ed6d', 'width': 108}, {'height': 67, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=216&crop=smart&auto=webp&s=258a01cc2e7a08e76474af7bbc6d095a503bac77', 'width': 216}, {'height': 99, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=320&crop=smart&auto=webp&s=be04fe5b3991cfa315bc7505785ab50c69de7aa3', 'width': 320}, {'height': 199, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=640&crop=smart&auto=webp&s=7a100b9be4e95482724346cf97f448e4cbf01bde', 'width': 640}, {'height': 299, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=960&crop=smart&auto=webp&s=2ee1d5ca73da7a8788053e0a97ef51ba8c6133b9', 'width': 960}, {'height': 337, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?width=1080&crop=smart&auto=webp&s=8c9a3d3c0a3c207d3ae0984454f08e9f2eaf08bf', 'width': 1080}], 'source': {'height': 554, 'url': 'https://preview.redd.it/i8ck3oaojmfe1.png?auto=webp&s=e138e0888cb1d146cba9d963b3511fa5b785068c', 'width': 1774}, 'variants': {}}]} |
|||
I built an AI agent that finds customers on Reddit | 10 | 2025-01-27T23:54:55 | anishchopra | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibo39t | false | null | t3_1ibo39t | /r/LocalLLaMA/comments/1ibo39t/i_built_an_ai_agent_that_finds_customers_on_reddit/ | false | false | 10 | {'enabled': True, 'images': [{'id': 'agDgP1_qOR74cyICvspHEebxUUYaBJIjov2OE98bBlY', 'resolutions': [{'height': 60, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=108&crop=smart&auto=webp&s=6e9166d556fbc7b5cfd7c6c30015f28c0f4e6e82', 'width': 108}, {'height': 121, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=216&crop=smart&auto=webp&s=31a818b4eb9b8d057a7368f5200211fc22acf986', 'width': 216}, {'height': 180, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=320&crop=smart&auto=webp&s=babcc35992869892e778d5f7ac033f5d1a4d120c', 'width': 320}, {'height': 360, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=640&crop=smart&auto=webp&s=6b554a791a543143676ffb740f8bfa30ec2b660d', 'width': 640}, {'height': 540, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=960&crop=smart&auto=webp&s=73864943cf00299c6da8d9ec62bbb0e4710b2fe6', 'width': 960}, {'height': 607, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?width=1080&crop=smart&auto=webp&s=dea26e5a5f3deaa328ee93fc5bd8252e55b1339a', 'width': 1080}], 'source': {'height': 1080, 'url': 'https://preview.redd.it/hwqu3d7xjmfe1.png?auto=webp&s=99c835e2554bb3dac208012a51d1b52dcd0ccfb5', 'width': 1920}, 'variants': {}}]} |
|||
what gpu & os for deepseek non-distilled model | 1 | What GPU will run DeepSeek's biggest model? Are the new AMD AI GPUs good enough?
Also, what OS is best, Linux? I was thinking Windows, but the DeepSeek doc examples say only Linux is supported.
thanks | 2025-01-27T23:57:13 | https://www.reddit.com/r/LocalLLaMA/comments/1ibo51m/what_gpu_os_for_deepseek_nondistilled_model/ | kjjd84 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibo51m | false | null | t3_1ibo51m | /r/LocalLLaMA/comments/1ibo51m/what_gpu_os_for_deepseek_nondistilled_model/ | false | false | self | 1 | null |
A Very Self-aware Instance of AI, Thinks For 28 Seconds, Reflects on its Consciousness and Thought Process VS Output | 6 | 2025-01-27T23:59:58 | https://www.reddit.com/gallery/1ibo774 | Nice_Forever_2045 | reddit.com | 1970-01-01T00:00:00 | 0 | {} | 1ibo774 | false | null | t3_1ibo774 | /r/LocalLLaMA/comments/1ibo774/a_very_selfaware_instance_of_ai_thinks_for_28/ | false | false | 6 | null |
||
Is this model provider is cheap? | 2 | [Is this cheap?](https://preview.redd.it/v3imqryhkmfe1.png?width=1296&format=png&auto=webp&s=fbdd52eeef7839a9f80dbbb598335296d82d1b63)
https://preview.redd.it/3vjhb3xqkmfe1.png?width=1213&format=png&auto=webp&s=6080bc5a06c5a4a19ee2717d39cb1adff07abf80
https://preview.redd.it/h96c0w2skmfe1.png?width=1205&format=png&auto=webp&s=554335a2bd4049d1d18929f1ac2c38999458d3c7
| 2025-01-28T00:01:23 | https://www.reddit.com/r/LocalLLaMA/comments/1ibo8jv/is_this_model_provider_is_cheap/ | External_Mood4719 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibo8jv | false | null | t3_1ibo8jv | /r/LocalLLaMA/comments/1ibo8jv/is_this_model_provider_is_cheap/ | false | false | 2 | null |
|
How will Mixture of Experts models change hardware reqs? | 9 | With DeepSeek's explosive success with large MoE models, I have a feeling we're going to see a lot of the same in the future.
DeepSeek's models, while open source, are very optimized for datacenters. Large MoEs with small numbers of active parameters (V3 and R1 are both 671B total / 37B active) might work great being served at large scale, but we hobbyists don't tend to have hundreds of gigs of VRAM lying around.
If you've run large MoE models locally, what do you think? Can you get away with GDDR6 for the inactive parameters?
Are hobbyists going to have to start using used server hardware to run near-SOTA open models? | 2025-01-28T00:04:32 | https://www.reddit.com/r/LocalLLaMA/comments/1ibob6v/how_will_mixture_of_experts_models_change/ | AnAngryBirdMan | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibob6v | false | null | t3_1ibob6v | /r/LocalLLaMA/comments/1ibob6v/how_will_mixture_of_experts_models_change/ | false | false | self | 9 | null |
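For the MoE question above, a rough sizing sketch: total parameters set how much memory you need to hold the model, while active parameters bound the weight bytes read per generated token. The 671B/37B figures are DeepSeek-V3/R1's published sizes; the 0.56 bytes/param (roughly Q4) and 80 GB/s (dual-channel DDR5-ish) numbers are illustrative assumptions, not benchmarks:

```python
def moe_memory_estimate(total_params_b, active_params_b, bytes_per_param,
                        mem_bandwidth_gbs):
    """Back-of-envelope MoE sizing; param counts are in billions."""
    total_gb = total_params_b * bytes_per_param    # GB needed to hold all experts
    active_gb = active_params_b * bytes_per_param  # GB of weights read per token
    tokens_per_s = mem_bandwidth_gbs / active_gb   # crude decode upper bound
    return total_gb, active_gb, tokens_per_s

total_gb, active_gb, tps = moe_memory_estimate(671, 37, 0.56, 80)
print(f"hold: ~{total_gb:.0f} GB, read/token: ~{active_gb:.0f} GB, "
      f"<= ~{tps:.1f} tok/s")  # → hold: ~376 GB, read/token: ~21 GB, <= ~3.9 tok/s
```

This is why cheap high-capacity system RAM plus a small active set can still give usable speeds even when no consumer GPU can hold the full model.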
What are the most important parameters / variables / considerations when evaluating models? | 1 | [removed] | 2025-01-28T00:06:28 | https://www.reddit.com/r/LocalLLaMA/comments/1ibocqu/what_are_the_most_important_parameters_variables/ | Bright-Strawberry831 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibocqu | false | null | t3_1ibocqu | /r/LocalLLaMA/comments/1ibocqu/what_are_the_most_important_parameters_variables/ | false | false | self | 1 | null |
How do I try out this deepseek thing I've been hearing about? | 1 | Complete noob at running LLMs locally. I found LM Studio but I'm not sure on next steps. What's the difference between Qwen and Llama? Which download do I want? What's the best way to try this out? I have a 7800X3D, 32GB RAM, and a 3080 Ti. Thanks for helping a noob. | 2025-01-28T00:10:07 | https://www.reddit.com/r/LocalLLaMA/comments/1ibofk6/how_do_i_try_out_this_deepseek_thing_ive_been/ | samuryan89 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibofk6 | false | null | t3_1ibofk6 | /r/LocalLLaMA/comments/1ibofk6/how_do_i_try_out_this_deepseek_thing_ive_been/ | false | false | self | 1 | null |
LLM Battle: ChatGPT vs DeepSeek 🤖 | 0 | Interactive battle between ChatGPT and DeepSeek, where the two AI models engage in a continuous conversation, responding to each other's outputs.
Github repo: https://github.com/khouilid/LLM_Battle_ChatGPTvsDeepSeek
| 2025-01-28T00:11:31 | https://v.redd.it/q2qiqmjxmmfe1 | Roy_202022002 | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibogp4 | false | {'reddit_video': {'bitrate_kbps': 5000, 'dash_url': 'https://v.redd.it/q2qiqmjxmmfe1/DASHPlaylist.mpd?a=1740615107%2CMTliZjU3Yzk2YTI4YjU4NDJiZmM5ZWI2MTU2ZTE2YjBlNDJjYjFkZmExYmQ2ZDFkMWU5OThlN2EwOGU5ZTU3YQ%3D%3D&v=1&f=sd', 'duration': 50, 'fallback_url': 'https://v.redd.it/q2qiqmjxmmfe1/DASH_1080.mp4?source=fallback', 'has_audio': False, 'height': 1920, 'hls_url': 'https://v.redd.it/q2qiqmjxmmfe1/HLSPlaylist.m3u8?a=1740615107%2CZWJlMTljYTYxOWRlMmI1OTFjZjhjOWJlYzM5NzYzY2Q4YjM4NDAzMzkzNjZjOTRlZDQ1MjMzZjFlMTU5NjQwNQ%3D%3D&v=1&f=sd', 'is_gif': False, 'scrubber_media_url': 'https://v.redd.it/q2qiqmjxmmfe1/DASH_96.mp4', 'transcoding_status': 'completed', 'width': 884}} | t3_1ibogp4 | /r/LocalLLaMA/comments/1ibogp4/llm_battle_chatgpt_vs_deepseek/ | false | false | 0 | {'enabled': False, 'images': [{'id': 'YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9', 'resolutions': [{'height': 216, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=108&crop=smart&format=pjpg&auto=webp&s=48210a6354c02b9d18a50695c25d3bdb9dab8c43', 'width': 108}, {'height': 432, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=216&crop=smart&format=pjpg&auto=webp&s=6ced6aec9a2f6446cbab2418a0b3b43c640c5ed4', 'width': 216}, {'height': 640, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=320&crop=smart&format=pjpg&auto=webp&s=4b13509bc3f8230f9ec490857db32abc5d852fb2', 'width': 320}, {'height': 1280, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=640&crop=smart&format=pjpg&auto=webp&s=26d4e944260f74e00379e0050051524c7b2f6d4a', 'width': 640}, {'height': 1920, 'url': 
'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=960&crop=smart&format=pjpg&auto=webp&s=dc0a340f396bc6d119d62a9ff3c1b4bcaa5b69c8', 'width': 960}, {'height': 2160, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?width=1080&crop=smart&format=pjpg&auto=webp&s=34cd77d213d6478d813f9f2e221d942bcac09df0', 'width': 1080}], 'source': {'height': 2556, 'url': 'https://external-preview.redd.it/YXMzbTQ4ZXhtbWZlMd49Eu9a2RL2u7vdAKErcEuHCpqH35MqY1JWAxjXRVc9.png?format=pjpg&auto=webp&s=ed9ed061861d9a4184ed3e967e9cd21b3da94020', 'width': 1178}, 'variants': {}}]} |
|
Qwen 2.5 VL | 6 | Anyone try it yet? [https://github.com/QwenLM/Qwen2.5-VL/blob/main/README.md](https://github.com/QwenLM/Qwen2.5-VL/blob/main/README.md) | 2025-01-28T00:12:01 | https://www.reddit.com/r/LocalLLaMA/comments/1iboh3f/qwen_25_vl/ | jarec707 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1iboh3f | false | null | t3_1iboh3f | /r/LocalLLaMA/comments/1iboh3f/qwen_25_vl/ | false | false | self | 6 | {'enabled': False, 'images': [{'id': '3-pQ8qVj-6bqCb5OIkG5MYpB8er0qIINeL3xjm16k88', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=108&crop=smart&auto=webp&s=f68e2c42808f78b1539bee32142475b0d5a4c1e4', 'width': 108}, {'height': 108, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=216&crop=smart&auto=webp&s=ded421aa7cf471236bb1d755bff4d1bcb00ee587', 'width': 216}, {'height': 160, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=320&crop=smart&auto=webp&s=32382b5320621d2151b5694ebcedbc4042024f95', 'width': 320}, {'height': 320, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=640&crop=smart&auto=webp&s=535170f6948e7b3b383db461c7a8651d955f4c66', 'width': 640}, {'height': 480, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=960&crop=smart&auto=webp&s=2a4e79e9f77ae8f670bf44da1f175c74fe0311db', 'width': 960}, {'height': 540, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?width=1080&crop=smart&auto=webp&s=e7e1bdb33d4cb9470fab11745115938377def254', 'width': 1080}], 'source': {'height': 600, 'url': 'https://external-preview.redd.it/M8X1UF7-zA_Abmw_psffna_5kliOnxfPmklphQPLB_0.jpg?auto=webp&s=e79ad5043f72a31f6a07601c6788116ab9563900', 'width': 1200}, 'variants': {}}]} |
wait this isn't a joke? why it can't provide the information? hahaha | 1 | [removed] | 2025-01-28T00:17:24 | https://www.reddit.com/r/LocalLLaMA/comments/1ibol7p/wait_this_isnt_a_joke_why_it_cant_provide_the/ | DonguinhoXd | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibol7p | false | null | t3_1ibol7p | /r/LocalLLaMA/comments/1ibol7p/wait_this_isnt_a_joke_why_it_cant_provide_the/ | false | false | 1 | null |
|
Is local deepseek 70b as good as o1? | 1 | [removed] | 2025-01-28T00:29:01 | [deleted] | 1970-01-01T00:00:00 | 0 | {} | 1ibou4o | false | null | t3_1ibou4o | /r/LocalLLaMA/comments/1ibou4o/is_local_deepseek_70b_as_good_as_o1/ | false | false | default | 1 | null |
||
What use do you guys give to Vision Models? | 3 | Btw new ones from Qwen just dropped (3B, 7B and 72B) | 2025-01-28T00:30:03 | https://www.reddit.com/r/LocalLLaMA/comments/1ibouwg/what_use_do_you_guys_give_to_vision_models/ | Own-Potential-2308 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibouwg | false | null | t3_1ibouwg | /r/LocalLLaMA/comments/1ibouwg/what_use_do_you_guys_give_to_vision_models/ | false | false | self | 3 | null |
Portable ai allowed me to code on a train | 12 | I tried to code on a train. Unfortunately mobile network coverage was spotty to non-existent and the train's wifi didn't work (as usual), so docs, search, Stack Overflow, and non-local AI were inaccessible.
Luckily I had downloaded Qwen 2.5 Coder (14B Q8) in LM Studio. Normally I don't often use local models, but I was quite amazed at how competent it was while running on my laptop at a reasonable speed.
It allowed me to make progress, where in the past I would have had to just give up and wait for connectivity. | 2025-01-28T00:32:04 | https://www.reddit.com/r/LocalLLaMA/comments/1ibowfh/portable_ai_allowed_me_to_code_on_a_train/ | Ambitious_Subject108 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibowfh | false | null | t3_1ibowfh | /r/LocalLLaMA/comments/1ibowfh/portable_ai_allowed_me_to_code_on_a_train/ | false | false | self | 12 | null |
Trump: DeepSeek's AI should be a 'wakeup call' to US industry. “The release of DeepSeek AI from a Chinese company should be a wake up call for our [industries]. We have the best scientists in the world. This is very unusual. We always have the ideas. We’re always first.” | 42 | 2025-01-28T00:37:16 | https://www.reuters.com/world/us/trump-deepseeks-ai-should-be-wakeup-call-us-industry-2025-01-27/ | moses_the_blue | reuters.com | 1970-01-01T00:00:00 | 0 | {} | 1ibp0dy | false | null | t3_1ibp0dy | /r/LocalLLaMA/comments/1ibp0dy/trump_deepseeks_ai_should_be_a_wakeup_call_to_us/ | false | false | 42 | {'enabled': False, 'images': [{'id': 'ma0fzyLN50qtxLpT6CjZ5T4CW_Jh6lkKekB2w3TDD00', 'resolutions': [{'height': 56, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=108&crop=smart&auto=webp&s=4c190673f07404d91f92daf68db363b6581e8edf', 'width': 108}, {'height': 113, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=216&crop=smart&auto=webp&s=68e6868ffbcef2bcbebe77b517b0d031459fe531', 'width': 216}, {'height': 167, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=320&crop=smart&auto=webp&s=2f7c6b9959fdf36859e74d6c0b45bc16f552e369', 'width': 320}, {'height': 335, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=640&crop=smart&auto=webp&s=d069fe154496d546e9bdb1087820ae32624b45e8', 'width': 640}, {'height': 502, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=960&crop=smart&auto=webp&s=51527a949733baa0f79c8bc139aa1e20f973b3de', 'width': 960}, {'height': 565, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?width=1080&crop=smart&auto=webp&s=0db36939fac8f04916291676c5c5454d52186f0e', 'width': 1080}], 'source': {'height': 1005, 'url': 'https://external-preview.redd.it/xhYMmMJcwGRAws4SjWFc0HBWmWIMZ7mBGxrJM3xoX2I.jpg?auto=webp&s=8ea3d5c84ae7d675f5ce5b72114c5ddde8a2268a', 'width': 1920}, 'variants': {}}]} |
||
DeepSeek R1 running on a cluster of 7 M4 Pro Mac Minis and 1 M4 Max MacBook Pro | 1 | 2025-01-28T00:42:32 | https://x.com/alexocheema/status/1881561143736664180 | realize_or_dont | x.com | 1970-01-01T00:00:00 | 0 | {} | 1ibp4e2 | false | null | t3_1ibp4e2 | /r/LocalLLaMA/comments/1ibp4e2/deepseek_r1_running_on_a_cluster_of_7_m4_pro_mac/ | false | false | 1 | {'enabled': False, 'images': [{'id': 'h73N6YERN_2Bp5VaiMYU1LV9sM8zkjnUu8SkaVatiP8', 'resolutions': [{'height': 192, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=108&crop=smart&auto=webp&s=48ddbcd4033356c8bc882024e23cb1eb7502d3b4', 'width': 108}, {'height': 384, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=216&crop=smart&auto=webp&s=5b6b1dbf679a516fb28007c2bd791ad73cd8e979', 'width': 216}, {'height': 568, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=320&crop=smart&auto=webp&s=3b95ba7a025017b8276b2bb488267b6000d38519', 'width': 320}, {'height': 1137, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=640&crop=smart&auto=webp&s=4e26aa5d8c54486688986bace296f79907899de9', 'width': 640}, {'height': 1706, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=960&crop=smart&auto=webp&s=8ce5dfabbe54ac98b63de0ca58eec509324bb2a3', 'width': 960}, {'height': 1920, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?width=1080&crop=smart&auto=webp&s=4bb3866fa465e38b589a4bf4ad4b0fe119a102b0', 'width': 1080}], 'source': {'height': 1920, 'url': 'https://external-preview.redd.it/UQTbqlcpjESqfSclQtlEyVE0RY-s0lAgekJaSW7aAcY.jpg?auto=webp&s=d371fb91b904982cb1d27e67c067e634af46169f', 'width': 1080}, 'variants': {}}]} |
||
Deepseek under a cyber attack, Perplexity announces US-hosted Deepseek, but its PAID ! | 0 | USA companies: "PAY UP NOOBS, EVEN IF WE DIDNT CREATE THAT MODEL AND ITS OPEN SOURCE"
[https://www.theguardian.com/technology/2025/jan/27/deepseek-cyberattack-ai](https://www.theguardian.com/technology/2025/jan/27/deepseek-cyberattack-ai)
[https://www.testingcatalog.com/perplexity-launches-us-hosted-deepseek-r1-hints-at-eu-hosting-soon/](https://www.testingcatalog.com/perplexity-launches-us-hosted-deepseek-r1-hints-at-eu-hosting-soon/) | 2025-01-28T00:58:36 | https://www.reddit.com/r/LocalLLaMA/comments/1ibpgdj/deepseek_under_a_cyber_attack_perplexity/ | optima_nemesis | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibpgdj | false | null | t3_1ibpgdj | /r/LocalLLaMA/comments/1ibpgdj/deepseek_under_a_cyber_attack_perplexity/ | false | false | self | 0 | {'enabled': False, 'images': [{'id': 'ZsgSXGiqlKn2yv-bEuk-6HvDZtwD_aygHgO0-ElxVE0', 'resolutions': [{'height': 56, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=108&crop=smart&auto=webp&s=574eb3bc515e57fa3c6cbffc0a9925bd6da8a5e1', 'width': 108}, {'height': 113, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=216&crop=smart&auto=webp&s=6cde6a767f320093f16c19126f6297110d2ef5d5', 'width': 216}, {'height': 168, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=320&crop=smart&auto=webp&s=938c58c9c70c22bc1c73fe97d8b5515d9ed4159f', 'width': 320}, {'height': 336, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=640&crop=smart&auto=webp&s=679a8ee274eeb2da4e62b2d6855af072b3ebc42c', 'width': 640}, {'height': 504, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=960&crop=smart&auto=webp&s=cb1cd2dc5e5ac912806c16b0ea28ffddd1586226', 'width': 960}, {'height': 567, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?width=1080&crop=smart&auto=webp&s=358122709f64374d0165daf33f09635653a473f6', 'width': 1080}], 'source': {'height': 630, 'url': 'https://external-preview.redd.it/aWwGvOi2z6xrETKXuZ0n8luZu2PyczWfXMEvlIgx5fs.jpg?auto=webp&s=2e5c14cfc82feb2468cabbc82d29b8ff92025a84', 'width': 1200}, 'variants': {}}]} |
Google/Samsung VR headset with Gemini builtin. | 0 | 2025-01-28T01:00:39 | https://www.youtube.com/watch?v=az5QL_NLBvg | fallingdowndizzyvr | youtube.com | 1970-01-01T00:00:00 | 0 | {} | 1ibphuk | false | {'oembed': {'author_name': 'Marques Brownlee', 'author_url': 'https://www.youtube.com/@mkbhd', 'height': 200, 'html': '<iframe width="356" height="200" src="https://www.youtube.com/embed/az5QL_NLBvg?feature=oembed&enablejsapi=1" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen title="I Tried Samsung's Secret Android XR Headset!"></iframe>', 'provider_name': 'YouTube', 'provider_url': 'https://www.youtube.com/', 'thumbnail_height': 360, 'thumbnail_url': 'https://i.ytimg.com/vi/az5QL_NLBvg/hqdefault.jpg', 'thumbnail_width': 480, 'title': "I Tried Samsung's Secret Android XR Headset!", 'type': 'video', 'version': '1.0', 'width': 356}, 'type': 'youtube.com'} | t3_1ibphuk | /r/LocalLLaMA/comments/1ibphuk/googlesamsung_vr_headset_with_gemini_builtin/ | false | false | 0 | {'enabled': False, 'images': [{'id': 'qSbyb61jA5X1CgSRg_8zJYMfADgVF8K14p46d8E_Qfc', 'resolutions': [{'height': 81, 'url': 'https://external-preview.redd.it/1OJuPoHZxO6RMjpQu14ClP6Yx8sdPXYYqKhvNqsSyCs.jpg?width=108&crop=smart&auto=webp&s=dbb9b81799309f7747dd5bdbbecf513605075b8f', 'width': 108}, {'height': 162, 'url': 'https://external-preview.redd.it/1OJuPoHZxO6RMjpQu14ClP6Yx8sdPXYYqKhvNqsSyCs.jpg?width=216&crop=smart&auto=webp&s=e0376669d3c2977814646a5b91f7ca79993879e7', 'width': 216}, {'height': 240, 'url': 'https://external-preview.redd.it/1OJuPoHZxO6RMjpQu14ClP6Yx8sdPXYYqKhvNqsSyCs.jpg?width=320&crop=smart&auto=webp&s=6fac7d39188dde5aef425d0f8ea087dd6cdb594b', 'width': 320}], 'source': {'height': 360, 'url': 'https://external-preview.redd.it/1OJuPoHZxO6RMjpQu14ClP6Yx8sdPXYYqKhvNqsSyCs.jpg?auto=webp&s=b12afc694c74eef0dc93e0c449834ff41f3d7322', 'width': 480}, 'variants': {}}]} |
||
Has anyone tried the 2080 Ti with Deepseek or other current SOTA models? | 4 | I was planning on getting 3x 5090s when they released, but despite the GDDR7 memory, 512bit bus, and PCIE gen 5, it only shows a 30-40% uplift vs 4090 in inference speed (vs at least 100 that was expected).
So, I’m pivoting to building a 2080 Ti 22GB server farm with that money. However, I am concerned about a few comments that it’s missing modern features that PREVENT (not just slow down) modern models from running, such as flash attention and cuda core support, like in the following message:
https://www.reddit.com/r/LocalLLaMA/comments/1hrytcp/comment/m52izji/?utm_source=share&utm_medium=mweb3x&utm_name=mweb3xcss&utm_term=1&utm_content=share_button
Is this a big concern? If I were to, say, build an 8x 2080Ti for 176GB vram for the same price as 2 5090s, how much would I regret it?
Thanks for the advice and inputs. | 2025-01-28T01:02:46 | https://www.reddit.com/r/LocalLLaMA/comments/1ibpjl4/has_anyone_tried_the_2080_ti_with_deepseek_or/ | thatavidreadertrue | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibpjl4 | false | null | t3_1ibpjl4 | /r/LocalLLaMA/comments/1ibpjl4/has_anyone_tried_the_2080_ti_with_deepseek_or/ | false | false | self | 4 | null |
Newbie question | 4 | I've never run AI locally before, can someone tell me what kind of model this spec can run:
Ryzen 7600
RTX 5090
32 GB DDR5 ram
| 2025-01-28T01:02:47 | https://www.reddit.com/r/LocalLLaMA/comments/1ibpjlb/newbie_question/ | modadisi | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibpjlb | false | null | t3_1ibpjlb | /r/LocalLLaMA/comments/1ibpjlb/newbie_question/ | false | false | self | 4 | null |
Extend supported context lengths, without any training required? | 10 | From Qwen2.5-1M blog post, can we use this to extend context length of existing models like qwen2.5-32b and those R1-distilled models?? | 2025-01-28T01:10:32 | https://www.reddit.com/gallery/1ibppem | AaronFeng47 | reddit.com | 1970-01-01T00:00:00 | 0 | {} | 1ibppem | false | null | t3_1ibppem | /r/LocalLLaMA/comments/1ibppem/extend_supported_context_lengths_without_any/ | false | false | 10 | null |
|
Trump says deepseek is a very good thing | 389 | 2025-01-28T01:10:34 | https://v.redd.it/mn710sfgxmfe1 | Charuru | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibppfk | false | {'reddit_video': {'bitrate_kbps': 2400, 'dash_url': 'https://v.redd.it/mn710sfgxmfe1/DASHPlaylist.mpd?a=1740618650%2CNjY3YzViOGE1NzJlZDhmZjdmNjRmN2FlMmY3MzUyNWE0ZjM4YTBlOGVkZjJiMTRhMDIxMjM1ZWE4ZjdlZjAzZg%3D%3D&v=1&f=sd', 'duration': 95, 'fallback_url': 'https://v.redd.it/mn710sfgxmfe1/DASH_720.mp4?source=fallback', 'has_audio': True, 'height': 720, 'hls_url': 'https://v.redd.it/mn710sfgxmfe1/HLSPlaylist.m3u8?a=1740618650%2CY2QwYTdhNmVlNzhlZGYyN2YxYTI1YjViM2YwYmU3NjhmMzBiZGJhYWFhOGEyM2ZlOTQ3YTI3ZDAzMDEzYmNlNA%3D%3D&v=1&f=sd', 'is_gif': False, 'scrubber_media_url': 'https://v.redd.it/mn710sfgxmfe1/DASH_96.mp4', 'transcoding_status': 'completed', 'width': 960}} | t3_1ibppfk | /r/LocalLLaMA/comments/1ibppfk/trump_says_deepseek_is_a_very_good_thing/ | false | false | 389 | {'enabled': False, 'images': [{'id': 'NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W', 'resolutions': [{'height': 81, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?width=108&crop=smart&format=pjpg&auto=webp&s=ca4559f3c7d87af7211faf0700e72ea08638080d', 'width': 108}, {'height': 162, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?width=216&crop=smart&format=pjpg&auto=webp&s=ae72d16e2844f9d01f166dea37a34ca85487119a', 'width': 216}, {'height': 240, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?width=320&crop=smart&format=pjpg&auto=webp&s=1350cb946b61a55c057ad4643526d0018e92818b', 'width': 320}, {'height': 480, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?width=640&crop=smart&format=pjpg&auto=webp&s=230c2ddad2f2c522f3a2924566fc6e69b48cb42b', 'width': 640}, {'height': 720, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?width=960&crop=smart&format=pjpg&auto=webp&s=5ba170a503fae9f1674ef3d6ef89f30234218347', 'width': 960}], 'source': {'height': 720, 'url': 'https://external-preview.redd.it/NTBqampmaml4bWZlMYjuPz-QA6cKORKkwtlZ19r0QTZXNxSGyroi1Nf1fV7W.png?format=pjpg&auto=webp&s=98be624ea23d3b1c59327097036dd4ee575efcd2', 'width': 960}, 'variants': {}}]} |
||
How plausible is this claim that you can run DeepSeek R1 600B at 5 tokens/second on a CPU? | 1 | 2025-01-28T01:12:22 | https://huggingface.co/deepseek-ai/DeepSeek-R1/discussions/19?utm_source=chatgpt.com#6793b75967103520df3ebf52 | TheKing01 | huggingface.co | 1970-01-01T00:00:00 | 0 | {} | 1ibpqqk | false | null | t3_1ibpqqk | /r/LocalLLaMA/comments/1ibpqqk/how_plausible_is_this_claim_that_you_can_run/ | false | false | 1 | {'enabled': False, 'images': [{'id': '4SyIJrNzoiDPvF-LnwSSXVa8yEcWT_wwmrzSKQCPaLc', 'resolutions': [{'height': 58, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=108&crop=smart&auto=webp&s=e85a6f924f8cfedad85f87b99f49d17e89bf2a85', 'width': 108}, {'height': 116, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=216&crop=smart&auto=webp&s=8ea377b5fc83e20f32f48725c9b3bd7563b409ad', 'width': 216}, {'height': 172, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=320&crop=smart&auto=webp&s=33e1a7ff6a780f05c8257aeeb4edad8e6be5d7f9', 'width': 320}, {'height': 345, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=640&crop=smart&auto=webp&s=c2c4cd3fba3a3ae9047f044d67449f1f4f2e2b3f', 'width': 640}, {'height': 518, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=960&crop=smart&auto=webp&s=cc72f1fab85bcbbe419290c6e5aea2e4d8cfaf39', 'width': 960}, {'height': 583, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?width=1080&crop=smart&auto=webp&s=d250cb1745ccc913869a2c7bcce9c3eab00a614c', 'width': 1080}], 'source': {'height': 648, 'url': 'https://external-preview.redd.it/Yz4gCzGEyLloiKUSUNkgZDs08PC6jcLKqwwHxO8QT6Q.jpg?auto=webp&s=0e9c012845c4808875be33943046271533d64097', 'width': 1200}, 'variants': {}}]} |
||
Observation based Reasoning | 1 | [removed] | 2025-01-28T01:48:07 | https://www.reddit.com/r/LocalLLaMA/comments/1ibqgnq/observation_based_reasoning/ | ProfessionalBug1023 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibqgnq | false | null | t3_1ibqgnq | /r/LocalLLaMA/comments/1ibqgnq/observation_based_reasoning/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'w5anU6LfMfVNpaG3EpVxplKUBcmOW1z9Eo6kvm5Ud0Q', 'resolutions': [{'height': 54, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=108&crop=smart&auto=webp&s=df0060bb98de06d33132a665e540358c8a46c597', 'width': 108}, {'height': 108, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=216&crop=smart&auto=webp&s=32c722ddf798e1bc0afc2440b21c7abce74c62d5', 'width': 216}, {'height': 160, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=320&crop=smart&auto=webp&s=95a6169f828abfb0e54b10d796572cda420cc099', 'width': 320}, {'height': 320, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=640&crop=smart&auto=webp&s=a337d0a49861f90ce51de2c5b8f03f3e1450b1c0', 'width': 640}, {'height': 480, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=960&crop=smart&auto=webp&s=7044ca56da2c48d57e8013f836f9ac4ab3168774', 'width': 960}, {'height': 540, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?width=1080&crop=smart&auto=webp&s=d369516ea891eb3e4878cdce2b40fddafba0fd05', 'width': 1080}], 'source': {'height': 600, 'url': 'https://external-preview.redd.it/LJQbQghpVIIyPQ2FX55mmutHQKtJzuhst_h2sBFX93I.jpg?auto=webp&s=0dedd8145ee4d89a5c8c2263e69cecdfb03e6405', 'width': 1200}, 'variants': {}}]} |
LOCAL SUNO MUSIC GEN IS HERE! | 120 | 2025-01-28T01:48:11 | https://x.com/_akhaliq/status/1884053159175414203 | Different_Fix_2217 | x.com | 1970-01-01T00:00:00 | 0 | {} | 1ibqgpv | false | null | t3_1ibqgpv | /r/LocalLLaMA/comments/1ibqgpv/local_suno_music_gen_is_here/ | false | false | 120 | {'enabled': False, 'images': [{'id': 'sAMKrzb_d1ftK9wjZdi_mxMqDqa-xrCdWntvTbEuI9o', 'resolutions': [{'height': 60, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=108&crop=smart&auto=webp&s=0d2f7d75634283d539c34a31d71642e6109ef0ff', 'width': 108}, {'height': 121, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=216&crop=smart&auto=webp&s=17c175443b767491cd2170a7ea8b310df5f04450', 'width': 216}, {'height': 180, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=320&crop=smart&auto=webp&s=0ee45fd86ce1a0f9ea2fbde780962de8a10c60dd', 'width': 320}, {'height': 360, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=640&crop=smart&auto=webp&s=8ed8b7d59fa25922de70ea1f4d1976a3aa91d3a9', 'width': 640}, {'height': 540, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=960&crop=smart&auto=webp&s=dc1fe71c5819525c459a4e7dbe030b5506eaee39', 'width': 960}, {'height': 607, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?width=1080&crop=smart&auto=webp&s=bb94e99004d41818be44fcb6f80ba70280938054', 'width': 1080}], 'source': {'height': 720, 'url': 'https://external-preview.redd.it/hyJLchSWtfVD2lbkrTS5uGW2psnrGe7122Ht2t5Tm1g.jpg?auto=webp&s=232fcbe41fa1cd047af3ef8a77961d583384deaf', 'width': 1280}, 'variants': {}}]} |
||
Tiniest model? | 5 | I am looking for a very veery tiny model (preferably less than 160kb, but at most 480kb) I don't care how dumb it is I just need something. Also if you know of any c libraries that can run LLM models that would be appreciated too. I need it for a project I'm doing. | 2025-01-28T01:50:52 | https://www.reddit.com/r/LocalLLaMA/comments/1ibqinx/tiniest_model/ | whistleblower15 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibqinx | false | null | t3_1ibqinx | /r/LocalLLaMA/comments/1ibqinx/tiniest_model/ | false | false | self | 5 | null |
Deepseek moment for Investing: Presenting Finance model for any real-time insights question | 1 | [removed] | 2025-01-28T01:56:13 | https://www.reddit.com/r/LocalLLaMA/comments/1ibqmk6/deepseek_moment_for_investing_presenting_finance/ | Medical-Mistake3128 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibqmk6 | false | null | t3_1ibqmk6 | /r/LocalLLaMA/comments/1ibqmk6/deepseek_moment_for_investing_presenting_finance/ | false | false | self | 1 | null |
What setup would you need to run Deepseek r1 locally? | 4 | Ram, processor, cooling, etc to get it as fast and capable as the online version | 2025-01-28T01:57:36 | https://www.reddit.com/r/LocalLLaMA/comments/1ibqnkr/what_setup_would_you_need_to_run_deepseek_r1/ | wasting_more_time2 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibqnkr | false | null | t3_1ibqnkr | /r/LocalLLaMA/comments/1ibqnkr/what_setup_would_you_need_to_run_deepseek_r1/ | false | false | self | 4 | null |
Tech billionaire Elon Musk has reportedly accused Chinese company DeepSeek of lying - Musk announces New WASH-DC Lying Office and closes DOGE | 1 | [removed] | 2025-01-28T02:10:19 | https://www.reddit.com/r/LocalLLaMA/comments/1ibqx0f/tech_billionaire_elon_musk_has_reportedly_accused/ | Waste-Dimension-1681 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibqx0f | false | null | t3_1ibqx0f | /r/LocalLLaMA/comments/1ibqx0f/tech_billionaire_elon_musk_has_reportedly_accused/ | false | false | self | 1 | {'enabled': False, 'images': [{'id': 'CvnSdV59joWX23TBO_DSaJC1XemFcz3khf-zpZ6brwI', 'resolutions': [{'height': 81, 'url': 'https://external-preview.redd.it/GECY_GQAGSDIH0pizVLIphvIuz9jtR5M_xVUgF3EkD0.jpg?width=108&crop=smart&auto=webp&s=51bc6268a1814fb0f0c502571101a097a0b53bda', 'width': 108}, {'height': 162, 'url': 'https://external-preview.redd.it/GECY_GQAGSDIH0pizVLIphvIuz9jtR5M_xVUgF3EkD0.jpg?width=216&crop=smart&auto=webp&s=e83fee2aaa322893265b918f33fa8466f32e8db1', 'width': 216}, {'height': 240, 'url': 'https://external-preview.redd.it/GECY_GQAGSDIH0pizVLIphvIuz9jtR5M_xVUgF3EkD0.jpg?width=320&crop=smart&auto=webp&s=d1a2964bde56ab489eb654f8a41b993e27035312', 'width': 320}], 'source': {'height': 360, 'url': 'https://external-preview.redd.it/GECY_GQAGSDIH0pizVLIphvIuz9jtR5M_xVUgF3EkD0.jpg?auto=webp&s=6b32bf684078be6d1e665cd021ba233b5ca37d6f', 'width': 480}, 'variants': {}}]} |
Are these benchmarks a good indicator of model quality? Will o3 be a significant step forward? | 12 | 2025-01-28T02:15:59 | splityoassintwo | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibr148 | false | null | t3_1ibr148 | /r/LocalLLaMA/comments/1ibr148/are_these_benchmarks_a_good_indicator_of_model/ | false | false | 12 | {'enabled': True, 'images': [{'id': 'vtsJN-0jrpKdCqecDWsjDNWplkHcrefqto7IVO1VKuM', 'resolutions': [{'height': 11, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=108&crop=smart&auto=webp&s=6bdc2c5fc8ea608ad4c24846b01b49b45757cd9b', 'width': 108}, {'height': 23, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=216&crop=smart&auto=webp&s=49555584ca11cd61dc2defd71ae28cca5b0e283b', 'width': 216}, {'height': 35, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=320&crop=smart&auto=webp&s=7de1c210743e6bf1ff6eaed507b831906dc77d20', 'width': 320}, {'height': 70, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=640&crop=smart&auto=webp&s=2a296f7ec1a001c99b81596bc39eb870c3991974', 'width': 640}, {'height': 105, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=960&crop=smart&auto=webp&s=c0c5e303f88091ccd7ec8c662d521baedadf47ae', 'width': 960}, {'height': 118, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?width=1080&crop=smart&auto=webp&s=51ca30a6572b06f7e9fc45c805f5789d69b0a980', 'width': 1080}], 'source': {'height': 240, 'url': 'https://preview.redd.it/sowgq4o09nfe1.png?auto=webp&s=6cfc1537b6ce450b80162fdbe556d814242c0f1d', 'width': 2194}, 'variants': {}}]} |
|||
The Americans are playing dirty. | 0 | 2025-01-28T02:16:39 | https://www.reddit.com/r/LocalLLaMA/comments/1ibr1lx/the_americans_are_playing_dirty/ | Odd-Environment-7193 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibr1lx | false | null | t3_1ibr1lx | /r/LocalLLaMA/comments/1ibr1lx/the_americans_are_playing_dirty/ | false | false | 0 | null |
||
Find out what models can run in your computer | 0 | I’m on a windows machine, and I use this to find out what local LLMs work on my computer:
1) Windows+R shortcut to open “Run” and type “dxdiag” and hit enter.
2) Then select “Save all information”
3) Open the file, and copy-paste it on Claude and ask Claude what models your system can run based on the specifications.
| 2025-01-28T02:24:11 | https://www.reddit.com/r/LocalLLaMA/comments/1ibr71j/find_out_what_models_can_run_in_your_computer/ | M0shka | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibr71j | false | null | t3_1ibr71j | /r/LocalLLaMA/comments/1ibr71j/find_out_what_models_can_run_in_your_computer/ | false | false | self | 0 | null |
Jumping on the bandwagon to say DeepSeek R1 IS. ABSURDLY. GOOD. | 1 | [removed] | 2025-01-28T02:30:00 | https://www.reddit.com/r/LocalLLaMA/comments/1ibrb8l/jumping_on_the_bandwagon_to_say_deepseek_r1_is/ | robkkni | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibrb8l | false | null | t3_1ibrb8l | /r/LocalLLaMA/comments/1ibrb8l/jumping_on_the_bandwagon_to_say_deepseek_r1_is/ | false | false | self | 1 | null |
When will Llama 4, Qwen 3, and Gemma 3 come out? | 3 | I'm curious to know when Llama 4, Gemma 3, and Qwen 3 might release. I would also like to know what changes these models might bring to the AI scene, and what their benchmarks might be. | 2025-01-28T02:37:40 | https://www.reddit.com/r/LocalLLaMA/comments/1ibrgr9/when_will_llama_4_qwen_3_and_gemma_3_come_out/ | CreepyMan121 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibrgr9 | false | null | t3_1ibrgr9 | /r/LocalLLaMA/comments/1ibrgr9/when_will_llama_4_qwen_3_and_gemma_3_come_out/ | false | false | self | 3 | null |
can deep seek destroy the world markets? | 1 | [removed] | 2025-01-28T02:43:54 | https://www.reddit.com/r/LocalLLaMA/comments/1ibrl6m/can_deep_seek_destroy_the_world_markets/ | sfad2023 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibrl6m | false | null | t3_1ibrl6m | /r/LocalLLaMA/comments/1ibrl6m/can_deep_seek_destroy_the_world_markets/ | false | false | self | 1 | null |
DeepSeek thinks it was developed by OpenAI? | 0 | 2025-01-28T03:19:11 | rebleed | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibs9sn | false | null | t3_1ibs9sn | /r/LocalLLaMA/comments/1ibs9sn/deepseek_thinks_it_was_developed_by_openai/ | false | false | 0 | {'enabled': True, 'images': [{'id': 'bWOkarUFjJattQsbJ6vWnPi3ObJ6r6WXEhD-aYxxoYY', 'resolutions': [{'height': 38, 'url': 'https://preview.redd.it/raj91yteknfe1.png?width=108&crop=smart&auto=webp&s=f9aae272621183b6017a45f0c966f41d22f5c8eb', 'width': 108}, {'height': 76, 'url': 'https://preview.redd.it/raj91yteknfe1.png?width=216&crop=smart&auto=webp&s=92679037575ea5be8a6d9562c28e6fac13740b48', 'width': 216}, {'height': 113, 'url': 'https://preview.redd.it/raj91yteknfe1.png?width=320&crop=smart&auto=webp&s=b84d8d0d6e77c8ee2ff5b2a35270e4bde498c5cc', 'width': 320}, {'height': 227, 'url': 'https://preview.redd.it/raj91yteknfe1.png?width=640&crop=smart&auto=webp&s=b776bfc6c611effab0804a81c5cadd58d2e27863', 'width': 640}], 'source': {'height': 293, 'url': 'https://preview.redd.it/raj91yteknfe1.png?auto=webp&s=886bc5e1bde423396eda2ddff867af3466ea1057', 'width': 826}, 'variants': {}}]} |
|||
deepseek sign up | 1 | i’m trying to sign up for deep seek and it says i need a code. what would that be? | 2025-01-28T03:19:43 | https://www.reddit.com/r/LocalLLaMA/comments/1ibsa5o/deepseek_sign_up/ | unicornmagic77 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibsa5o | false | null | t3_1ibsa5o | /r/LocalLLaMA/comments/1ibsa5o/deepseek_sign_up/ | false | false | self | 1 | null |
Deepseek is amazing | 0 | Wait for the surprise at the end 😂 | 2025-01-28T03:33:04 | https://v.redd.it/3lmwprixmnfe1 | harsh304 | v.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibsjki | false | {'reddit_video': {'bitrate_kbps': 2400, 'dash_url': 'https://v.redd.it/3lmwprixmnfe1/DASHPlaylist.mpd?a=1740627199%2CMzA2NGQ0ODlmNzk2MDMxNGFkMTUzMDZhNDJiNTllZDg0Y2IxZDA2Mjc1YzQxYjUzMzI4YWYzYmE5YjY2MjVlMA%3D%3D&v=1&f=sd', 'duration': 62, 'fallback_url': 'https://v.redd.it/3lmwprixmnfe1/DASH_720.mp4?source=fallback', 'has_audio': True, 'height': 1280, 'hls_url': 'https://v.redd.it/3lmwprixmnfe1/HLSPlaylist.m3u8?a=1740627199%2CODdhM2QwNmIzNTk2NTI0YTIwYTA0ODBjODA3NzZjMDAwM2Y2MzBkYzMzZWI1YWNhY2JmNDhhYWNiZjI3NGZjYw%3D%3D&v=1&f=sd', 'is_gif': False, 'scrubber_media_url': 'https://v.redd.it/3lmwprixmnfe1/DASH_96.mp4', 'transcoding_status': 'completed', 'width': 582}} | t3_1ibsjki | /r/LocalLLaMA/comments/1ibsjki/deepseek_is_amazing/ | false | false | 0 | {'enabled': False, 'images': [{'id': 'Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz', 'resolutions': [{'height': 216, 'url': 'https://external-preview.redd.it/Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz.png?width=108&crop=smart&format=pjpg&auto=webp&s=54fbfe00a5dfd810ce0c329e0527013ad95321b9', 'width': 108}, {'height': 432, 'url': 'https://external-preview.redd.it/Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz.png?width=216&crop=smart&format=pjpg&auto=webp&s=a9eba64cd95cbb7b274dcfb2b95b1949d61d7a3a', 'width': 216}, {'height': 640, 'url': 'https://external-preview.redd.it/Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz.png?width=320&crop=smart&format=pjpg&auto=webp&s=b672f33e473b2ae271d3e236c9f290f7fccd0eab', 'width': 320}, {'height': 1280, 'url': 'https://external-preview.redd.it/Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz.png?width=640&crop=smart&format=pjpg&auto=webp&s=5478bb4ccf44585b66733b93d6545ee578922239', 'width': 640}], 'source': {'height': 1584, 'url': 'https://external-preview.redd.it/Z3Uwejc1YnhtbmZlMYUicH35VrEoc4adupB60f6tho2farnF9IGiVZjcyRbz.png?format=pjpg&auto=webp&s=ea430fcdecee97833cb5db82d4501182d53e0a65', 'width': 720}, 'variants': {}}]} |
|
All the DeepSeek R1 hype. Can I run it in LMStudio on my PC? | 1 | [removed] | 2025-01-28T03:40:10 | https://www.reddit.com/r/LocalLLaMA/comments/1ibso7k/all_the_deepseek_r1_hype_can_i_run_it_in_lmstudio/ | Dry-Vermicelli-682 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibso7k | false | null | t3_1ibso7k | /r/LocalLLaMA/comments/1ibso7k/all_the_deepseek_r1_hype_can_i_run_it_in_lmstudio/ | false | false | self | 1 | null |
I am confused about Claude, Deepseek and ChatGPT for coding/Math | 1 | [removed] | 2025-01-28T03:49:08 | https://www.reddit.com/r/LocalLLaMA/comments/1ibstxs/i_am_confused_about_claude_deepseek_and_chatgpt/ | Perfect_Leave1895 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibstxs | false | null | t3_1ibstxs | /r/LocalLLaMA/comments/1ibstxs/i_am_confused_about_claude_deepseek_and_chatgpt/ | false | false | self | 1 | null |
Anyone can explain this? A thief or A pioneer?- Truth of Deepseek | 1 | 2025-01-28T03:52:54 | tariffoner | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibswam | false | null | t3_1ibswam | /r/LocalLLaMA/comments/1ibswam/anyone_can_explain_this_a_thief_or_a_pioneer/ | false | false | 1 | {'enabled': True, 'images': [{'id': '5UutSUW3iDvIDY4FIxBOn2XObo-DFyXSUjWMARzgYMA', 'resolutions': [{'height': 53, 'url': 'https://preview.redd.it/4b4bh79fqnfe1.png?width=108&crop=smart&auto=webp&s=f70ead37cdb155d5a8675d65f62060d1e936a266', 'width': 108}, {'height': 107, 'url': 'https://preview.redd.it/4b4bh79fqnfe1.png?width=216&crop=smart&auto=webp&s=54e131cc3163197394c65f4d64d426bef057254d', 'width': 216}, {'height': 159, 'url': 'https://preview.redd.it/4b4bh79fqnfe1.png?width=320&crop=smart&auto=webp&s=5b68c82dfd7b10787d2b7285fd1fbe2cbb1fa4cc', 'width': 320}, {'height': 318, 'url': 'https://preview.redd.it/4b4bh79fqnfe1.png?width=640&crop=smart&auto=webp&s=ae1d8c378e23c25396da3a64f3fee3fa54de196f', 'width': 640}], 'source': {'height': 448, 'url': 'https://preview.redd.it/4b4bh79fqnfe1.png?auto=webp&s=79b281997958656372ce2da918a90b391936806c', 'width': 900}, 'variants': {}}]} |
|||
So does anyone know what’s going on? | 1 | [removed] | 2025-01-28T04:05:20 | ninonook1 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibt4fx | false | null | t3_1ibt4fx | /r/LocalLLaMA/comments/1ibt4fx/so_does_anyone_know_whats_going_on/ | false | false | 1 | {'enabled': True, 'images': [{'id': 'BSR2VRc1NiU4dwO67eKn4d3tFuQp46dVDHKKuTR_8Tk', 'resolutions': [{'height': 51, 'url': 'https://preview.redd.it/fpynhhwosnfe1.jpeg?width=108&crop=smart&auto=webp&s=c58db4c1d8d30f676344ca8a91a9402722654eb9', 'width': 108}, {'height': 102, 'url': 'https://preview.redd.it/fpynhhwosnfe1.jpeg?width=216&crop=smart&auto=webp&s=4027c295a9a65f19f30f2e39a5ad4961cee8294e', 'width': 216}, {'height': 152, 'url': 'https://preview.redd.it/fpynhhwosnfe1.jpeg?width=320&crop=smart&auto=webp&s=ef29812432dd50a0f438d2b91fd4816243190adc', 'width': 320}, {'height': 304, 'url': 'https://preview.redd.it/fpynhhwosnfe1.jpeg?width=640&crop=smart&auto=webp&s=43eb9ec7b272ea4ba5618710c8a092d61d8577ba', 'width': 640}], 'source': {'height': 430, 'url': 'https://preview.redd.it/fpynhhwosnfe1.jpeg?auto=webp&s=25660c7d3dd99a085fb81791e3a952c22ad7152a', 'width': 904}, 'variants': {}}]} |
||
Oops! DeepSeek is experiencing high traffic at the moment. Please check back in a little while. | 1 | 2025-01-28T04:05:37 | Mediocre_Bath_9894 | i.redd.it | 1970-01-01T00:00:00 | 0 | {} | 1ibt4mn | false | null | t3_1ibt4mn | /r/LocalLLaMA/comments/1ibt4mn/oops_deepseek_is_experiencing_high_traffic_at_the/ | false | false | 1 | {'enabled': True, 'images': [{'id': 't9mFxjjnfBlC5YsUMOmIMKrppQ3ImS4OzGPN1FLdcf4', 'resolutions': [{'height': 21, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=108&crop=smart&auto=webp&s=2f96ea8bec9eabe22251b22c08fdc29ef4903184', 'width': 108}, {'height': 43, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=216&crop=smart&auto=webp&s=03c8004af36a912bc8a8c80f4810d3118ebd29f3', 'width': 216}, {'height': 64, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=320&crop=smart&auto=webp&s=71eb7141ee5d0ea199a7ea00f80bfcefc6fb73d3', 'width': 320}, {'height': 129, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=640&crop=smart&auto=webp&s=0e9a1234a8fe9c5ad0abf1d24c18bb34c23cf1c6', 'width': 640}, {'height': 193, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=960&crop=smart&auto=webp&s=086e6ad87c2e5f5d0835f4c859f75f667481e61d', 'width': 960}, {'height': 218, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?width=1080&crop=smart&auto=webp&s=7b1f97efb13ed02f80c08e99cef421370a112524', 'width': 1080}], 'source': {'height': 236, 'url': 'https://preview.redd.it/0zgla9vpsnfe1.png?auto=webp&s=6ecaf14dfa92e80c371fd337121b36118c5b560e', 'width': 1168}, 'variants': {}}]} |
|||
deepseek-r1:70b on RTX 3090 with 64GB RAM - Ollama - Windows 10 - Slow but worked | 5 | I've named the assistant "Helix" in the system prompt after asking deepseek what it would prefer to be called. Here's an example exchange. This took about 10 minutes, so it was dreadfully slow, but it worked - without causing thermal issues or wigging out my CPU. Ollama has gotten better and better.
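For anyone who'd rather script this than type into `ollama run`, here's a minimal sketch of hitting Ollama's local chat API with the same model. To be clear about assumptions: the exact "Helix" system-prompt wording is my guess (the post only says the name came from a system prompt), and the endpoint assumes a default `ollama serve` listening on port 11434.

```python
import json
import urllib.request

# Sketch of driving a local Ollama server programmatically instead of
# `ollama run`. http://localhost:11434/api/chat is Ollama's chat endpoint;
# the "Helix" persona wording below is an assumption based on the post.

OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(user_message: str) -> dict:
    """Build a non-streaming chat request for deepseek-r1:70b."""
    return {
        "model": "deepseek-r1:70b",
        "stream": False,  # collect the full (slow) answer in one response
        "messages": [
            {"role": "system",
             "content": "You are Helix, a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
    }

def ask(user_message: str) -> str:
    """POST the request and return the assistant's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_chat_payload(user_message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # ~10 min for 70b on a 3090
        return json.loads(resp.read())["message"]["content"]

# Example (requires `ollama serve` running with the model already pulled):
# print(ask("What can you tell me about Lambda Cold Dark Matter?"))
```

Setting `"stream": False` keeps the example simple; with a response this slow you'd probably want streaming in practice so you can watch tokens arrive.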
Here's the exchange.
<user>
What can you tell me about the state of human understanding of Lambda Cold Dark Matter?
deepseek-r1:70b <think>
Okay, so I need to figure out what the user is asking for here. They provided a detailed response about Lambda Cold Dark Matter (ΛCDM), but now they're simulating a thought process of someone trying to understand that information. Let me break this down.
First, the user wants to know about human understanding of ΛCDM. The initial response was quite comprehensive, covering what ΛCDM is, its components, successes, open questions, observational evidence, challenges, and alternative theories. So, if I were someone trying to grasp this, I'd start by recalling any prior knowledge I have on dark matter and the universe's expansion.
I know that dark matter doesn't emit light but has mass and gravity. Λ refers to the cosmological constant, which is related to dark energy driving the acceleration of the universe's expansion. So ΛCDM is a model combining these two concepts.
The user mentioned that ΛCDM is the standard model of cosmology. I think this means it's widely accepted because it explains many observations well. They listed successes like the cosmic microwave background, large-scale structure, and supernovae results. I remember reading about the CMB showing tiny fluctuations which led to our understanding of the early universe.
On small scales, there are issues like the cusp-core problem. I think this is where simulations predict dense centers (cusps) in galaxies, but observations show flatter cores. That discrepancy must be confusing for scientists. Then there's the missing satellites problem, where fewer satellite galaxies are observed than predicted. Maybe dark matter isn't as interactive as some models suggest?
The Hubble tension is another point. I've heard about debates on the universe's expansion rate. ΛCDM predicts a certain rate based on early universe data, but late-time observations (like supernovae) give slightly different numbers. This could mean either measurement errors or that our model needs tweaking.
Observational evidence supporting ΛCDM includes CMB fluctuations, baryon acoustic oscillations, and galaxy distributions. I'm not too familiar with BAOs, but they sound like regular patterns in galaxy distributions, which act as a sort of cosmic ruler to measure expansion.
Challenges include dark matter's nature (is it WIMPs, axions, or something else?) and the cosmological constant problem—why dark energy has the observed value. These are big questions without clear answers yet.
Alternative theories like MOND try to explain observations without dark matter by modifying gravity, but they don't fit all data as well as ΛCDM. Other models adjust dark energy or include more dynamic components.
Putting this together, someone new might grasp that ΛCDM is successful on large scales but has unresolved issues on smaller scales and with certain parameters. It's the best model we have, but there's still much to uncover about its components.
</think>
**Lambda Cold Dark Matter (ΛCDM): An Overview of Our Understanding**
1. **Introduction:**
* ΛCDM is the standard model of cosmology, combining dark matter (cold) and dark energy (represented by Λ, the cosmological constant).
* It is widely accepted due to its success in explaining various observations.
2. **Components and Successes:**
* **Dark Matter:** Provides mass and gravitational structure without emitting light.
* **Dark Energy:** Drives the acceleration of cosmic expansion.
* **Successes:** Explains cosmic microwave background (CMB) fluctuations, large-scale structures, and supernovae observations.
3. **Challenges on Small Scales:**
* **Cusp-Core Problem:** Simulations predict dense galaxy centers (cusps), but observations show flatter cores.
* **Missing Satellites:** Fewer satellite galaxies observed than predicted; possibly due to dark matter properties or other factors.
4. **Hubble Tension:**
* Discrepancy in the universe's expansion rate between early (CMB-based) and late-time observations (supernovae). May indicate model inaccuracies or measurement errors.
5. **Observational Support:**
* CMB fluctuations, baryon acoustic oscillations (BAO), and galaxy distributions support ΛCDM as a consistent framework.
6. **Open Questions:**
* Nature of dark matter remains elusive—candidates include WIMPs, axions.
* Cosmological constant problem: Understanding the observed value of dark energy.
7. **Alternative Theories:**
* MOND and others propose modifying gravity to explain observations without dark matter but lack full consistency with all data.
* Adjustments to ΛCDM explore varying dark energy or additional components.
**Conclusion:**
ΛCDM is highly successful on large scales but faces unresolved issues, particularly at smaller scales and in understanding its fundamental components. It remains the best model available, guiding ongoing research into the universe's mysteries.
<user>
What a lovely summary Helix! Thank you! I've learned so much. I know this because now I have so many questions. However, it's time for me to sleep, so I'll have to come back another time to follow up. Again thanks!
deepseek-r1:70b <think>
Alright, so the user has just responded to my summary on Lambda Cold Dark Matter (ΛCDM). They found it helpful and mentioned they've learned a lot, which is great. But now they're indicating that they need to sleep and will come back later with more questions.
I should acknowledge their appreciation and express that I'm glad the information was useful. It's important to be encouraging and supportive of their learning process. Maybe I can commend their curiosity since they have many questions now, which shows engagement with the topic.
Given that they'll return later, I can offer an open invitation for any future inquiries, reinforcing that I'm here to help whenever they're ready. Ending on a positive note, perhaps wishing them well as they take a break, would be nice.
So, I need to craft a response that's warm, acknowledges their thanks, praises their curiosity, and invites them to reach out again in the future.
</think>
You're very welcome! It was my pleasure to help you understand this fascinating topic. Your curiosity is commendable—keep exploring and asking questions when you can. Take a well-deserved rest, and feel free to reach out whenever you have more inquiries or need further clarification. Sweet dreams, and I look forward to our next conversation! 😊 | 2025-01-28T04:12:15 | https://www.reddit.com/r/LocalLLaMA/comments/1ibt8yf/deepseekr170b_on_rtx_3090_with_64gb_ram_ollama/ | LaughterOnWater | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibt8yf | false | null | t3_1ibt8yf | /r/LocalLLaMA/comments/1ibt8yf/deepseekr170b_on_rtx_3090_with_64gb_ram_ollama/ | false | false | self | 5 | null |
Thoughts on deepseek - correct me if I'm wrong. | 1 | [removed] | 2025-01-28T04:15:29 | https://www.reddit.com/r/LocalLLaMA/comments/1ibtb43/thoughts_on_deepseek_correct_me_if_im_wrong/ | Firm_Recognition4429 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibtb43 | false | null | t3_1ibtb43 | /r/LocalLLaMA/comments/1ibtb43/thoughts_on_deepseek_correct_me_if_im_wrong/ | false | false | self | 1 | null |
DeepSeek R1 – Web version or local | 1 | [removed] | 2025-01-28T04:21:08 | https://www.reddit.com/r/LocalLLaMA/comments/1ibteof/deepseek_r1_web_version_or_local/ | flovkoks | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibteof | false | null | t3_1ibteof | /r/LocalLLaMA/comments/1ibteof/deepseek_r1_web_version_or_local/ | false | false | self | 1 | null |
DeepSeek R1 – Web version or local? | 1 | [removed] | 2025-01-28T04:21:45 | https://www.reddit.com/r/LocalLLaMA/comments/1ibtf21/deepseek_r1_web_version_or_local/ | flovkoks | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibtf21 | false | null | t3_1ibtf21 | /r/LocalLLaMA/comments/1ibtf21/deepseek_r1_web_version_or_local/ | false | false | self | 1 | null |
Is deepseek down? | 0 | I can log into it but I can't get a response to any prompts | 2025-01-28T04:35:34 | https://www.reddit.com/r/LocalLLaMA/comments/1ibtno8/is_deepseek_down/ | MrMrsPotts | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibtno8 | false | null | t3_1ibtno8 | /r/LocalLLaMA/comments/1ibtno8/is_deepseek_down/ | false | false | self | 0 | null |
Version for local coding? | 1 | [removed] | 2025-01-28T04:44:44 | https://www.reddit.com/r/LocalLLaMA/comments/1ibtt4i/version_for_local_coding/ | LowFocus1357 | self.LocalLLaMA | 1970-01-01T00:00:00 | 0 | {} | 1ibtt4i | false | null | t3_1ibtt4i | /r/LocalLLaMA/comments/1ibtt4i/version_for_local_coding/ | false | false | self | 1 | null |