chosen (int64, 353–41.8M) | rejected (int64, 287–41.8M) | chosen_rank (int64, 1–2) | rejected_rank (int64, 2–3) | top_level_parent (int64, 189–41.8M) | split (large_string, 1 value) | chosen_prompt (large_string, lengths 236–19.5k) | rejected_prompt (large_string, lengths 209–18k)
---|---|---|---|---|---|---|---|
1,894,644 | 1,894,570 | 1 | 3 | 1,894,135 | train | <story><title>A zombie keyboard, an app-store rejection, a call from Steve Jobs</title><url>http://blog.cascadesoft.net/2010/10/31/a-zombie-keyboard-an-app-store-rejection-a-call-from-steve-jobs-and-the-economy-for-ipad-app/</url></story><parent_chain><item><author>viraptor</author><text>I don't get it. The summary is that the guy submitted an application, got rejected, appealed, the appeal process seemed to take too long, he emailed Jobs, Jobs phoned him and told him and "reiterated" (from the post) the app store rules until the guy decided he won't achieve anything.<p>I'm amazed at how Jobs telling the guy the same thing that he already knew (telling him 2 times actually) somehow made him go to a paragraph about Jobs with a well deserved opinion of quality products.<p>He still couldn't release the software, he got a "no go" for fixing the situation the way he could, he got absolutely no information about whether the issue will be fixed in the future or not. But that's ok, since Jobs called him? Seriously?</text></item></parent_chain><comment><author>mechanical_fish</author><text>Humans like talking to humans.<p>Humans especially like talking to humans in authority. If you're going to have to play by a rule, it's good to know that the rule is real, and not an imaginary rule made up by a fifth-level bureaucrat on a whim, or because they're looking for a bribe, or because they don't understand the company policy, or because you're small fry and only big players get to break all the rules.<p>And humans like consistency, closure, and clear boundaries. The hardest thing is not necessarily abiding by the rules: It's trying to avoid abiding by rules that aren't really there. The easiest way to ensure that you're pushing the envelope appropriately is to press yourself right up against the edge, and the best edges to push against are the ones that are firm, so that you don't keep having to second-guess yourself, or waking up to discover that your competition is getting away with more than you are because of some uncontrollable factor.</text></comment> | <story><title>A zombie keyboard, an app-store rejection, a call from Steve Jobs</title><url>http://blog.cascadesoft.net/2010/10/31/a-zombie-keyboard-an-app-store-rejection-a-call-from-steve-jobs-and-the-economy-for-ipad-app/</url></story><parent_chain><item><author>viraptor</author><text>I don't get it. The summary is that the guy submitted an application, got rejected, appealed, the appeal process seemed to take too long, he emailed Jobs, Jobs phoned him and told him and "reiterated" (from the post) the app store rules until the guy decided he won't achieve anything.<p>I'm amazed at how Jobs telling the guy the same thing that he already knew (telling him 2 times actually) somehow made him go to a paragraph about Jobs with a well deserved opinion of quality products.<p>He still couldn't release the software, he got a "no go" for fixing the situation the way he could, he got absolutely no information about whether the issue will be fixed in the future or not. But that's ok, since Jobs called him? Seriously?</text></item></parent_chain><comment><author>Legion</author><text>I suppose it's better than Steve Jobs <i>not</i> calling him.<p>I think the part he appreciates is that his complaint was <i>heard</i>, even if the resolution wasn't to his favor. His complaint could have gone into a forever-unacknowledged black hole instead. That would be far more frustrating.<p>I do, however, agree with you that "having a bug in the SDK and not allowing you to work around it the way <i>they</i> do" is not grounds for talking about how great Apple and Steve are.</text></comment> |
8,140,828 | 8,139,676 | 1 | 3 | 8,139,174 | train | <story><title>Arrow: Better dates and times for Python</title><url>http://crsmithdev.com/arrow/</url></story><parent_chain></parent_chain><comment><author>jessedhillon</author><text>Seems like it&#x27;s written by someone who prefers Ruby or JavaScript -- where there exists already a culture using names which are cute first, even if they are opaque -- over Python. These method naming choices are baffling<p>Arrow.to() converts to a new timezone? And .replace() applies a relative offset!? Replace the hour field of this object with -1 should not return an object having hour=11. Arrow.get() is doing some kind of quadruple duty, neither of which would I describe as &quot;getting.&quot;<p>And what about that class name? Arrow as the name of a package is fine, but what do you expect someone to make of &lt;Arrow [...]&gt; -- what&#x27;s wrong with arrow.DateTime?<p>Great work on making and releasing something, but this API is surprising -- as in, one would be unable to predict how it works. I will continue using python-dateutil</text></comment> | <story><title>Arrow: Better dates and times for Python</title><url>http://crsmithdev.com/arrow/</url></story><parent_chain></parent_chain><comment><author>calpaterson</author><text>Suggestion: rename arrow.now() to arrow.localnow() to make it even clearer that it does not generate utc. I&#x27;ve run into this mistake many times with datetime.now() vs datetime.utcnow()</text></comment> |
24,051,822 | 24,051,451 | 1 | 2 | 24,050,691 | train | <story><title>27-inch iMac gets a major update</title><url>https://www.apple.com/newsroom/2020/08/27-inch-imac-gets-a-major-update/</url></story><parent_chain><item><author>efficax</author><text>the 12&quot; macbook is my favorite laptop of all time. Mine is the 2016 version and it&#x27;s starting to feel dated and I&#x27;m hoping that they rerelease this on the new silicon first. I&#x27;ll be smashing the buy button the day it&#x27;s out.</text></item><item><author>dijit</author><text>It will almost certainly be a low end macbook. In fact, there&#x27;s a non-pro &#x27;macbook&#x27;[0] that already exists which would be perfect, it was discontinued a year ago, but used ultra-low power CPUs. In fact when they were talking about them (prior to release) they thought they would have ARM CPU&#x27;s back then.<p>They are fanless and have a 5W thermal envelope, which fits in with the current A12X too.<p>The major criticisms of the design at the time was: Butterfly keyboard, the Single USB-C port and the speed of the device (Intel CORE-M is truly, truly painful). But for a $600-$700 machine with an ARM CPU? that&#x27;s insanely competitive, and is in-line with the &quot;basic&quot; macbook branding.<p><a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;MacBook_(2015%E2%80%932019)" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;MacBook_(2015%E2%80%932019)</a></text></item><item><author>burlesona</author><text>Solid spec bump. Interesting how much emphasis they put on the camera, speakers, and mic; makes total sense in the age of Zoom.<p>I’d also speculate this means iMac won’t be the first computer getting Apple Silicon. I wonder if it will be the last?<p>What’s the consensus guess now? Perhaps a new MacBook Air with good performance but the real “breakthrough” is &gt; 12 hours battery life?</text></item></parent_chain><comment><author>PopeDotNinja</author><text>I think my favorite computer was my late-2010 MacBook Air. It was such a departure from the big &amp; clunky Windows laptops I&#x27;d owned up until that time. And of course I have fond memories of my first computer, an IBM PC Jr (although I really wanted the 2nd disk drive, and copying floppy disks with only one drive &amp; 128kb of RAM was suboptimal).</text></comment> | <story><title>27-inch iMac gets a major update</title><url>https://www.apple.com/newsroom/2020/08/27-inch-imac-gets-a-major-update/</url></story><parent_chain><item><author>efficax</author><text>the 12&quot; macbook is my favorite laptop of all time. Mine is the 2016 version and it&#x27;s starting to feel dated and I&#x27;m hoping that they rerelease this on the new silicon first. I&#x27;ll be smashing the buy button the day it&#x27;s out.</text></item><item><author>dijit</author><text>It will almost certainly be a low end macbook. In fact, there&#x27;s a non-pro &#x27;macbook&#x27;[0] that already exists which would be perfect, it was discontinued a year ago, but used ultra-low power CPUs. In fact when they were talking about them (prior to release) they thought they would have ARM CPU&#x27;s back then.<p>They are fanless and have a 5W thermal envelope, which fits in with the current A12X too.<p>The major criticisms of the design at the time was: Butterfly keyboard, the Single USB-C port and the speed of the device (Intel CORE-M is truly, truly painful). But for a $600-$700 machine with an ARM CPU? that&#x27;s insanely competitive, and is in-line with the &quot;basic&quot; macbook branding.<p><a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;MacBook_(2015%E2%80%932019)" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;MacBook_(2015%E2%80%932019)</a></text></item><item><author>burlesona</author><text>Solid spec bump. Interesting how much emphasis they put on the camera, speakers, and mic; makes total sense in the age of Zoom.<p>I’d also speculate this means iMac won’t be the first computer getting Apple Silicon. I wonder if it will be the last?<p>What’s the consensus guess now? Perhaps a new MacBook Air with good performance but the real “breakthrough” is &gt; 12 hours battery life?</text></item></parent_chain><comment><author>thewhitetulip</author><text>Mine is a Macbook pro 2012 and it is still phenomenal.<p>2yrs ago, my screen used to flicker. I contacted everyone and they said &#x27;change motherboard&#x27; and Mac support people were like buy a new laptop. so I kept the laptop in cupboard for a few months and I open it to check if it works, had decided to sell it and buy a new one and viola, it was working perfectly fine.<p>~5yrs with no issues.</text></comment> |
38,967,023 | 38,962,873 | 1 | 2 | 38,961,080 | train | <story><title>Pixar to undergo 20% layoffs in 2024</title><url>https://techcrunch.com/2024/01/11/as-disney-pushes-towards-streaming-profitability-pixar-to-undergo-layoffs-in-2024/</url></story><parent_chain><item><author>infotainment</author><text>This is a classic layoff debate -- is it better to do it with zero warning, or is it better to let employees know that a layoff is coming?<p>Personally, I believe advance notice is better, because it signals to employees that, if layoffs are going to happen, there will be some time to prepare. By choosing to blindside employees, it creates a culture of paranoia, IMO. (&quot;Will today be layoff day?&quot; &quot;What about next month?&quot; &quot;Oh no, it&#x27;s the end of the quarter, does that mean tomorrow my laptop will lock up at 4am?&quot;)</text></item><item><author>wonderwonder</author><text>Imagine working somewhere and then reading &quot;The studio stressed the layoffs are not imminent, but will take place later this year as Pixar focuses on making less content&quot;. Then having to go to work each day at a place that keeps saying we are all family here knowing that 2&#x2F;10 of you are going to get axed. I have to think they will lose at least 10% just through attrition. Although for many of those people I would imagine they have landed their dream job; feel bad for them.</text></item></parent_chain><comment><author>John23832</author><text>Immediate layoff, long period of transition.<p>What Pixar has created is an environment where nobody knows is they’re going to stay, so people self select into two buckets: 1) those who say “I’m not going to bust my ass just to get fired”. They coast and prepare to leave. And 2) those who work their ass off to try to “prove” themselves. If any of 2 get fired, it’s a bad morale hit for anyone around after the layoffs. You’ll lose actual talent this way.<p>Make the decision of who to layoff and do it, but be kind as you do.</text></comment> | <story><title>Pixar to undergo 20% layoffs in 2024</title><url>https://techcrunch.com/2024/01/11/as-disney-pushes-towards-streaming-profitability-pixar-to-undergo-layoffs-in-2024/</url></story><parent_chain><item><author>infotainment</author><text>This is a classic layoff debate -- is it better to do it with zero warning, or is it better to let employees know that a layoff is coming?<p>Personally, I believe advance notice is better, because it signals to employees that, if layoffs are going to happen, there will be some time to prepare. By choosing to blindside employees, it creates a culture of paranoia, IMO. (&quot;Will today be layoff day?&quot; &quot;What about next month?&quot; &quot;Oh no, it&#x27;s the end of the quarter, does that mean tomorrow my laptop will lock up at 4am?&quot;)</text></item><item><author>wonderwonder</author><text>Imagine working somewhere and then reading &quot;The studio stressed the layoffs are not imminent, but will take place later this year as Pixar focuses on making less content&quot;. Then having to go to work each day at a place that keeps saying we are all family here knowing that 2&#x2F;10 of you are going to get axed. I have to think they will lose at least 10% just through attrition. Although for many of those people I would imagine they have landed their dream job; feel bad for them.</text></item></parent_chain><comment><author>tiffanyh</author><text>In my experience, your best talent then leaves. Which is the opposite of what you want.</text></comment> |
2,684,476 | 2,684,604 | 1 | 2 | 2,684,254 | train | <story><title>Why a*a*a*a*a*a cannot be optimized to (a*a*a)*(a*a*a)?</title><url>http://stackoverflow.com/questions/6430448/why-aaaaaa-cannot-be-optimized-to-aaaaaa</url><text></text></story><parent_chain></parent_chain><comment><author>dougws</author><text>When I first read the headline, I definitely thought that was a regular expression which could be "optimized" to a+. I guess I'm more inclined to read "*" as the Kleene operator than as multiplication...</text></comment> | <story><title>Why a*a*a*a*a*a cannot be optimized to (a*a*a)*(a*a*a)?</title><url>http://stackoverflow.com/questions/6430448/why-aaaaaa-cannot-be-optimized-to-aaaaaa</url><text></text></story><parent_chain></parent_chain><comment><author>kaiwetzel</author><text>In numerical analysis class <i>numerical stability</i> was a major theme[1]. It was very definitely eye-opening how quickly rounding/truncation errors can bite you in seemingly trivial situations!<p>On the other hand, if I have a function that works on parameters in a specified range (say, 0.0 to 1.0) and returns results which is supposed to be correct to a specified accuracy I would <i>love</i> to have the compiler do all the possible optimizations without having to specify a compiler flag (doing so for, say, just a single function can be quite annoying!). Maybe approaches like Haskell's type inference will, eventually, produce significantly faster code because they can do things like this?<p>In a somewhat related case, it would be awesome if the compiler recognized properties like associativeness, distributiveness, commutitiveness of composite functions and rearange things optimally for performance. It's really nice to see so much development in languages and compilers at the moment (llvm, javascript, functional languages like Haskell, etc.) :°)<p>[1] <a href="http://en.wikipedia.org/wiki/Numerical_stability" rel="nofollow">http://en.wikipedia.org/wiki/Numerical_stability</a></text></comment> |
31,630,905 | 31,630,048 | 1 | 2 | 31,626,049 | train | <story><title>Xerox PARC’s engineers on how they invented the future and Xerox lost it (1985)</title><url>https://spectrum.ieee.org/xerox-parc</url></story><parent_chain><item><author>TheOtherHobbes</author><text>If Parc had failed at dissemination we&#x27;d all still be using the command line.<p>The windows&#x2F;icons&#x2F;mouse&#x2F;menus paradigm disseminated just fine. It wasn&#x27;t necessarily completely original - see also, Mother of All Demos - but they showed it could be implemented in real systems with real usability benefits.<p>Laser printers and networking also disseminated just fine.<p>What didn&#x27;t disseminate was the development environment. That&#x27;s not necessarily a surprise, because an environment that delights creative people with PhDs isn&#x27;t going to translate well to the general public.<p>Ultimately Smalltalk, MESA, etc were like concept cars. You could admire them and learn from them, but they were never going to be viable as mainstream products.<p>Windows and Mac filled the gap by producing dumbed-down and overcomplicated versions of the original vision, which - most importantly - happened to be much cheaper.<p>Xerox get a lot of criticism for missing the potential, but it&#x27;s easy to forget that in the 70s word processors typically cost five figures, and minicomputers equivalent to the Dorado cost six figures.<p>Maybe the future would have been different if someone had said &quot;We need to take these ideas and work out how to build them as cheaply as possible.&quot;<p>But realistically commodified computing in the 70s ran on Z80s and still cost five figures for a business system with a hard drive.<p>The technology to make a cheaper version didn&#x27;t exist until the 80s.<p>The problem since Parc is more that there has been no equivalent concentration of smart, creative, <i>playful</i> people. Technology culture changed from playful exploration to hustle and &quot;engagement&quot;, there&#x27;s less greenfield space to explore, and - IMO - there are very few people in the industry who have the raw IQ and the creativity to create a modern equivalent.<p>It&#x27;s pretty much the opposite problem. Instead of exploring tech without obvious business goals, business goals are so tightly enforced it&#x27;s very hard to imagine what would happen if completely open exploration was allowed again.</text></item><item><author>Daub</author><text>The final stage of the creative process is dissemination, and this is what failed Xerox Parc. This is also what failed Kodak, when they failed to capitalise on the digital camera, which they had invented.<p>The core issue is twofold:<p>1. Entrenched interests within the company. When Microsoft tried to develop their mobile strategy, the head of the Excel division deliberately made the mobile Excel experience a crappy one as he had no faith in Mobile to begin with.<p>2. Framing the new product within old paradigms. Kodak had it all: a head start on digital image capture, a partnership with Apply (with the QuickTake 100) and shot tons of money. But despite developing a web-based dissemination platform for digital photos, they could not foresee the rise of social sharing. This was a tragedy, as their Box Brownie (released 1900) had the same effect on the photo industry of the time as social sharing would have on the photo industry. But Kodak was fixated on the mistaken notion that consumers would want to print their &#x27;Kodak moment&#x27;.</text></item></parent_chain><comment><author>cmrdporcupine</author><text>The thing is that Xerox mgmt themselves refused to consider downsizing the ideas developed internally towards commodity hardware that people could afford. They were interested only in high margin office systems. So other people did it. Not a bad thing for the industry, but bad for Xerox.<p>Most famously, the team @ Apple around the Lisa and the Mac. But also Lee Lorenzen was an employee at Xerox in Texas. In 1982 he tried to pitch management on a port of their Star concepts to 8080 CP&#x2F;M class hardware, to get it into people&#x27;s hands. Demo footage here, remember this is before the Mac came out [0]. He was rebuffed, and left Xerox to join Gary Kildall @ Digital Research, where they created the GEM GUI (which became the OS for the Atari ST and the basis of Lorenzen&#x27;s next venture, the successful DTP program, Ventura Publisher) [1]. Xerox was never able to penetrate the consumer level DTP market, but this was a <i>big</i> emerging business in the 80s which Xerox would have been a natural fit for.<p>[0] <a href="https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=EMBGRZftS30" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=EMBGRZftS30</a><p>[1] <a href="https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=6EeOanSInjo" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=6EeOanSInjo</a></text></comment> | <story><title>Xerox PARC’s engineers on how they invented the future and Xerox lost it (1985)</title><url>https://spectrum.ieee.org/xerox-parc</url></story><parent_chain><item><author>TheOtherHobbes</author><text>If Parc had failed at dissemination we&#x27;d all still be using the command line.<p>The windows&#x2F;icons&#x2F;mouse&#x2F;menus paradigm disseminated just fine. It wasn&#x27;t necessarily completely original - see also, Mother of All Demos - but they showed it could be implemented in real systems with real usability benefits.<p>Laser printers and networking also disseminated just fine.<p>What didn&#x27;t disseminate was the development environment. That&#x27;s not necessarily a surprise, because an environment that delights creative people with PhDs isn&#x27;t going to translate well to the general public.<p>Ultimately Smalltalk, MESA, etc were like concept cars. You could admire them and learn from them, but they were never going to be viable as mainstream products.<p>Windows and Mac filled the gap by producing dumbed-down and overcomplicated versions of the original vision, which - most importantly - happened to be much cheaper.<p>Xerox get a lot of criticism for missing the potential, but it&#x27;s easy to forget that in the 70s word processors typically cost five figures, and minicomputers equivalent to the Dorado cost six figures.<p>Maybe the future would have been different if someone had said &quot;We need to take these ideas and work out how to build them as cheaply as possible.&quot;<p>But realistically commodified computing in the 70s ran on Z80s and still cost five figures for a business system with a hard drive.<p>The technology to make a cheaper version didn&#x27;t exist until the 80s.<p>The problem since Parc is more that there has been no equivalent concentration of smart, creative, <i>playful</i> people. Technology culture changed from playful exploration to hustle and &quot;engagement&quot;, there&#x27;s less greenfield space to explore, and - IMO - there are very few people in the industry who have the raw IQ and the creativity to create a modern equivalent.<p>It&#x27;s pretty much the opposite problem. Instead of exploring tech without obvious business goals, business goals are so tightly enforced it&#x27;s very hard to imagine what would happen if completely open exploration was allowed again.</text></item><item><author>Daub</author><text>The final stage of the creative process is dissemination, and this is what failed Xerox Parc. This is also what failed Kodak, when they failed to capitalise on the digital camera, which they had invented.<p>The core issue is twofold:<p>1. Entrenched interests within the company. When Microsoft tried to develop their mobile strategy, the head of the Excel division deliberately made the mobile Excel experience a crappy one as he had no faith in Mobile to begin with.<p>2. Framing the new product within old paradigms. Kodak had it all: a head start on digital image capture, a partnership with Apply (with the QuickTake 100) and shot tons of money. But despite developing a web-based dissemination platform for digital photos, they could not foresee the rise of social sharing. This was a tragedy, as their Box Brownie (released 1900) had the same effect on the photo industry of the time as social sharing would have on the photo industry. But Kodak was fixated on the mistaken notion that consumers would want to print their &#x27;Kodak moment&#x27;.</text></item></parent_chain><comment><author>Daub</author><text>&gt; If Parc had failed at dissemination we&#x27;d all still be using the command line.<p>Yes, it was disseminated, but not by Xerox. It was the startups of the time that took the baton.<p>&gt; The problem since Parc is more that there has been no equivalent concentration of smart, creative, playful people.<p>Could not agree more. Play is a very underestimated component of productive thought. But what is play? I would say that it is production without responsibility. It is notable that children use play in order to explore their potential and their place in the world. Which is to say, they use it to grow. Need I say more?</text></comment> |
17,569,369 | 17,569,003 | 1 | 2 | 17,560,845 | train | <story><title>The Food of My Youth</title><url>https://www.nybooks.com/daily/2018/07/09/the-food-of-my-youth/</url></story><parent_chain><item><author>joefourier</author><text>Why are stereotypical American &quot;low-income&quot; foods largely processed meats and junk food (e.g. boxed macaroni and cheese, spam, Vienna sausages), instead of staples like rice, oats, beans, lentils, etc. which also keep easily but are far more nutritious, tasty and just as cheap, if not cheaper on a per-calorie basis?<p>This is genuine curiosity, as I grew up in a region where the average income was much lower than the USA, but the food consumed by lower-income individuals was much healthier and fresher. We&#x27;d eat things like red beans and rice, curries, acar (a type of spiced pickle) and often cook meat over wood fires to save money on gas for the oven.</text></item></parent_chain><comment><author>dbatten</author><text>1) Cheap junk food is marketed, whereas cheap staples aren&#x27;t. Basic white&#x2F;brown rice is a commodity. There&#x27;s no differentiation, and ultra-slim margins, so nobody&#x27;s trying to market it to you. Easy Mac on the other hand...<p>2) Taste. To us adults (and mostly upper-middle class adults to boot) on HN, rice and beans may sound more appealing than cheap mac and cheese with mystery meat. Take a poll of 3-year-olds and you may get a different result.<p>3) Ease. Staples like rice and beans are only <i>really</i> cheap when you&#x27;re buying them in dry form, bulk. This means you need to season them and cook them, which can be a long process. (My wife and I recently tried to save on black beans by cooking ourselves rather than buying canned. The result was good, but by the time you chop onions for seasoning, add other ingredients, monitor the beans, drain and cool the beans, and wash the crock pot, cutting board, knife, various spoons and measuring implements, etc., well, it ends up being a lot of work.) If you&#x27;re a single parent working two jobs to support your kids, 5 minutes of preparation is meaningfully better than even, say, 20 minutes of preparation.</text></comment> | <story><title>The Food of My Youth</title><url>https://www.nybooks.com/daily/2018/07/09/the-food-of-my-youth/</url></story><parent_chain><item><author>joefourier</author><text>Why are stereotypical American &quot;low-income&quot; foods largely processed meats and junk food (e.g. boxed macaroni and cheese, spam, Vienna sausages), instead of staples like rice, oats, beans, lentils, etc. which also keep easily but are far more nutritious, tasty and just as cheap, if not cheaper on a per-calorie basis?<p>This is genuine curiosity, as I grew up in a region where the average income was much lower than the USA, but the food consumed by lower-income individuals was much healthier and fresher. We&#x27;d eat things like red beans and rice, curries, acar (a type of spiced pickle) and often cook meat over wood fires to save money on gas for the oven.</text></item></parent_chain><comment><author>tbomb</author><text>I think the reason is 2 part.<p>Part 1 being often these &quot;junk&quot; processed foods are cheaper to buy for more quantity. eg. a box of macaroni and cheese can feed a family of 4 on $2USD. Possibly mix in a couple hotdogs which on the cheaper end can cost under $5USD for 16. in the end your feeding a family of 4 dinner for under $5.<p>Part 2 being time. Typically in lower income families (I grew up on the low end of the &quot;middle class&quot;) both parents are working. my mother didn&#x27;t have the time or energy to come home from and 8-10 hour work day and then cook for the next 90 minutes. that same box of macaroni and cheese (with hotdogs) can be made in ~30 minutes with little attention to it. and once my sister and I were old enough to help out, it was something we could do for her.<p>Thats my anecdotal and oversimplified answer anyway :)</text></comment> |
21,065,129 | 21,063,121 | 1 | 3 | 21,062,799 | train | <story><title>OSSU: A path to a free self-taught education in computer science</title><url>https://github.com/ossu/computer-science</url></story><parent_chain></parent_chain><comment><author>ChuckMcM</author><text>As someone who has lived through a big chunk of the &#x27;computer science&#x27; lifecycle arc :-) My perspective is a bit different here.<p>&quot;Programming&quot;, or the skill of writing specifications that can be translated via software into product has come a long way from the 60&#x27;s to the present day. People I meet, interview, and work with, often fall into three broad chunks of the spectrum.<p>At one end there are &#x27;coders&#x27;, who are essentially cooks, they take previously written code, adapt it to their requirements, attach it to third party libraries and ship the end result. They have been essential to the boom in Internet companies for decades because they ship a lot of code and they are relatively inexpensive (with respect to the expected generated revenue) to hire. When their code doesn&#x27;t work as they expect they generally iterate on it using other suggested solutions until they arrive at one that operates the way they need&#x2F;want it to.<p>In the middle are &#x27;engineers&#x27; who build more at the system level and can fill in the gaps with third party software when needed but are also fully capable of generating the required capability starting with a blank screen. When the system doesn&#x27;t work they expect they can analyze it from first principles to get to the root cause of the problem.<p>At the other end are &#x27;scientists&#x27; who think about the problems of the nature of computing. These folks rewrite an algorithm in three different ways to understand how different compute architectures might execute it. Driven by the joy of discovering new insights about how computers work, if something they build doesn&#x27;t work they are delighted because it has illuminated a gap in their understanding that can be productively filled.<p>Different educational settings are useful for addressing the goals of the student, and in my experience those goals will be different depending on where on the spectrum of &#x27;programmer&#x27; they see themselves.</text></comment> | <story><title>OSSU: A path to a free self-taught education in computer science</title><url>https://github.com/ossu/computer-science</url></story><parent_chain></parent_chain><comment><author>tombert</author><text>This is a bit hypocritical coming from a self-taught dropout, but I have kind of grown to dislike a lot of these &quot;learn programming on your own!&quot; things&#x2F;bootcamps&#x2F;courses.<p>This isn&#x27;t because of some idea that it&#x27;s bad to learn programming for fun, but more that I think it&#x27;s kind of reductive to try and squeeze things down to a streamlined lesson to begin with...There&#x27;s a reason college takes four or more years; it takes a long time for these fundamentals to really sink in, and moreover, a lot of the &quot;extra&quot; classes you&#x27;re required to take actually <i>do</i> inform your perspective on a lot of career stuff. For example, I hated taking philosophy classes and thought they were &quot;pointless&quot;, but I recently realized how much they have helped me with logical thinking, and being able to justify decisions I&#x27;m forced to make.<p>If you&#x27;re learning to code just for fun, these things are totally fine and can be incredibly fun, but if you&#x27;re learning to code for a job, <i>please</i> don&#x27;t treat these things as an &quot;alternative&quot; to college. Any kind of self-learning system almost universally requires a huge amount of self-study, probably more than a university, if you want to become any good at this stuff.</text></comment> |
20,342,938 | 20,342,655 | 1 | 2 | 20,341,176 | train | <story><title>KKR has acquired Corel, reportedly for $1B</title><url>https://techcrunch.com/2019/07/02/kkr-has-acquired-corel-including-its-recent-acquisition-parallels-reportedly-for-1b/</url></story><parent_chain><item><author>TheOtherHobbes</author><text>PSP was one of the few products that made Windows productive. It loaded quickly, worked smoothly, and the UI was elegant in a way that the PS UI still isn&#x27;t.<p>Many things about PS still irritate users in a &quot;That&#x27;s just wrong, stupid, and annoyingly poorly thought out, and why do you keep adding pointless new features instead of fixing this?&quot; kind of a way. [1]<p>PSP didn&#x27;t do as much, but a lot of what it did do was more transparent and effortless.<p>When Corel bought it, they turned it into PSP Pro X Plus Special Edition Platinum Millennium Marketing Bullshit Edition. All kinds of cruft appeared, some of it was entertaining, very little of it was useful, none of it was elegant or beautiful.<p>[1] PSP had a &quot;Create new from clipboard&quot; option which created a new image from the clipboard with a single click. Boom. Done. In PS you have to select a new placeholder from a list of all other possible sizes, possibly after some scrolling and selecting, and then manually paste the image into the document. And now you have an image layer and a background layer, which is often not what you want.</text></item><item><author>chao-</author><text>As an adamant aficionado of PSP over Photoshop, I never forgave Corel for buying Jasc and turning PSP into just another CorelDRAW. If I had wanted CorelDRAW, I would have purchased CorelDRAW. I did not. I wanted Paintshop Pro, and I had it, for a time.<p>A decade and a half later, though, my feelings are mostly numb.<p>Edit: Why did I like PSP over PS? At the time, PSP had better support for my tablet, and had hotkeys that made more intuitive sense for me. 
I did use PS for a while, sparingly from 2010 to 2015. In the long run, I don&#x27;t do art anymore, and mostly just web design, for which my company uses Figma anyway. Occasionally we might contact something from a UX designer who uses Sketch, but we just import it into Figma anyway. Photoshop might as well not exist as far as I am concerned.</text></item><item><author>atombender</author><text>Corel seems like it&#x27;s become a graveyard of old programs I used in the 1990s, like WinZip, PaintShop Pro and, amazingly, WordPerfect. And, of course, Corel Draw.<p>Oh, and this September will see the 40-year anniversary of WordPerfect [1]. (Though WP for DOS only came out in 1982.)<p>I wonder what their strategy is with this acquisition, since there must be some money in there still.<p>[1] <a href="http:&#x2F;&#x2F;www.columbia.edu&#x2F;~em36&#x2F;wpdos&#x2F;chronology.html" rel="nofollow">http:&#x2F;&#x2F;www.columbia.edu&#x2F;~em36&#x2F;wpdos&#x2F;chronology.html</a></text></item></parent_chain><comment><author>GlennS</author><text>Another piece of software that I thought made Windows productive in the 90s was Microsoft Publisher.<p>We got this bundled with...something? Maybe a printer? And I used it a lot for schoolwork. It was clean, neat and powerful. You could use it to layout a page really easily.<p>At some point they stopped selling it to normal people and brought out Microsoft Home Publishing instead. And of course it was complete crap by comparison.<p>I have since discovered that Adobe Illustrator is better than both.</text></comment>
It loaded quickly, worked smoothly, and the UI was elegant in a way that the PS UI still isn&#x27;t.<p>Many things about PS still irritate users in a &quot;That&#x27;s just wrong, stupid, and annoyingly poorly thought out, and why do you keep adding pointless new features instead of fixing this?&quot; kind of a way. [1]<p>PSP didn&#x27;t do as much, but a lot of what it did do was more transparent and effortless.<p>When Corel bought it, they turned it into PSP Pro X Plus Special Edition Platinum Millennium Marketing Bullshit Edition. All kinds of cruft appeared, some of it was entertaining, very little of it was useful, none of it was elegant or beautiful.<p>[1] PSP had a &quot;Create new from clipboard&quot; option which created a new image from the clipboard with a single click. Boom. Done. In PS you have to select a new placeholder from a list of all other possible sizes, possibly after some scrolling and selecting, and then manually paste the image into the document. And now you have an image layer and a background layer, which is often not what you want.</text></item><item><author>chao-</author><text>As an adamant aficionado of PSP over Photoshop, I never forgave Corel for buying Jasc and turning PSP into just another CorelDRAW. If I had wanted CorelDRAW, I would have purchased CorelDRAW. I did not. I wanted Paintshop Pro, and I had it, for a time.<p>A decade and a half later, though, my feelings are mostly numb.<p>Edit: Why did I like PSP over PS? At the time, PSP had better support for my tablet, and had hotkeys that made more intuitive sense for me. I did use PS for a while, sparingly from 2010 to 2015. In the long run, I don&#x27;t do art anymore, and mostly just web design, for which my company uses Figma anyway. Occasionally we might contact something from a UX designer who uses Sketch, but we just import it into Figma anyway. 
Photoshop might as well not exist as far as I am concerned.</text></item><item><author>atombender</author><text>Corel seems like it&#x27;s become a graveyard of old programs I used in the 1990s, like WinZip, PaintShop Pro and, amazingly, WordPerfect. And, of course, Corel Draw.<p>Oh, and this September will see the 40-year anniversary of WordPerfect [1]. (Though WP for DOS only came out in 1982.)<p>I wonder what their strategy is with this acquisition, since there must be some money in there still.<p>[1] <a href="http:&#x2F;&#x2F;www.columbia.edu&#x2F;~em36&#x2F;wpdos&#x2F;chronology.html" rel="nofollow">http:&#x2F;&#x2F;www.columbia.edu&#x2F;~em36&#x2F;wpdos&#x2F;chronology.html</a></text></item></parent_chain><comment><author>andybak</author><text>&gt; PSP had a &quot;Create new from clipboard&quot; option<p>Since forever (probably since PS 2.4 or so) choosing &quot;New Document&quot; in Photoshop defaults to the size of the document on the clipboard. My workflow is: Copy in one app &gt; Switch to PS &gt; CMD + N &gt; Hit Enter &gt; CMD + V</text></comment>
12,195,372 | 12,195,009 | 1 | 2 | 12,194,106 | train | <story><title>Breakthrough solar cell captures CO2 and sunlight, produces burnable fuel</title><url>https://news.uic.edu/breakthrough-solar-cell-captures-co2-and-sunlight-produces-burnable-fuel</url></story><parent_chain><item><author>Animats</author><text>&quot;Nanotechnology&quot; again. It&#x27;s just surface chemistry, people. It&#x27;s nice that they can do this in the lab, but as usual, the PR is excessive. &quot;The ability to turn CO2 into fuel at a cost comparable to a gallon of gasoline would render fossil fuels obsolete.&quot; At least get to pilot plant stage before issuing statements like that.<p>Especially since UC Berkeley announced a similar breakthrough last year.[1] Wikipedia points out that artificial photosynthesis was first achieved in 1912, and some of the same claims were made back then. There are lots of artificial photosynthesis projects. One of the best was in 2011, the first &quot;artificial leaf&quot;[3]. It ran for 44 hours.<p>The usual questions apply. How efficient is this? Does the catalyst get used up, or crud up with contaminants, and if so, how fast? What limits the life of the system? What are the costs like? (Excessive catalyst cost has been a problem.)
Is this better than all the other groups doing similar work?<p>Artificial photosynthesis may be useful someday, but this probably isn&#x27;t the big breakthrough that makes it a commercial product. That may come, though.<p>[1] <a href="http:&#x2F;&#x2F;newscenter.lbl.gov&#x2F;2015&#x2F;04&#x2F;16&#x2F;major-advance-in-artificial-photosynthesis&#x2F;" rel="nofollow">http:&#x2F;&#x2F;newscenter.lbl.gov&#x2F;2015&#x2F;04&#x2F;16&#x2F;major-advance-in-artifi...</a>
[2] <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Artificial_photosynthesis" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Artificial_photosynthesis</a>
[3] <a href="https:&#x2F;&#x2F;www.acs.org&#x2F;content&#x2F;acs&#x2F;en&#x2F;pressroom&#x2F;newsreleases&#x2F;2011&#x2F;march&#x2F;debut-of-the-first-practical-artificial-leaf.html" rel="nofollow">https:&#x2F;&#x2F;www.acs.org&#x2F;content&#x2F;acs&#x2F;en&#x2F;pressroom&#x2F;newsreleases&#x2F;20...</a></text></item></parent_chain><comment><author>nhebb</author><text>&gt; as usual, the PR is excessive<p>I think articles that over hype lab results make it to the top of HN because there is no downvote button and flagging them doesn&#x27;t seem appropriate. I wish there was a &quot;meh&quot; option.</text></comment> | <story><title>Breakthrough solar cell captures CO2 and sunlight, produces burnable fuel</title><url>https://news.uic.edu/breakthrough-solar-cell-captures-co2-and-sunlight-produces-burnable-fuel</url></story><parent_chain><item><author>Animats</author><text>&quot;Nanotechnology&quot; again. It&#x27;s just surface chemistry, people. It&#x27;s nice that they can do this in the lab, but as usual, the PR is excessive. &quot;The ability to turn CO2 into fuel at a cost comparable to a gallon of gasoline would render fossil fuels obsolete.&quot; At least get to pilot plant stage before issuing statements like that.<p>Especially since UC Berkeley announced a similar breakthrough last year.[1] Wikipedia points out that artificial photosynthesis was first achieved in 1912, and some of the same claims were made back then. There are lots of artificial photosynthesis projects. One of the best was in 2011, the first &quot;artificial leaf&quot;[3]. It ran for 44 hours.<p>The usual questions apply. How efficient is this? Does the catalyst get used up, or crud up with contaminants, and if so, how fast? What limits the life of the system? What are the costs like? (Excessive catalyst cost has been a problem.)
Is this better than all the other groups doing similar work?<p>Artificial photosynthesis may be useful someday, but this probably isn&#x27;t the big breakthrough that makes it a commercial product. That may come, though.<p>[1] <a href="http:&#x2F;&#x2F;newscenter.lbl.gov&#x2F;2015&#x2F;04&#x2F;16&#x2F;major-advance-in-artificial-photosynthesis&#x2F;" rel="nofollow">http:&#x2F;&#x2F;newscenter.lbl.gov&#x2F;2015&#x2F;04&#x2F;16&#x2F;major-advance-in-artifi...</a>
[2] <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Artificial_photosynthesis" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Artificial_photosynthesis</a>
[3] <a href="https:&#x2F;&#x2F;www.acs.org&#x2F;content&#x2F;acs&#x2F;en&#x2F;pressroom&#x2F;newsreleases&#x2F;2011&#x2F;march&#x2F;debut-of-the-first-practical-artificial-leaf.html" rel="nofollow">https:&#x2F;&#x2F;www.acs.org&#x2F;content&#x2F;acs&#x2F;en&#x2F;pressroom&#x2F;newsreleases&#x2F;20...</a></text></item></parent_chain><comment><author>mistercow</author><text>&gt;At least get to pilot plant stage before issuing statements like that.<p>To be fair, making statements like that can help you get to that stage. It sucks that this is how it works, but playing the game is understandable.</text></comment> |
33,416,050 | 33,415,624 | 1 | 2 | 33,414,565 | train | <story><title>The Browser Company’s Darin Fisher thinks it’s time to reinvent the browser</title><url>https://www.theverge.com/2022/10/31/23428862/arc-browser-web-company-darin-fisher</url></story><parent_chain><item><author>Vermeulen</author><text>Reinventing the browser should be simplifying it to only what it needs to be: executing + rendering remote code in a sandboxed window.<p>Make a new browser that is just WebGPU + WebAssembly, no more DOM. Would already run the next wave of games that will come targeting that.<p>Then for regular websites, have them deliver a HTML rendering engine written in Webassembly for how they want to be rendered. No more need to test websites on a range of browsers</text></item></parent_chain><comment><author>paxys</author><text>Great idea, every website should write its own WebGPU-based rendering engine from scratch.<p>But that will be way too much effort for every kind of project, so what can really happen is some large players write engines and define interfaces for them, and we can all use them for our sites.<p>But different engines rendering the same data differently is probably going to be confusing (what if I want to swap them to compare performance?), so we should also standardize on some rules for how they do that. Maybe define some kind of document model and styling language?<p>Also, downloading all these engines on every page load might be too wasteful, so we can pick a couple of the dominant ones and bundle them into the browser itself. 
That way sites just have to send over structured content and the browser can load them all without a problem.<p>If we can ever get to this it will be a great web experience!</text></comment> | <story><title>The Browser Company’s Darin Fisher thinks it’s time to reinvent the browser</title><url>https://www.theverge.com/2022/10/31/23428862/arc-browser-web-company-darin-fisher</url></story><parent_chain><item><author>Vermeulen</author><text>Reinventing the browser should be simplifying it to only what it needs to be: executing + rendering remote code in a sandboxed window.<p>Make a new browser that is just WebGPU + WebAssembly, no more DOM. Would already run the next wave of games that will come targeting that.<p>Then for regular websites, have them deliver a HTML rendering engine written in Webassembly for how they want to be rendered. No more need to test websites on a range of browsers</text></item></parent_chain><comment><author>wruza</author><text>I don’t think this is a good idea. We’ll run into a situation with many different not-dom guis having the same set of problems that existing not-dom guis had for all these years. Think gnome-like drama for the half of the internet. Dom&#x2F;css is no doubt a mountain of something stinky, but at least it is a slow moving standard and never had any politics or bdfl opinions involved. Also it has a set of features (e.g. accessibility, ad blocking) which would be hard to implement everywhere in the same way. Maybe something more low-level than dom but still integratable with higher-level apis and features could be built into a new browser paradigm.</text></comment> |
13,187,365 | 13,186,899 | 1 | 3 | 13,186,132 | train | <story><title>Recent efforts in V8 and DevTools for Node.js</title><url>http://v8project.blogspot.com/2016/12/v8-nodejs.html</url></story><parent_chain></parent_chain><comment><author>franciscop</author><text>&gt; &quot;Async&#x2F;await will land in Node with the next V8 update.&quot;<p>You can actually try the async-await functionality with Node Current (7.2.1 as of this writing) by passing the flag --harmony-async-await:<p><pre><code> node --harmony-async-await app.js
</code></pre>
It will be available without flags in the LTS release in March (that is in 3 months). Say goodbye to Callback Hell, the future of Node.js is looking pretty good.</text></comment> | <story><title>Recent efforts in V8 and DevTools for Node.js</title><url>http://v8project.blogspot.com/2016/12/v8-nodejs.html</url></story><parent_chain></parent_chain><comment><author>hackcrafter</author><text>I was confused when writing a NodeJs script last night which ES* features were supported in the node runtime I had installed (6.9.x).<p>This blog posts mentioned Node already fully supports ES6.<p>Is there a table anywhere that lists ES6&#x2F;ES7 features and which nodejs version supports it natively?<p>I assume async&#x2F;await (which I love and use from TypeScript) will only be available in Node 7.x and not 6.x.<p>The Node.js home-page makes node 7.x look scary unstable, is there any real-world caveats to using it?</text></comment> |
19,148,042 | 19,147,883 | 1 | 3 | 19,144,053 | train | <story><title>My Life at 47 Is Back to What It Was Like at 27</title><url>https://medium.com/s/meghan-daum/my-life-at-47-is-back-to-what-it-was-like-at-27-eb7a071b3598</url></story><parent_chain></parent_chain><comment><author>IronWolve</author><text>One of the things about getting older and divorced, people used to join organizations at higher rates, now we moved to online communities and an online&#x2F;mobile life.<p>With a fulltime job and you working till death, you don&#x27;t have time to do much. The local eagles&#x2F;lions&#x2F;moose&#x2F;etc club, you at least got to go out and have dinner, chat with people, make friends, etc.<p>I checked out the local Masons lodge, and I was at least 20 years younger so I didn&#x27;t ask to be sponsored. My friend in Ireland joined the Masons, he says the lodges are filled with younger people from 20s and up.<p>I read the &quot;Starting over&quot; and think, did you change your hobbies and activities too? There&#x27;s so much to do now, how can one be bored. Even a trip to the library and parks is cheap and free.</text></comment>
39,758,627 | 39,757,798 | 1 | 2 | 39,755,471 | train | <story><title>Java 22 Released</title><url>https://mail.openjdk.org/pipermail/jdk-dev/2024-March/008827.html</url></story><parent_chain></parent_chain><comment><author>jimbokun</author><text>Maybe my favorite feature in this release:<p><a href="https:&#x2F;&#x2F;openjdk.org&#x2F;jeps&#x2F;463" rel="nofollow">https:&#x2F;&#x2F;openjdk.org&#x2F;jeps&#x2F;463</a><p>Finally solves the inscrutable Hello World program!<p>Yes, it&#x27;s just ergonomics for early beginners. But could be the difference in whether or not someone new to programming sticks with Java or not.</text></comment> | <story><title>Java 22 Released</title><url>https://mail.openjdk.org/pipermail/jdk-dev/2024-March/008827.html</url></story><parent_chain></parent_chain><comment><author>ecshafer</author><text>It isn&#x27;t a &quot;Sexy&quot; PL change, but a full foreign function interface will be a huge change. In my experience, relying on the old java JNI based libraries seems to be one of the biggest things that break in upgrades. So I am hoping this will reduce the maintenance burden of Java.</text></comment> |
12,165,588 | 12,165,012 | 1 | 2 | 12,164,021 | train | <story><title>The Apple Goes Mushy Part I: OS X's Interface Decline</title><url>http://www.nicholaswindsorhoward.com/blog-directory/2016/7/20/the-apple-goes-mushy-part-i</url></story><parent_chain><item><author>Angostura</author><text>&gt; The use of metaphors worked when people didn&#x27;t know what a computer was, and the only way to make its UI make sense was to mimic real-world objects.<p>I disagree strongly. If you were new to a machine, is there <i>anything</i> in the new Photos icon that tells you that it anything to do with images, photos or image manipulation? No.<p>The new icon for Pages is fine by the way,</text></item><item><author>56k</author><text>This article doesn&#x27;t make any sense.<p>The use of metaphors worked when people didn&#x27;t know what a computer was, and the only way to make its UI make sense was to mimic real-world objects. Now, this is no longer necessary. It&#x27;s been 40 years.<p>While I agree that new macOS icons aren&#x27;t great (see the Game Center icon), the old ones were silly. I&#x27;m 35 and I have probably seen an actual contact book only once when I was little. It doesn&#x27;t make sense to have a skeuomorphic contact book as an icon for Contacts. The old icon for Pages? I don&#x27;t even know what that is, I&#x27;m not into calligraphy.<p>The only thing I&#x27;d agree with is hiding UI controls. Apple has been making its apps less usable to make them look pretty, hiding important elements in an attempt to declutter the interface. I hate how Safari hides the full address in the address bar, for instance. Or, how they removed scrollbars and force people to actually scroll every piece of the interface to check if there&#x27;s something more to see, while before you could tell just by looking at scrollbars. 
Of course, there are settings to go back to the old behavior for both my examples, so power users are fine, but I fail to see how these moves improve things for regular users.<p>I also disagree that Steve Job&#x27;s death was detrimental to macOS&#x27;s UI. He was the one who kept Apple looking outdated with his obsession for skeuomorphism, I&#x27;m glad they went for a flatter look right after his death.<p>Of course, everyone&#x27;s taste is different, but I still think this is a bad article.</text></item></parent_chain><comment><author>yoz-y</author><text>But if you have never seen a camera then you would not know what the old icon means either. Yes there is a picture of a palm tree there but that could be anything (given that all icons are basically pictures of something).<p>Looking at my desk and my habits there are actually very few objects that I use for work most of it happens on screen. All in all, I think good icon design will become harder.</text></comment> | <story><title>The Apple Goes Mushy Part I: OS X's Interface Decline</title><url>http://www.nicholaswindsorhoward.com/blog-directory/2016/7/20/the-apple-goes-mushy-part-i</url></story><parent_chain><item><author>Angostura</author><text>&gt; The use of metaphors worked when people didn&#x27;t know what a computer was, and the only way to make its UI make sense was to mimic real-world objects.<p>I disagree strongly. If you were new to a machine, is there <i>anything</i> in the new Photos icon that tells you that it anything to do with images, photos or image manipulation? No.<p>The new icon for Pages is fine by the way,</text></item><item><author>56k</author><text>This article doesn&#x27;t make any sense.<p>The use of metaphors worked when people didn&#x27;t know what a computer was, and the only way to make its UI make sense was to mimic real-world objects. Now, this is no longer necessary. 
It&#x27;s been 40 years.<p>While I agree that new macOS icons aren&#x27;t great (see the Game Center icon), the old ones were silly. I&#x27;m 35 and I have probably seen an actual contact book only once when I was little. It doesn&#x27;t make sense to have a skeuomorphic contact book as an icon for Contacts. The old icon for Pages? I don&#x27;t even know what that is, I&#x27;m not into calligraphy.<p>The only thing I&#x27;d agree with is hiding UI controls. Apple has been making its apps less usable to make them look pretty, hiding important elements in an attempt to declutter the interface. I hate how Safari hides the full address in the address bar, for instance. Or, how they removed scrollbars and force people to actually scroll every piece of the interface to check if there&#x27;s something more to see, while before you could tell just by looking at scrollbars. Of course, there are settings to go back to the old behavior for both my examples, so power users are fine, but I fail to see how these moves improve things for regular users.<p>I also disagree that Steve Job&#x27;s death was detrimental to macOS&#x27;s UI. He was the one who kept Apple looking outdated with his obsession for skeuomorphism, I&#x27;m glad they went for a flatter look right after his death.<p>Of course, everyone&#x27;s taste is different, but I still think this is a bad article.</text></item></parent_chain><comment><author>mmustapic</author><text>&gt; is there anything in the new Photos icon that tells you that it anything to do with images, photos or image manipulation<p>Yes, it says &quot;Photos&quot; below it</text></comment> |
25,778,626 | 25,776,916 | 1 | 3 | 25,775,786 | train | <story><title>Dungeon Magazine</title><url>https://archive.org/details/dungeonmagazine?sort=titleSorter</url></story><parent_chain><item><author>Paul_S</author><text>The history of Dungeon and AD&amp;D is an interesting lesson in licensing and IP ownership. Also a lesson in power of brands and marketing.<p>Pathfinder is the real successor to AD&amp;D and Paizo had the subscribers communication channels to fans and yet people remained with WotC and moved on to the new WotC system whose only connection with AD&amp;D was the name. Back when I was a more active DM this puzzled me to no end.</text></item></parent_chain><comment><author>ajross</author><text>Hah, I love when HN strays into obscure RPG industry flame war.<p>See, I think this is wrong. People didn&#x27;t, in fact, stick with D&amp;D at the point where Pathfinder launched. Pathfinder was (relative to the third party RPG market) a huge success. And 4e D&amp;D was objectively a failure in the market (though sure, it sold more than Pathfinder).<p>Where Wizards won was with the 5e rules. And the 5e rules are better than Pathfinder in fundamental ways that lead to higher sales and more player satisfaction.<p>And the reason is that 5e is, really for the first time since the Holmes Basic Set of 1978, <i>accessible to the mass market</i>. Kids like D&amp;D again! Prior to covid, I ran a 2-year campaign for my 10-12 year old son and a bunch of his friends and they <i>loved</i> it. Some of them are hooked hard now. Likewise, couples play 5e; random groups in offices. You can watch people play it on the internet, and apparently that&#x27;s a big thing too. 
There are communities of women and queer folks plugging a game that used to be associated mostly with unwashed cloistered incels.<p>It&#x27;s a straightforward, easy game that gets out of the way and exposes the actual fun part of in-person RPGs to people who would never have had the patience to read through the tomes full of prestige classes and feats that 3.x&#x2F;Pathfinder require.<p>Basically: with 5e WotC finally got back to the promise we all saw in the early 80&#x27;s. Pathfinder is a fine game, but it&#x27;s really just trying to hang on to a player culture from 2004 that doesn&#x27;t have a lot of growth in it.</text></comment> | <story><title>Dungeon Magazine</title><url>https://archive.org/details/dungeonmagazine?sort=titleSorter</url></story><parent_chain><item><author>Paul_S</author><text>The history of Dungeon and AD&amp;D is an interesting lesson in licensing and IP ownership. Also a lesson in power of brands and marketing.<p>Pathfinder is the real successor to AD&amp;D and Paizo had the subscribers communication channels to fans and yet people remained with WotC and moved on to the new WotC system whose only connection with AD&amp;D was the name. Back when I was a more active DM this puzzled me to no end.</text></item></parent_chain><comment><author>mcv</author><text>Pathfinder 1st edition was a direct successor (near-clone, even) of D&amp;D 3, which was created by WotC. That&#x27;s where the big break from AD&amp;D happened. D&amp;D 5 has about as much in common with AD&amp;D as D&amp;D 3 and Pathfinder have. Perhaps more, because D&amp;D 5 was in some ways (though not all) an explicit attempt to go back to earlier, less complex editions of D&amp;D.</text></comment> |
10,933,499 | 10,933,153 | 1 | 2 | 10,930,194 | train | <story><title>Announcing Minecraft: Education Edition</title><url>http://education.minecraft.net/announce011916/</url></story><parent_chain><item><author>fluxquanta</author><text>&gt;Its a good starting place for showing kids in a familiar environment what they can do with programming.<p>Is it, though? Admittedly it&#x27;s been a while but I feel like once you get beyond the absolute basics of circuitry Minecraft gets ridiculously complicated in terms of building the structures necessary to establish the circuitry (just search YouTube for the monstrosities that are necessary to build simple adders in Minecraft). At that point you&#x27;re not really learning anything except how to play Minecraft.<p>When a student&#x27;s at middle or high school level I feel like LEGO Mindstorms or even Arduinos would be a better option.</text></item><item><author>a2tech</author><text>Many middle schools&#x2F;high schools in my area teach computer science (programming mostly) with Minecraft. Its a good starting place for showing kids in a familiar environment what they can do with programming.</text></item><item><author>fluxquanta</author><text>I&#x27;m very familiar with Minecraft -- I still have the PayPal receipt for when I sent 10 Euro to Mojang to get in on the beta back in 2010 (which seems like ages ago).<p>But, I ask innocently and without agenda, how is this used as a tool for educators? I know kids love it, and I understand why, but quotes like<p>&gt;Since the introduction of Minecraft to the classroom, educators around the world have been using Minecraft to effectively teach students everything from STEM subjects to art and poetry.<p>make me wonder what exactly is the benefit of using Minecraft in the classroom from an educational perspective? 
Are lessons actually being planned around Minecraft, or is it like the video games I played in &quot;computer lab&quot; which were just a convenient way to get a group of 30 second graders to be quiet for 45 minutes? What&#x27;s the benefit of using Minecraft over traditional teaching tools, other than shoehorning lessons into something kids are already familiar with?<p>I realize that this comment may sound like it&#x27;s coming from a place of bitter skepticism, but I&#x27;m genuinely curious.</text></item></parent_chain><comment><author>Bjartr</author><text>Your comment is spot on for vanilla Minecraft, however this is MinecraftEdu which includes ComputerCraftEdu which has both a tile-based and text-based programming environment (based on Lua). It seems fairly well designed[1].<p>[1] <a href="http:&#x2F;&#x2F;services.minecraftedu.com&#x2F;wiki&#x2F;Teaching_with_ComputerCraftEdu" rel="nofollow">http:&#x2F;&#x2F;services.minecraftedu.com&#x2F;wiki&#x2F;Teaching_with_Computer...</a></text></comment> | <story><title>Announcing Minecraft: Education Edition</title><url>http://education.minecraft.net/announce011916/</url></story><parent_chain><item><author>fluxquanta</author><text>&gt;Its a good starting place for showing kids in a familiar environment what they can do with programming.<p>Is it, though? Admittedly it&#x27;s been a while but I feel like once you get beyond the absolute basics of circuitry Minecraft gets ridiculously complicated in terms of building the structures necessary to establish the circuitry (just search YouTube for the monstrosities that are necessary to build simple adders in Minecraft). 
At that point you&#x27;re not really learning anything except how to play Minecraft.<p>When a student&#x27;s at middle or high school level I feel like LEGO Mindstorms or even Arduinos would be a better option.</text></item><item><author>a2tech</author><text>Many middle schools&#x2F;high schools in my area teach computer science (programming mostly) with Minecraft. Its a good starting place for showing kids in a familiar environment what they can do with programming.</text></item><item><author>fluxquanta</author><text>I&#x27;m very familiar with Minecraft -- I still have the PayPal receipt for when I sent 10 Euro to Mojang to get in on the beta back in 2010 (which seems like ages ago).<p>But, I ask innocently and without agenda, how is this used as a tool for educators? I know kids love it, and I understand why, but quotes like<p>&gt;Since the introduction of Minecraft to the classroom, educators around the world have been using Minecraft to effectively teach students everything from STEM subjects to art and poetry.<p>make me wonder what exactly is the benefit of using Minecraft in the classroom from an educational perspective? Are lessons actually being planned around Minecraft, or is it like the video games I played in &quot;computer lab&quot; which were just a convenient way to get a group of 30 second graders to be quiet for 45 minutes? What&#x27;s the benefit of using Minecraft over traditional teaching tools, other than shoehorning lessons into something kids are already familiar with?<p>I realize that this comment may sound like it&#x27;s coming from a place of bitter skepticism, but I&#x27;m genuinely curious.</text></item></parent_chain><comment><author>jerf</author><text>It would depend on what this initiative does. I&#x27;m hoping this looks more like an official path to modding-like tools that may not offer the whole complexity of the world, but would allow things like programmatic world generation and possibly interactive entity AI, etc. 
I would tend to agree that redstone, for all its possibilities, is not really a very good educational mechanic beyond the basics on its own. In Brooks&#x27; terminology, redstone has <i>way</i> too much accidental complexity for what little essential complexity it could expose... it&#x27;s the opposite of what I&#x27;d want in my education. What little time I&#x27;ve spent fiddling with it has involved <i>way</i> too much very, very, very low-level physical routing (and a lot of time wishing I could stick redstone on the <i>side</i> of a block).</text></comment> |
18,618,566 | 18,618,844 | 1 | 2 | 18,618,359 | train | <story><title>Facebook removed post by ex-manager who said site 'failed' black people</title><url>https://www.theguardian.com/technology/2018/dec/04/facebook-mark-s-luckie-african-american-workers-users</url></story><parent_chain></parent_chain><comment><author>aphextron</author><text>Being black in this industry really, really sucks. I&#x27;m certain that for the vast majority of people I interact with on a daily basis, it doesn&#x27;t even cross their minds. But living with an unending fear of having your entire self worth immediately judged at the color of your skin is something that no other race lives with in this industry. I have to <i>immediately</i> force myself to be twice as charming, friendly, witty, and intelligent sounding as the equivalent white or Asian person just to be taken seriously. I don&#x27;t really know that there&#x27;s any solution to this, it&#x27;s just something we have to live with.</text></comment> | <story><title>Facebook removed post by ex-manager who said site 'failed' black people</title><url>https://www.theguardian.com/technology/2018/dec/04/facebook-mark-s-luckie-african-american-workers-users</url></story><parent_chain></parent_chain><comment><author>sonnyblarney</author><text>&quot;Create internal systems for employees to anonymously report microaggressions. &quot;<p>This is the &#x27;happy path&#x27; to ideological totalitarianism, and it&#x27;s a scary statement.<p>In the UC system, it is now considered a &#x27;micro aggression&#x27; to make the statement &#x27;America is a Meritocracy&#x27;. Why? Because it doesn&#x27;t necessarily reflect the fact that for some people it&#x27;s harder than others. Surely - there&#x27;s a lot to be debated about the statement. But that it cannot be said, or is even considered problematic is utterly Orwellian.<p>The author makes the case of &#x27;undue or overly harsh&#x27; criticism of Black employees? 
Unfortunately, this is a difficult thing to measure, and if FB turns into a &#x27;government office&#x27; - it simply won&#x27;t be possible to give even a fair assessment without the fear of being labelled a racist.<p>I&#x27;m actually quite sensitive to aphextron&#x27;s comment above about the insecurities of being black in tech, and there is work to do ... but I think the intersectional &#x2F; authoritarian approach, especially those whereby we &#x27;assume racism&#x27; is wrong.<p>I don&#x27;t think the Colin Kaepernick approach is going to work on this, I think it&#x27;s just going to take a while.</text></comment>
21,482,447 | 21,482,425 | 1 | 2 | 21,481,316 | train | <story><title>Netflix, HBO and Cable Giants Are Coming for Password Sharers</title><url>https://www.bloomberg.com/news/articles/2019-11-08/netflix-hbo-and-cable-giants-are-coming-for-password-cheats</url></story><parent_chain><item><author>magashna</author><text>&gt; “I feel like I’m beating my head against the wall,” Tom Rutledge, the chief executive officer of Charter Communications Inc., said during an earnings call last month. “It’s just too easy to get the product without paying for it.”<p>Boo hoo. All these companies thinking they can split all their content up and get people to pay for every single service are in for a (totally expected) surprise.<p>I&#x27;ll share my plex server with friends and family long before I pay for more than 1-2 services.</text></item></parent_chain><comment><author>basch</author><text>I think the bigger issue than paying for multiple services is duplicate licensing. If I sign up for DirecTV, Amazon Prime, and CBS how many shows am I triple licensing? If all these companies want me to sign up for multiple services, there needs to be a way for them to pass &quot;already licensed&quot; information to each other, and pro-rate their bills accordingly.<p>ATT, Comcast, Disney, CBS already own almost all the historical content. I&#x27;d rather pay a small licensing fee to those 4 and then be able to watch their content on any platform, rather than deal with multiple middlemen that have CBS, Comcast, Disney, ATT licensing costs built into their product price. If I select DirecTV&#x2F;ATT as my &quot;licensing manager&quot; then any time I sign up for Disney+, Peacock+, CBS All Access, I want a discount off one of the two ends (would make more sense to give all my subscription info to DirecTV and get a DirecTV discount. 
Then DirecTV can go to Disney and say &quot;17% of our customers don&#x27;t need to pay Disney licensing costs, we&#x27;ve adjusted our payment to you accordingly.&quot;)</text></comment> | <story><title>Netflix, HBO and Cable Giants Are Coming for Password Sharers</title><url>https://www.bloomberg.com/news/articles/2019-11-08/netflix-hbo-and-cable-giants-are-coming-for-password-cheats</url></story><parent_chain><item><author>magashna</author><text>&gt; “I feel like I’m beating my head against the wall,” Tom Rutledge, the chief executive officer of Charter Communications Inc., said during an earnings call last month. “It’s just too easy to get the product without paying for it.”<p>Boo hoo. All these companies thinking they can split all their content up and get people to pay for every single service are in for a (totally expected) surprise.<p>I&#x27;ll share my plex server with friends and family long before I pay for more than 1-2 services.</text></item></parent_chain><comment><author>criley2</author><text>The entitlement people feel to steal is always surprising. Especially on this board. I understand a lot of reddit is younger and irresponsible so piracy-as-the-norm is a popular trope, but among the engineers whose livelihoods depend on subscription models, we still promote service theft without so much as a hint of shame?<p>If the economics of a service don&#x27;t make sense to you, then do not subscribe. But to turn around and steal from the engineers and artists who work to create these experiences is just indefensible. You are not entitled to television shows.</text></comment>
5,736,516 | 5,736,342 | 1 | 3 | 5,734,680 | train | <story><title>Dear American Consumers: Please don&#x27;t start eating healthfully</title><url>http://blogs.scientificamerican.com/guest-blog/2013/05/19/dear-american-consumers-please-dont-start-eating-healthfully-sincerely-the-food-industry/</url></story><parent_chain><item><author>hcarvalhoalves</author><text>I don&#x27;t think that works too well. There&#x27;s a large factor on satiety which is how hungry, or even happy, you feel at the moment.<p>The most important factor for keeping calories intake low is making many small meals a day, and we won&#x27;t solve that changing food, but lifestyle. The obesity epidemic is a direct result of sitting in an office 9 to 5 (or 6,7,8,...) and making two meals a day full of fat and sugar, because you&#x27;re craving.<p>When you have a better lifestyle, you naturally gravitate towards better food, because you have time to cook (as opposed to pizza&#x2F;snacks) or look for a good dinner (as opposed to drive-thru), have time to go to the grocery store to buy vegetables&#x2F;fruits more frequently (as opposed to industrialized food which doesn&#x27;t spoil), and even time to appreciate food itself, making various small meals a day.<p>EDIT: I would like to know why the downvote. The correlation between obesity and office workers is a recurrent research topic.</text></item><item><author>jcampbell1</author><text>I have a solution to the obesity epidemic that would be realistic to implement. Right now the factors for what we analyse in nutrition are calories, macro nutrient content, and micronutrient content. What we really care about is satiation and satiety. These are words that no one talks about, but get to the core of the issue. If I eat this food, how many calories will I consume before feeling full? (satiation). After eating this food, how long until I feel the need to eat again? (satiety).<p>Unfortunately the research and measurements of these values are thin. We need to fix that. 
We already know things like whole milk better provides satiation and satiety than skim milk, and children that drink whole milk actually have less obesity than skim milk drinkers.<p>If we just measured and labeled foods with a satiety&#x2F;satiation index (what we really care about), then people would actually have a chance to pick better foods. Right now it is damn near impossible to determine if eating eggs and bacon for breakfast is more likely to drive overeating vs cereal. It can be measured, but we just don&#x27;t do it.</text></item></parent_chain><comment><author>vidarh</author><text>&#62; The most important factor for keeping calories intake low is making many small meals a day<p>Is there any research that actually shows that?<p>My own personal experience with leangains (intermittent fasting; taking all my calories in an 8 hour window with lunch being my first and biggest meal) is that I find it far harder to overeat that way than with smaller meals. It&#x27;s &quot;easy&quot; to fit in a snack between small meals, but after lunch I&#x27;m totally sick of food, and it&#x27;s hard for me to even meet my macro nutrient goals (I lift weights; my protein intake is high). Often I have to force myself to take in a second meal to meet my calorie needs (edit: because I simply often don&#x27;t <i>want</i> to eat again before the end of my eating window, around 8pm, after my massive lunch at noon).<p>On the opposite end, I see very few people around me that eat &quot;only&quot; two meals. My co-workers for example, might very well only sit down for a meal twice, but they snack constantly. And their &quot;snacks&quot; are often substantially higher calorie than their main meals. 
When I <i>did</i> try to diet with many small healthy meals, it was a nightmare - I was constantly hungry and constantly craving food.<p>When I do intermittent fasting, I have hunger pangs in the mornings 2-3 days, and then I stop being hungry in the mornings.</text></comment> | <story><title>Dear American Consumers: Please don&#x27;t start eating healthfully</title><url>http://blogs.scientificamerican.com/guest-blog/2013/05/19/dear-american-consumers-please-dont-start-eating-healthfully-sincerely-the-food-industry/</url></story><parent_chain><item><author>hcarvalhoalves</author><text>I don&#x27;t think that works too well. There&#x27;s a large factor on satiety which is how hungry, or even happy, you feel at the moment.<p>The most important factor for keeping calories intake low is making many small meals a day, and we won&#x27;t solve that changing food, but lifestyle. The obesity epidemic is a direct result of sitting in an office 9 to 5 (or 6,7,8,...) and making two meals a day full of fat and sugar, because you&#x27;re craving.<p>When you have a better lifestyle, you naturally gravitate towards better food, because you have time to cook (as opposed to pizza&#x2F;snacks) or look for a good dinner (as opposed to drive-thru), have time to go to the grocery store to buy vegetables&#x2F;fruits more frequently (as opposed to industrialized food which doesn&#x27;t spoil), and even time to appreciate food itself, making various small meals a day.<p>EDIT: I would like to know why the downvote. The correlation between obesity and office workers is a recurrent research topic.</text></item><item><author>jcampbell1</author><text>I have a solution to the obesity epidemic that would be realistic to implement. Right now the factors for what we analyse in nutrition are calories, macro nutrient content, and micronutrient content. What we really care about is satiation and satiety. These are words that no one talks about, but get to the core of the issue. 
If I eat this food, how many calories will I consume before feeling full? (satiation). After eating this food, how long until I feel the need to eat again? (satiety).<p>Unfortunately the research and measurements of these values are thin. We need to fix that. We already know things like whole milk better provides satiation and satiety than skim milk, and children that drink whole milk actually have less obesity than skim milk drinkers.<p>If we just measured and labeled foods with a satiety&#x2F;satiation index (what we really care about), then people would actually have a chance to pick better foods. Right now it is damn near impossible to determine if eating eggs and bacon for breakfast is more likely to drive overeating vs cereal. It can be measured, but we just don&#x27;t do it.</text></item></parent_chain><comment><author>Mvandenbergh</author><text>&#62;The most important factor for keeping calories intake low is making many small meals a day<p>Is that the case? Many countries that have much stricter cultural traditions about when and how much to eat (the French, the Italians) eat 2-3 meals a day. They don&#x27;t eat when they feel hungry, they eat when it is eating time.</text></comment>
20,335,693 | 20,335,721 | 1 | 2 | 20,334,924 | train | <story><title>Cloudflare Network Performance Issues</title><url>https://www.cloudflarestatus.com/incidents/tx4pgxs6zxdr</url></story><parent_chain><item><author>profmonocle</author><text>Once cloudflare.com came back I decided to check out their business SLA, and it&#x27;s not very encouraging:<p>&gt; For any and each Outage Period during a monthly billing period the Company will provide as a Service Credit an amount calculated as follows: Service Credit = (Outage Period minutes * Affected Customer Ratio) ÷ Scheduled Availability minutes<p>- <a href="https:&#x2F;&#x2F;www.cloudflare.com&#x2F;business-sla&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.cloudflare.com&#x2F;business-sla&#x2F;</a><p>So assuming an outage affects 100% of your users (this one seems like it did, but that&#x27;s not clear), they only refund the time the service was offline? According to pingdom this outage lasted ~25 minutes, so that&#x27;s 25&#x2F;(31 * 24 * 60) = .056% of our bill, roughly 11 cents.<p>It sounds like you just don&#x27;t pay for the time the service didn&#x27;t work, which isn&#x27;t much of a guarantee, that&#x27;s just expected (of course you shouldn&#x27;t pay for services not provided). Most SLAs for critical services have something like under 99.99% uptime you get 10% of your bill back, under 99.5% you get 20% back, under 99% you get 50% back. (*Numbers completely made up to demonstrate the concept.)<p>Am I misreading this? Morning coffee hasn&#x27;t kicked in yet so maybe I am.</text></item></parent_chain><comment><author>btown</author><text>This doesn&#x27;t surprise me at all - SLAs are widely overrated. No SLA will cover damages incurred by lost business due to an outage. 
What you likely want is some kind of third-party insurance for downtime caused by outages out of your control - but I&#x27;m not even sure this exists.</text></comment> | <story><title>Cloudflare Network Performance Issues</title><url>https://www.cloudflarestatus.com/incidents/tx4pgxs6zxdr</url></story><parent_chain><item><author>profmonocle</author><text>Once cloudflare.com came back I decided to check out their business SLA, and it&#x27;s not very encouraging:<p>&gt; For any and each Outage Period during a monthly billing period the Company will provide as a Service Credit an amount calculated as follows: Service Credit = (Outage Period minutes * Affected Customer Ratio) ÷ Scheduled Availability minutes<p>- <a href="https:&#x2F;&#x2F;www.cloudflare.com&#x2F;business-sla&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.cloudflare.com&#x2F;business-sla&#x2F;</a><p>So assuming an outage affects 100% of your users (this one seems like it did, but that&#x27;s not clear), they only refund the time the service was offline? According to pingdom this outage lasted ~25 minutes, so that&#x27;s 25&#x2F;(31 * 24 * 60) = .056% of our bill, roughly 11 cents.<p>It sounds like you just don&#x27;t pay for the time the service didn&#x27;t work, which isn&#x27;t much of a guarantee, that&#x27;s just expected (of course you shouldn&#x27;t pay for services not provided). Most SLAs for critical services have something like under 99.99% uptime you get 10% of your bill back, under 99.5% you get 20% back, under 99% you get 50% back. (*Numbers completely made up to demonstrate the concept.)<p>Am I misreading this? 
Morning coffee hasn&#x27;t kicked in yet so maybe I am.</text></item></parent_chain><comment><author>zymhan</author><text>They reserve the best SLA for Enterprise, naturally.<p><a href="https:&#x2F;&#x2F;www.cloudflare.com&#x2F;plans&#x2F;enterprise&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.cloudflare.com&#x2F;plans&#x2F;enterprise&#x2F;</a><p>&gt; 100% uptime and 25x Enterprise SLA<p>&gt; In the rare event of downtime, Enterprise customers receive a 25x credit against the monthly fee, in proportion to the respective disruption and affected customer ratio.</text></comment> |
37,028,326 | 37,026,158 | 1 | 3 | 37,022,911 | train | <story><title>I went to 50 different dentists: almost all gave a different diagnosis (1997)</title><url>https://www.rd.com/article/how-honest-are-dentists/</url></story><parent_chain><item><author>ransom1538</author><text>&quot;The experts are great until they have a case that is unusual. &quot;<p>I was so pissed by this I created a way to find experts. People that actually studied the issue and have actually published papers on the topic. What I found was no one cared. No one wanted &quot;experts&quot; - they want a specialist that is in their network and close.<p>Example of searching for Mohs surgery.
<a href="https:&#x2F;&#x2F;www.opendoctor.io&#x2F;research&#x2F;?research_papers=mohs&amp;zip=32766&amp;search=search" rel="nofollow noreferrer">https:&#x2F;&#x2F;www.opendoctor.io&#x2F;research&#x2F;?research_papers=mohs&amp;zip...</a></text></item><item><author>basisword</author><text>&gt;&gt; I’m not a conspiracy nut. I believe in listening to experts (but ultimately making an informed decision). I believe in modern medicine. But that experience shook me and forever changed my trust in the dental industry.<p>Ask anybody who’s experienced a “chronic” illness about the “experts” and they’ll tell you a tale or two. The experts are great until they have a case that is unusual. They don’t have the time or knowledge to treat you properly. You get passed from “expert” to “expert” each time having your hopes dashed. You start feeling like a conspiracy nut chatting with other patients online sharing what’s anecdotally helped. After running into this issue more than once I’ve lost all blind trust in medical experts. I’ll verify what they tell me as best I can and get second opinions if necessary. In one case I was passed up the chain of experts until I finally found the right one myself after a year, and it still blows my mind that this wasn’t the first referral. The system is at the same time incredible and awful.</text></item><item><author>Waterluvian</author><text>My son when 3 had a fall and a few teeth were bent. Went to our local dentist who mostly had a wait and see opinion. But then calls a day later and says they’ve decided they should just come out. Two top front teeth. Would have no top front teeth for years.<p>I went into engineer mode and while I acknowledged I didn’t have domain expertise, I asked questions and probed the whole situation. Very unsatisfactory, meandering answers.<p>This was a deeply distressing experience. 
For the first time ever I did the “call in a personal favour” thing and asked my dad to reach out to a family friend, a former cosmetic dentist and former head of the province’s dental association, for a second opinion.<p>He saw my son a few hours later and he was just <i>livid</i> about the diagnosis. That it was possible they’d have to come out but it’s impossible to know this for at least a few more weeks or more.<p>In a few months the teeth returned 100% to normal and firmed right up as the ligaments healed.<p>I’m not a conspiracy nut. I believe in listening to experts (but ultimately making an informed decision). I believe in modern medicine. But that experience shook me and forever changed my trust in the dental industry.<p>My feeling is that the nature of dentistry leaves a lot of room for subjectivity and COVID left a lot of dental chairs empty.</text></item></parent_chain><comment><author>susiecambria</author><text>I&#x27;ve often been struck dumb by the passive nature of so many people when facing health problems. My dad had multiple myeloma and was satisfied with local hosp. Nope. Made him go to Yale and the Dana Farber. He got the best care possible.<p>I struggled with back pain and was told to go to a neurologist, and informed my GP where I would and wouldn&#x27;t go. The doc I was referred to was a no-go since I know him personally and he&#x27;s a lying piece of shit in his personal life, I&#x27;m not trusting him with my back! Docs could NOT understand. &quot;But he&#x27;s so nice.&quot; Screw that.<p>Time and again, family and friends settle. I understand the insurance component, but even then, can you not find better?<p>And thank you for your work. 
It&#x27;s very cool.</text></comment> | <story><title>I went to 50 different dentists: almost all gave a different diagnosis (1997)</title><url>https://www.rd.com/article/how-honest-are-dentists/</url></story><parent_chain><item><author>ransom1538</author><text>&quot;The experts are great until they have a case that is unusual. &quot;<p>I was so pissed by this I created a way to find experts. People that actually studied the issue and have actually published papers on the topic. What I found was no one cared. No one wanted &quot;experts&quot; - they want a specialist that is in their network and close.<p>Example of searching for Mohs surgery.
<a href="https:&#x2F;&#x2F;www.opendoctor.io&#x2F;research&#x2F;?research_papers=mohs&amp;zip=32766&amp;search=search" rel="nofollow noreferrer">https:&#x2F;&#x2F;www.opendoctor.io&#x2F;research&#x2F;?research_papers=mohs&amp;zip...</a></text></item><item><author>basisword</author><text>&gt;&gt; I’m not a conspiracy nut. I believe in listening to experts (but ultimately making an informed decision). I believe in modern medicine. But that experience shook me and forever changed my trust in the dental industry.<p>Ask anybody who’s experienced a “chronic” illness about the “experts” and they’ll tell you a tale or two. The experts are great until they have a case that is unusual. They don’t have the time or knowledge to treat you properly. You get passed from “expert” to “expert” each time having your hopes dashed. You start feeling like a conspiracy nut chatting with other patients online sharing what’s anecdotally helped. After running into this issue more than once I’ve lost all blind trust in medical experts. I’ll verify what they tell me as best I can and get second opinions if necessary. In one case I was passed up the chain of experts until I finally found the right one myself after a year, and it still blows my mind that this wasn’t the first referral. The system is at the same time incredible and awful.</text></item><item><author>Waterluvian</author><text>My son when 3 had a fall and a few teeth were bent. Went to our local dentist who mostly had a wait and see opinion. But then calls a day later and says they’ve decided they should just come out. Two top front teeth. Would have no top front teeth for years.<p>I went into engineer mode and while I acknowledged I didn’t have domain expertise, I asked questions and probed the whole situation. Very unsatisfactory, meandering answers.<p>This was a deeply distressing experience. 
For the first time ever I did the “call in a personal favour” thing and asked my dad to reach out to a family friend, a former cosmetic dentist and former head of the province’s dental association, for a second opinion.<p>He saw my son a few hours later and he was just <i>livid</i> about the diagnosis. That it was possible they’d have to come out but it’s impossible to know this for at least a few more weeks or more.<p>In a few months the teeth returned 100% to normal and firmed right up as the ligaments healed.<p>I’m not a conspiracy nut. I believe in listening to experts (but ultimately making an informed decision). I believe in modern medicine. But that experience shook me and forever changed my trust in the dental industry.<p>My feeling is that the nature of dentistry leaves a lot of room for subjectivity and COVID left a lot of dental chairs empty.</text></item></parent_chain><comment><author>atlas_hugged</author><text>Wow this is awesome. You should post this as its own Show HN thingy.</text></comment>
33,356,085 | 33,354,072 | 1 | 3 | 33,286,697 | train | <story><title>Engineers should do customer support and success</title><url>https://notoriousplg.substack.com/p/nplg-102022-engineering-customer</url></story><parent_chain><item><author>paxys</author><text>Every engineering-minded founder&#x2F;CEO will come up with this idea once in their career, get badly burned, and quickly realize why roles like customer success exist.<p>Having developers answer support tickets and follow up on leads is a terrible use of their time (especially considering how much you are paying them) and they will be terrible at it. In the worst case they will piss off customers and actually harm your business.<p>Imagine if the situation was flipped, and your CEO went &quot;sales reps need to understand engineering complexity, so from now on they will each have to fix 10 bugs a month.&quot;<p>If there are problems with bug &amp; feature prioritization, customer satisfaction or the product feedback loop, focus on fixing them the right way (and involving the right roles) instead of taking shortcuts.</text></item></parent_chain><comment><author>Beltalowda</author><text>I&#x27;ve done plenty of direct customer support as an engineer, and I think everyone won because of it.<p>This was usually on some more difficult issues, or issues that involved an actual bug. Why bother going through a support person if I can just talk to the customer myself and explain the issue better, or ask the right questions straight away? Seems to me that&#x27;s not a waste of time, but a time <i>saver</i> for everyone.<p>Same with feature requests; a support person can fob a person off with &quot;thank you for your feedback&quot; and create an internal issue with no context, or I can talk to the customer myself, get a good idea of what problems they&#x27;re running into, and discuss with them how to best fix it. Sometimes these are almost trivial fixes or changes, but a support person doesn&#x27;t always realize that. 
In general keeping an eye on the support inbox gives me a better idea of how customers are using our service and what kind of things they&#x27;re running into.<p>The amount of time&#x2F;money spent on this is minimal, if it even exists at all, especially considering this is the sort of thing you can do a bit later in the day when you&#x27;re not producing the sharpest code in the first place (I usually don&#x27;t anyway). Even as a mostly backend engineer I&#x27;ve gotten a lot of value out of it, and I think our customers and the business overall have as well.<p>That said, <i>forcing</i> developers who don&#x27;t want to do this is probably a bad idea: they will do a shoddy half-job, be unhappy, and no one wins.</text></comment> | <story><title>Engineers should do customer support and success</title><url>https://notoriousplg.substack.com/p/nplg-102022-engineering-customer</url></story><parent_chain><item><author>paxys</author><text>Every engineering-minded founder&#x2F;CEO will come up with this idea once in their career, get badly burned, and quickly realize why roles like customer success exist.<p>Having developers answer support tickets and follow up on leads is a terrible use of their time (especially considering how much you are paying them) and they will be terrible at it. 
In the worst case they will piss off customers and actually harm your business.<p>Imagine if the situation was flipped, and your CEO went &quot;sales reps need to understand engineering complexity, so from now on they will each have to fix 10 bugs a month.&quot;<p>If there are problems with bug &amp; feature prioritization, customer satisfaction or the product feedback loop, focus on fixing them the right way (and involving the right roles) instead of taking shortcuts.</text></item></parent_chain><comment><author>trap_goes_hot</author><text>&quot;One test is worth a thousand opinions&quot;<p>I worked at an SMB where everyone in the company had to perform some customer-facing duties every month. No, we weren&#x27;t terrible at it, and it helped all of us gain an appreciation of a different point of view. No customer was pissed off (in fact some were impressed when they realized they were talking to senior management), and while the company is no longer in business (for various other reasons) there was no noticeable harm caused by this.</text></comment>
29,362,976 | 29,363,035 | 1 | 3 | 29,360,119 | train | <story><title>Ask HN: Software Engineer hitting 40: what&#x27;s next?</title><text>I&#x27;ve been working in software engineering for 18 years. I worked mostly as an individual contributor (now as a Senior Staff Engineer); I was also an Engineering Manager for a couple of years. Now I am interviewing after a few years at the company, and I am hit by harsh reality. For context, I am in Europe, not in the US.<p>I like technologies and programming, and I want to further improve my skills in designing and developing reliable and maintainable distributed systems, and make better technical decisions. Also, I want to keep learning and playing with new techs. I am now interviewing for roles like Staff &#x2F; Principal Engineer. My expectations for these roles are that while staying hands-on, say for 30%, I will primarily use more of my skills in architecture, engineering, and communications to focus on large, important pieces of functionality, technical decisions with big impact, etc. I expect that I would report to a Director or VP level manager, so that I could be exposed to the big picture, and collaborate with and learn from a professional who operates on a strategic level.<p>In reality, I am now interviewing for Staff &#x2F; Principal roles and see a few problems that make me rethink my career plans. First, the definition for most of those positions looks like Senior Engineer with a few more years of experience: so you are limited to the scope of a single team, report to an Engineering manager, just be a worker at a feature conveyor, just be faster, mentor young workers, maybe get some devops skills. I feel limited in impact in such roles; my borders and career are defined by Engineering Managers, who are usually less experienced in engineering and leadership topics than I am. The work is also very repetitive, and there is not much meaningful progression, no next level. 
I think those titles are created to cover problems caused by diluted Senior titles: an illusory career progression candy for ICs with some salary increase.<p>I saw a few Staff &#x2F; Principal roles that put a very high bar on technical expertise, when only 3-4 percent of all engineers have such levels, and again usually limited to a lot of coding and a single team scope. They usually have a long, exhaustive interview process.<p>An important problem with Staff+ IC roles is that there is a low salary limit as well, and you will face much more competition for top roles. Mostly, salaries top out at the level of a director of engineering. It is typical for a company to have 10 directors, but only 1-2 ICs with similar compensation.<p>I want to work hard, and see meaningful progression: in salary, in impact, in respect.<p>I would like to ask for advice. I believe there are quite a lot of 35+ engineers here that faced similar problems and made some decisions for their careers. Now I am thinking of switching to an EM track or to Technical Product Management. Thank you!</text></story><parent_chain><item><author>asdfman123</author><text>I make like $320k, my wife makes $70k. Our rent in the bay is $2800&#x2F;mo and we spend about $4000&#x2F;mo on other expenses.<p>I just moved to the bay from the enterprise world, so my net worth is 400k. At this rate, assuming very conservative raises and 6% on all investments -- including already reserved stock units -- I&#x27;ll be worth $2.2m in five years (age 40). $7.5m by 50.<p>And that&#x27;s just assuming I hit $400k-ish in the next few years and stay there for the rest of my life, and my wife gets no significant raises. If I get higher, which is entirely possible, I&#x27;ll make even more. Or maybe at some point I&#x27;ll try for a startup and go for the really big money. Maybe not.<p>Of course, many of those assumptions might not work out. 
But the kind of exponential advantage of saving a large fraction of your paycheck eventually overcomes any disadvantages. Worst case scenario is tech goes bust and US stock market turns into Japan&#x27;s in the 80s, but in that case I&#x27;ll still be better off than most because I have a lot of savings and I&#x27;m not addicted to the paycheck.<p>I think the plan will be to do this for a few more years and consult part time. Full retirement would be too boring.</text></item><item><author>jstx1</author><text>You don&#x27;t just get a few million through work. Salaries aren&#x27;t high enough to save that aggressively.</text></item><item><author>trulyme</author><text>When you say &quot;you can&#x27;t&quot;, I assume you mean retire officially? I imagine that having a few million euros in the bank (or in different assets) would mean that work is optional.</text></item><item><author>bjornsing</author><text>Sounds great, if you can retire at 50. In the EU you can’t (or at least not in Sweden where I am), so you have to think about how sustainable this approach is past 50… and to me it doesn’t really feel sustainable.</text></item><item><author>leet_thow</author><text>I&#x27;m 42 and have stopped paying attention to titles and all the traditional organizational paradigms that are losing relevance.<p>I feel like the ability to work from home in my sweats on simple problems as a senior engineer and receive a 75th percentile income relative to my neighbors in one of the best neighborhoods in my new home state is the most societal progress I will ever experience in my lifetime. I&#x27;m a lifelong bachelor by choice. Why bother striving for anything career wise when I am on track to retire comfortably to focus on my mostly free hobbies no later than the age of 50? 
For a house with a 3rd bedroom I don&#x27;t need?<p>No, best to appreciate what I have and leave the striving for the next generation of engineers.</text></item></parent_chain><comment><author>spiderice</author><text>This is the most out of touch thing I’ve read on HN in a while. When did we start talking about whether couples bringing in $400k&#x2F;year could retire at age 50. You are clearly an outlier, even when compared to software engineers in the states, let alone Europe.</text></comment> | <story><title>Ask HN: Software Engineer hitting 40: what's next?</title><text>I&#x27;ve been working in software engineering for 18 years. I worked mostly as individual contributor (now as a Senior Staff Engineer), also I was an Engineering Manager for couple years. Now I am interviewing after a few years at the company, and I am hit by harsh reality. For the context, I am in Europe, not in the US.<p>I like technologies and programming, I want to further improve my skills in designing and developing reliable and maintainable distributed system, make better technical decisions. Also, I want to keep learning and playing with new techs. I am now interviewing for the roles like Staff &#x2F; Principal Engineer, My expectations for the roles like Staff &#x2F; Principal Engineer are that while staying hands-on, say for 30%, I will primarily use more my skills in architecture, engineering, and communications to focus on large, important pieces of functionality, technical decisions with big impact, etc. I expect that I would report to a Director or VP level manager, so that I could be exposed to a big picture, collaborate with and learn from a professional who operated on strategic level.<p>In reality, I am now interviewing for Staff &#x2F; Principal roles and see a few problems that make me rethink my carrier plans. 
First, the definion for the most of those positions looks Senior Engineers with a few more years of experience: so you are limited to the scope of a single team scope, report to an Engineering manager, just be a worker at a feature conveyor, just be faster, mentor young workers, maybe get some devops skill. I feel limited in impact in such roles, my borders and carrier are defined by Engineer Managers, who are usually less experienced in engineering and leadership topics than I am. The work is also very repetitive, there is not much meaningful progression, next level. I think those titles are created to cover problems caused by diluted Senior titles: an illusional career progression candy for ICs with some salary increase.<p>I saw a few Staff &#x2F; Principal roles that put a very high bar on technical expertise, when only 3-4 percent of all the engineers have such levels, and again usually limited to a lot of coding and a single team scope. They usually have long exhaustive interview process.<p>An important problem with Staff+ IC roles is that there is a low salary limit as well, and you will face much more competition for top roles. Mostly salaries top at the level of a director of engineering. It is typical for a company to have 10 directors, but only 1-2 IC with a similar compensation.<p>I want to work hard, and see meaningful progression: in salary, in impact, in respect.<p>I would like to ask for advice. I believe there are qute a lot 35+ engineers here that faced similar problems and made some decisions for their careers. Now I think to plan switching to a EM track or to Technical Product management. Thank you!</text></story><parent_chain><item><author>asdfman123</author><text>I make like $320k, my wife makes $70k. Our rent in the bay is $2800&#x2F;mo and we spend about $4000&#x2F;mo on other expenses.<p>I just moved to the bay from the enterprise world, so my net worth is 400k. 
At this rate, assuming very conservative raises and 6% on all investments -- including already reserved stock units -- I&#x27;ll be worth $2.2m in five years (age 40). $7.5m by 50.<p>And that&#x27;s just assuming I hit $400k-ish in the next few years and stay there for the rest of my life, and my wife gets no significant raises. If I get higher, which is entirely possible, I&#x27;ll make even more. Or maybe at some point I&#x27;ll try for a startup and go for the really big money. Maybe not.<p>Of course, many of those assumptions could not work out. But the kind of exponential advantage of saving a large fraction of your paycheck eventually overcomes any disadvantages. Worst case scenario is tech goes bust and US stock market turns into Japan&#x27;s in the 80s, but in that case I&#x27;ll still be better off than most because I have a lot of savings and I&#x27;m not addicted to the paycheck.<p>I think the plan will be to do this for a few more years and consult part time. Full retirement would be too boring.</text></item><item><author>jstx1</author><text>You don&#x27;t just get a few million through work. Salaries aren&#x27;t high enough to save that aggressively.</text></item><item><author>trulyme</author><text>When you say &quot;you can&#x27;t&quot;, I assume you mean retire officially? I imagine that having a few million euros in the bank (or in different assets) would mean that work is optional.</text></item><item><author>bjornsing</author><text>Sounds great, if you can retire at 50. 
In the EU you can’t (or at least not in Sweden where I am), so you have to think about how sustainable this approach is past 50… and to me it doesn’t really feel sustainable.</text></item><item><author>leet_thow</author><text>I&#x27;m 42 and have stopped paying attention to titles and all the traditional organizational paradigms that are losing relevance.<p>I feel like the ability to work from home in my sweats on simple problems as a senior engineer and receive a 75th percentile income relative to my neighbors in one of the best neighborhoods in my new home state is the most societal progress I will ever experience in my lifetime. I&#x27;m a lifelong bachelor by choice. Why bother striving for anything career wise when I am on track to retire comfortably to focus on my mostly free hobbies no later than the age of 50? For a house with a 3rd bedroom I don&#x27;t need?<p>No, best to appreciate what I have and leave the striving for the next generation of engineers.</text></item></parent_chain><comment><author>lumost</author><text>At present trajectory this is a reasonable outcome, in the case of a major correction saving your money will also mean you benefited from the bubble.<p>There used to be stories from the .com bubble of people who only knew a little html getting six figures. On being told this around ‘08 everyone would nod and think that it was ridiculous - after all fresh college grad backend engineers could be hired for just 30k at the time on the low end in the US.<p>Flash forward to today and we’d think a front end engineer or designer with technical knowledge for those rates would be a steal even after adjusting for inflation.<p>I don’t doubt techs impact, or that the big tech should be the most valuable companies in the world. But I would be surprised if software engineer became the highest paid profession on a permanent basis, which with current salary progressions is a given without a correction.</text></comment> |
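asdfman123's projection above is plain compound growth on a lump sum plus yearly contributions. A rough back-of-envelope sketch follows; note the ~$200k/yr savings figure is a guess for illustration, not a number stated in the comment, and this ignores raises, RSU vesting, and taxes:

```python
def future_value(principal: float, annual_saving: float, rate: float, years: int) -> float:
    """Future value of a starting lump sum plus level end-of-year contributions,
    compounded once per year at `rate`."""
    total = principal
    for _ in range(years):
        total = total * (1 + rate) + annual_saving
    return total

# $400k starting net worth, 6% nominal returns, guessed ~$200k/yr saved:
five_year = future_value(400_000, 200_000, 0.06, 5)
```

Whether the result lands near the $2.2m the commenter cites depends almost entirely on the savings assumption, which is why the figure is left as a parameter rather than asserted.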
6,284,080 | 6,283,873 | 1 | 3 | 6,283,309 | train | <story><title>What is algebra?</title><url>http://profkeithdevlin.org/2011/11/20/what-is-algebra/</url></story><parent_chain><item><author>electrograv</author><text><i>&gt; ... numbers in general, not particular numbers. And the human brain is not naturally suited to think at that level of abstraction.</i><p>This is so wrong (at least as a stereotype).<p>All throughout my early education I <i>HATED</i> arithmetic, and found almost everything about it mind-numbingly boring and repulsively repetitive. At that point in my life, I hated math. The moment I encountered algebra though, it was &quot;love at first sight&quot;, and ever since I&#x27;ve absolutely been fascinated and engaged with every type of high-level math I encounter (the more abstract, the better). And not just &quot;fascinated&quot; in the &quot;I like it&quot; sense -- math, CS, etc. is more easy&#x2F;natural to me than most humanities subjects, by far.<p>So although I can only speak for myself, I quite disagree with any claim that the brain isn&#x27;t naturally suited to abstract thinking. While I know not all people think the way I do, certainly quite a few do.</text></item></parent_chain><comment><author>B-Con</author><text>His area of expertise is mathematical teaching&#x2F;learning. He was undoubtedly talking about the average person, I doubt it was supposed to be an absolute neurological-level statement. And as a general statement&#x2F;stereotype, there are expected to be exceptions. I doubt he got to be a Stanford math professor without seeing some gifted math students himself. His statement can still stand, despite yourself as a counterexample.<p>A little more about him: His CourseEra course is &quot;Introduction to Mathematical Thinking&quot;. It isn&#x27;t about math, it&#x27;s about how to think mathematically. He commonly talks about the pitfalls people make with basic mathematical approaches. 
He works with helping them understand approaches to math and how to deal with thinking abstractly and purely logically. Some people pick up all that stuff implicitly with little effort, some people never really master it. Given his position, I think he sees a pretty raw view of the average person&#x27;s <i>approach</i> toward math.</text></comment> | <story><title>What is algebra?</title><url>http://profkeithdevlin.org/2011/11/20/what-is-algebra/</url></story><parent_chain><item><author>electrograv</author><text><i>&gt; ... numbers in general, not particular numbers. And the human brain is not naturally suited to think at that level of abstraction.</i><p>This is so wrong (at least as a stereotype).<p>All throughout my early education I <i>HATED</i> arithmetic, and found almost everything about it mind-numbingly boring and repulsively repetitive. At that point in my life, I hated math. The moment I encountered algebra though, it was &quot;love at first sight&quot;, and ever since I&#x27;ve absolutely been fascinated and engaged with every type of high-level math I encounter (the more abstract, the better). And not just &quot;fascinated&quot; in the &quot;I like it&quot; sense -- math, CS, etc. is more easy&#x2F;natural to me than most humanities subjects, by far.<p>So although I can only speak for myself, I quite disagree with any claim that the brain isn&#x27;t naturally suited to abstract thinking. While I know not all people think the way I do, certainly quite a few do.</text></item></parent_chain><comment><author>samatman</author><text>The difference between categorical and analytical modes of thought is not always clear to categorical thinkers, but is usually well-known to the analytical. 
I&#x27;m heavily in the analytical camp, went straight from high-school level algebra to the theory of formal systems and Godel numbering, and didn&#x27;t &#x27;click&#x27; with my scholastic maths education until linear algebra.<p>Linear algebra was exactly and precisely when I realized that &quot;algebra&quot; was a special case of algebras, which was what I needed to contextualize it. Before that it was a bunch of wasted rote effort.</text></comment> |
40,693,664 | 40,693,723 | 1 | 3 | 40,693,500 | train | <story><title>SQLite is likely used more than all other database engines combined</title><url>https://sqlite.org/mostdeployed.html</url></story><parent_chain></parent_chain><comment><author>paradox460</author><text>In the past, I&#x27;ve used it as a file format for an online application. Users often wanted to download bits and bobs of data from the app, not to view it or edit it on their desktop, but more as backups or to share a particular configuration
Previously we&#x27;d used a json+zip pair, but the lack of enforced schema became a problem. Switching over to a gzipped sqlite db with a &quot;custom&quot; file extension worked incredibly well</text></comment> | <story><title>SQLite is likely used more than all other database engines combined</title><url>https://sqlite.org/mostdeployed.html</url></story><parent_chain></parent_chain><comment><author>qazxcvbnmlp</author><text>SQLite solves a lot of problems well enough that you can focus on other things.<p>Have an mvp you’re not sure you’ll ever have more that 2 users - yep.<p>Storing a little data for an application on the disk and don’t want to write your own schema.<p>Want to teach someone how databases work without setting up a sever - sure.</text></comment> |
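The pattern paradox460 describes — a schema-enforced SQLite database, gzipped and served under a custom file extension — can be sketched roughly as below. The `.myapp` extension, the `settings` table, and the helper names are all invented for illustration; the point is that SQLite's schema replaces the ad-hoc JSON validation:

```python
import gzip
import os
import sqlite3
import tempfile

def save_config(path: str, settings: dict) -> None:
    """Write settings to a gzipped SQLite file (hypothetical '.myapp' format)."""
    fd, tmp = tempfile.mkstemp(suffix=".db")
    os.close(fd)
    try:
        con = sqlite3.connect(tmp)
        # The schema is enforced on write, unlike a free-form JSON blob.
        con.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT NOT NULL)")
        con.executemany("INSERT INTO settings VALUES (?, ?)", settings.items())
        con.commit()
        con.close()
        with open(tmp, "rb") as src, gzip.open(path, "wb") as dst:
            dst.write(src.read())
    finally:
        os.remove(tmp)

def load_config(path: str) -> dict:
    """Decompress to a temp file, open it as SQLite, and read the rows back."""
    fd, tmp = tempfile.mkstemp(suffix=".db")
    os.close(fd)
    try:
        with gzip.open(path, "rb") as src, open(tmp, "wb") as dst:
            dst.write(src.read())
        con = sqlite3.connect(tmp)
        rows = con.execute("SELECT key, value FROM settings").fetchall()
        con.close()
        return dict(rows)
    finally:
        os.remove(tmp)
```

A nice side effect of this design is that the "download" users get is also a queryable backup: any stock SQLite tool can inspect it after gunzipping.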
28,471,230 | 28,469,966 | 1 | 3 | 28,469,520 | train | <story><title>Ministry of Freedom – GNU+Linux laptops with Libreboot preinstalled</title><url>https://minifree.org</url></story><parent_chain></parent_chain><comment><author>marcodiego</author><text>The girl who runs minifree has had many financial troubles while trying to keep it.<p>I strongly recommend people buying products from people who are willing to make sacrifices to offer a product that respects your freedom.<p>If we do not support people like her, we assume the future risk of having zero costumer really owned devices.<p>Whenever you plan to buy a device and care about not being spied and having control over your owned device, please consider supporting vendors listed here: <a href="https:&#x2F;&#x2F;ryf.fsf.org&#x2F;" rel="nofollow">https:&#x2F;&#x2F;ryf.fsf.org&#x2F;</a></text></comment> | <story><title>Ministry of Freedom – GNU+Linux laptops with Libreboot preinstalled</title><url>https://minifree.org</url></story><parent_chain></parent_chain><comment><author>dmos62</author><text>I&#x27;m hopeful that open processors like RISC will be a big step in solving this. But, then there will still be all that other blob-y, closed hardware like SSDs, network cards, radios. In my humble opinion, there&#x27;s something wrong with everyone having to use hardware (and software to a slightly lesser extent) that&#x27;s not auditable and not patchable (by you). There should be a legislative framework for consumer protection.</text></comment> |
13,944,215 | 13,943,536 | 1 | 3 | 13,940,412 | train | <story><title>Modern JavaScript for Ancient Web Developers</title><url>https://trackchanges.postlight.com/modern-javascript-for-ancient-web-developers-58e7cae050f9#.2tpky4xfc</url></story><parent_chain><item><author>Waterluvian</author><text>The trick to being successful with JavaScript is to relax and allow yourself to slightly sink into your office chair as a gelatinous blob of developer.<p>When you feel yourself getting all rigid and tense in the muscles, say, because you read an article about how you&#x27;re doing it wrong or that your favourite libraries are dead-ends, just take a deep breath and patiently allow yourself to return to your gelatinous form.<p>Now I know what you&#x27;re thinking, &quot;that&#x27;s good and all, but I&#x27;ll just slowly become an obsolete blob of goo in an over-priced, surprisingly uncomfortable, but good looking office chair. I like money, but at my company they don&#x27;t pay the non-performing goo-balls.&quot; Which is an understandable concern, but before we address it, notice how your butt no-longer feels half sore, half numb when in goo form, and how nice that kind of is. Ever wonder what that third lever under your chair does? Now&#x27;s a perfect time to find out!<p>As long as you accept that you&#x27;re always going to be doing it wrong, that there&#x27;s always a newer library, and that your code will never scale infinitely on the first try, you&#x27;ll find that you can succeed and remain gelatinous. Pick a stack then put on the blinders until its time to refactor&#x2F;rebuild for the next order of magnitude of scaling, or the next project.</text></item></parent_chain><comment><author>Joeri</author><text>In my experience using last year&#x27;s darling framework instead of this year&#x27;s is an excellent way to go. 
The codebase is stable and full-featured, the plugin ecosystem is rich, the documentation is extensive, and stackoverflow is already filled with accepted answers.<p>It&#x27;s a myth that new languages and frameworks offer better productivity. Mostly it&#x27;s just the flavor that&#x27;s different, not the calories. Sometimes you get a genuine improvement, like react, but the vast majority of frameworks are completely optional when it comes to getting stuff done. I&#x27;m always reminded of this when I observe the Delphi team at work. Still lapping any web developer when it comes to shipping features.</text></comment> | <story><title>Modern JavaScript for Ancient Web Developers</title><url>https://trackchanges.postlight.com/modern-javascript-for-ancient-web-developers-58e7cae050f9#.2tpky4xfc</url></story><parent_chain><item><author>Waterluvian</author><text>The trick to being successful with JavaScript is to relax and allow yourself to slightly sink into your office chair as a gelatinous blob of developer.<p>When you feel yourself getting all rigid and tense in the muscles, say, because you read an article about how you&#x27;re doing it wrong or that your favourite libraries are dead-ends, just take a deep breath and patiently allow yourself to return to your gelatinous form.<p>Now I know what you&#x27;re thinking, &quot;that&#x27;s good and all, but I&#x27;ll just slowly become an obsolete blob of goo in an over-priced, surprisingly uncomfortable, but good looking office chair. I like money, but at my company they don&#x27;t pay the non-performing goo-balls.&quot; Which is an understandable concern, but before we address it, notice how your butt no-longer feels half sore, half numb when in goo form, and how nice that kind of is. Ever wonder what that third lever under your chair does? 
Now&#x27;s a perfect time to find out!<p>As long as you accept that you&#x27;re always going to be doing it wrong, that there&#x27;s always a newer library, and that your code will never scale infinitely on the first try, you&#x27;ll find that you can succeed and remain gelatinous. Pick a stack then put on the blinders until its time to refactor&#x2F;rebuild for the next order of magnitude of scaling, or the next project.</text></item></parent_chain><comment><author>barking</author><text>I am a vb6 developer and I approve this message.</text></comment> |
17,271,693 | 17,271,703 | 1 | 3 | 17,270,749 | train | <story><title>Slip Coaches: Back When British Express Trains Detached Passenger Cars at Speed</title><url>https://99percentinvisible.org/article/slip-coaches-back-when-british-express-trains-detached-passenger-cars-at-speed/</url></story><parent_chain><item><author>5DFractalTetris</author><text>...But could you match a platform to a slipcoach&#x27;s velocity?</text></item></parent_chain><comment><author>cup-of-tea</author><text>I remember reading a proposal for a system where a high speed train would travel non-stop for the entire length of its route. In halfway towns a shuttle would be sent out to match the train&#x27;s speed, interface with it to exchange passengers then detach to return to the halfway town platform.<p>It sounds unbelievable, but remember it was once considered insane to consider building a railway between Liverpool and Manchester.</text></comment> | <story><title>Slip Coaches: Back When British Express Trains Detached Passenger Cars at Speed</title><url>https://99percentinvisible.org/article/slip-coaches-back-when-british-express-trains-detached-passenger-cars-at-speed/</url></story><parent_chain><item><author>5DFractalTetris</author><text>...But could you match a platform to a slipcoach&#x27;s velocity?</text></item></parent_chain><comment><author>DoreenMichele</author><text>Your question makes me think of how some airplanes refuel in flight, which blows my mind.</text></comment> |
9,194,072 | 9,192,516 | 1 | 3 | 9,191,007 | train | <story><title>FCC Releases Open Internet Order</title><url>http://www.fcc.gov/document/fcc-releases-open-internet-order</url></story><parent_chain><item><author>smutticus</author><text>The opinions of the dissenting FCC commissioners are exactly what you expect. They claim the FCC is engaging in over reach.<p>I&#x27;m curious to hear more about the &#x27;15 &quot;Broadband Subscriber Access Services&quot; pages&#x27; you mention. Where is this mentioned?<p>You might want to check out a recent talk @ NANOG from John Yoo.
<a href="https://www.youtube.com/watch?v=dVJV1gWYPX8" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=dVJV1gWYPX8</a><p>IMO the questions are more interesting than the actual preso.<p>What we got with this FCC ruling is an arrangement between the ISPs and the edge. Or you might call them the pipes and the content. It&#x27;s striking to me how little discussions of the last-mile were framing the debate, or how little consumer choice was taken into consideration. According to Wheeler, my interests as a consumer end at me not having my traffic biased. When in fact, most consumers are more interested in having choice in last-mile providers, or just paying less for their Internet connectivity. In short, I&#x27;m bummed we didn&#x27;t get unbundling in the last-mile.<p>The pipes got a monopoly. Content got the right to offer services without having to pay the pipes for the privilege. Google got the right to access utility poles. Google is the interesting hybrid with a history in content, but quickly moving into pipes.<p>None of these interests really care if you or I have choice in our last-mile ISP, or how much we pay. So Comcast, TWC and Verizon get handed a monopoly by the FCC. Americans shouldn&#x27;t have to clamor and beg Google to deliver fiber to their homes.<p>Also, what happens once Google is everywhere and they turn tyrannical? Do we need to have yet another last-mile provider invest billions to deliver service to our homes? How many Internet wires do I need running into my house to get competition? Imagine how stupid it would seem if I had multiple water pipes running into my house, or electricity wires.<p>The FCC missed an opportunity to reframe the debate away from net neutrality and towards monopoly in the last mile. If I had choice in last-mile providers net neutralty wouldn&#x27;t be an issue. 
If my provider treated me like crap I could simply choose another one.</text></item><item><author>rektide</author><text>Are the dissenting opinions available?<p>The FCC delayed this release, declaring that they were obligated to respond to dissenters and that they were going to hold off releasing until dissenters had written up their opinions and they could prepare a rebuttal. (This strikes me as a surprisingly sensible, all-cards-on-the-table governance model, rather than an ongoing media-frenzy feeding blow-by-blow release.)<p>It pains me greatly knowing that Google worked hard to get a piece of this document dropped. Alas due to a lack of transparency and openness among the FCC and Google&#x27;s discussion, we&#x27;ll never have more than faint rumormongering to do here, but in tandem to &quot;Broadband Internet Access Services&quot; it seemed like there was to be another leg of openness, &quot;Broadband Subscriber Access Services.&quot; Google not only got to know what was being proposed but got to get it dropped before the public had any idea what was under consideration here.
<a href="http://www.theregister.co.uk/2015/02/26/net_neutrality_rules/?mt=1426176998774" rel="nofollow">http:&#x2F;&#x2F;www.theregister.co.uk&#x2F;2015&#x2F;02&#x2F;26&#x2F;net_neutrality_rules...</a><p>To me it comes off as a denigration of the public by the FCC and Google, and the scant evidence about is hard to interpret as anything but subterfuge and sabotage done at the very last minute. A more transparent process would have been appreciated- rather than this &quot;Government as a Service&quot; model of building regulations and applying them, I&#x27;d like to have seen the FCC use some transparency and openness in their creation of these regulations to bootstrap regulations better meeting the public mandate.<p>I expect most registered, official dissent is going to focus on the FCC doing too much, and that no one is going to dissent saying these protections fail to address core points. But gee do I want to know what was in the 15 &quot;Broadband Subscriber Access Services&quot; pages the FCC dropped at the last minute, and why Google lobbied to drop them.</text></item></parent_chain><comment><author>rektide</author><text><i>What we got with this FCC ruling is an arrangement between the ISPs and the edge. Or you might call them the pipes and the content. It&#x27;s striking to me how little discussions of the last-mile were framing the debate, or how little consumer choice was taken into consideration. According to Wheeler, my interests as a consumer end at me not having my traffic biased. When in fact, most consumers are more interested in having choice in last-mile providers, or just paying less for their Internet connectivity. In short, I&#x27;m bummed we didn&#x27;t get unbundling in the last-mile.</i><p>I agree 100%. 
I was agape that Wheeler at least twice used his old company NABU as an opening shtick- a story about trying to create a competitive cable provider, but not having unbundling to pull it off; for him to transition into regulation instead of competition was comedically dissonant.<p>As for actually undoing bundling, it&#x27;s the courts. They decided that given that fiber investments were going to be costly, and given the threat that build-outs might not happen, the scales somehow tipped to allow them to overturn (on purely economic merits) the FCC&#x27;s rules on local loop unbundling.<p><i>Thus, we determine that, particularly in light of a competitive landscape in which competitive LECs are leading the deployment of FTTH, removing incumbent LEC unbundling obligations on FTTH loops will promote their deployment of the network infrastructure necessary to provide broadband services to the mass market.</i><p><a href="https://apps.fcc.gov/edocs_public/attachmatch/FCC-03-36A1.pdf" rel="nofollow">https:&#x2F;&#x2F;apps.fcc.gov&#x2F;edocs_public&#x2F;attachmatch&#x2F;FCC-03-36A1.pd...</a> , paragraph 278.<p>Truly one of the worst, least well-structured Court rulings to have occurred. And worse, here we are almost 15 years in, with fiber no longer rolling out, and there&#x27;s no structure or hope to revise this court edict made at a particular time balancing particular economic factors to demand a reassesment- the FCC&#x27;s Triennial Review.<p>On the bright side, I&#x27;m very happy that on the same day that the FCC announced Open Access approval, they also announced they were going to move to block states and municipalities from legally obstructing community and municipal competitors.</text></comment> | <story><title>FCC Releases Open Internet Order</title><url>http://www.fcc.gov/document/fcc-releases-open-internet-order</url></story><parent_chain><item><author>smutticus</author><text>The opinions of the dissenting FCC commissioners are exactly what you expect. 
They claim the FCC is engaging in over reach.<p>I&#x27;m curious to hear more about the &#x27;15 &quot;Broadband Subscriber Access Services&quot; pages&#x27; you mention. Where is this mentioned?<p>You might want to check out a recent talk @ NANOG from John Yoo.
<a href="https://www.youtube.com/watch?v=dVJV1gWYPX8" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=dVJV1gWYPX8</a><p>IMO the questions are more interesting than the actual preso.<p>What we got with this FCC ruling is an arrangement between the ISPs and the edge. Or you might call them the pipes and the content. It&#x27;s striking to me how little discussions of the last-mile were framing the debate, or how little consumer choice was taken into consideration. According to Wheeler, my interests as a consumer end at me not having my traffic biased. When in fact, most consumers are more interested in having choice in last-mile providers, or just paying less for their Internet connectivity. In short, I&#x27;m bummed we didn&#x27;t get unbundling in the last-mile.<p>The pipes got a monopoly. Content got the right to offer services without having to pay the pipes for the privilege. Google got the right to access utility poles. Google is the interesting hybrid with a history in content, but quickly moving into pipes.<p>None of these interests really care if you or I have choice in our last-mile ISP, or how much we pay. So Comcast, TWC and Verizon get handed a monopoly by the FCC. Americans shouldn&#x27;t have to clamor and beg Google to deliver fiber to their homes.<p>Also, what happens once Google is everywhere and they turn tyrannical? Do we need to have yet another last-mile provider invest billions to deliver service to our homes? How many Internet wires do I need running into my house to get competition? Imagine how stupid it would seem if I had multiple water pipes running into my house, or electricity wires.<p>The FCC missed an opportunity to reframe the debate away from net neutrality and towards monopoly in the last mile. If I had choice in last-mile providers net neutralty wouldn&#x27;t be an issue. 
If my provider treated me like crap I could simply choose another one.</text></item><item><author>rektide</author><text>Are the dissenting opinions available?<p>The FCC delayed this release, declaring that they were obligated to respond to dissenters and that they were going to hold off releasing until dissenters had written up their opinions and they could prepare a rebuttal. (This strikes me as a surprisingly sensible, all-cards-on-the-table governance model, rather than an ongoing media-frenzy feeding blow-by-blow release.)<p>It pains me greatly knowing that Google worked hard to get a piece of this document dropped. Alas due to a lack of transparency and openness among the FCC and Google&#x27;s discussion, we&#x27;ll never have more than faint rumormongering to do here, but in tandem to &quot;Broadband Internet Access Services&quot; it seemed like there was to be another leg of openness, &quot;Broadband Subscriber Access Services.&quot; Google not only got to know what was being proposed but got to get it dropped before the public had any idea what was under consideration here.
<a href="http://www.theregister.co.uk/2015/02/26/net_neutrality_rules/?mt=1426176998774" rel="nofollow">http:&#x2F;&#x2F;www.theregister.co.uk&#x2F;2015&#x2F;02&#x2F;26&#x2F;net_neutrality_rules...</a><p>To me it comes off as a denigration of the public by the FCC and Google, and the scant evidence about is hard to interpret as anything but subterfuge and sabotage done at the very last minute. A more transparent process would have been appreciated- rather than this &quot;Government as a Service&quot; model of building regulations and applying them, I&#x27;d like to have seen the FCC use some transparency and openness in their creation of these regulations to bootstrap regulations better meeting the public mandate.<p>I expect most registered, official dissent is going to focus on the FCC doing too much, and that no one is going to dissent saying these protections fail to address core points. But gee do I want to know what was in the 15 &quot;Broadband Subscriber Access Services&quot; pages the FCC dropped at the last minute, and why Google lobbied to drop them.</text></item></parent_chain><comment><author>supergeek133</author><text>The general response I&#x27;ve seen to that is they&#x27;re encouraging more last-mile providers to be developed from Title II opening up the equipment poles&#x2F;lines.<p>Basically, making it easier for the Google Fiber types to do their thing.<p>But then they have to live under the same rules as the big guys, and that was one of the rebuttal points. Valid or not.</text></comment> |
35,146,537 | 35,146,584 | 1 | 2 | 35,146,081 | train | <story><title>High-Throughput Generative Inference of Large Language Models with a Single GPU</title><url>https://arxiv.org/abs/2303.06865</url></story><parent_chain><item><author>TaylorAlexander</author><text>I genuinely believe LLMs are not enough on their own to generate the most interesting and humanlike interactions, but it is clear to me that we are on this path.<p>I was watching Westworld the TV series recently, and it made me think of what it would be like to be in a VR space with a bunch of characters that all had their own back story and personalities, which were sufficiently humanlike that it was fun to talk to them for long periods of time. I feel like we will see this in maybe ten years, and it&#x27;s going to be pretty cool! I mean I am sure we will see attempts at this sooner but I think it will take a while to get all the pieces working so that a conversational AI really feels human. Maybe 5 years, who knows.</text></item><item><author>georgehill</author><text>&gt; FlexGen lowers the resource requirements of running
175B-scale models down to a single 16GB GPU and reaches a generation throughput of 1 token&#x2F;s with an effective batch size
of 144.<p>I can&#x27;t imagine what will be happening in LLM space next year this time. Maybe LLM natively integrated into games and browsers.</text></item></parent_chain><comment><author>noduerme</author><text>If something like GPT-4 can batch the language tokens in with multi modal tokens (e.g. the character LLM gets a low-res &quot;video&quot; feed of you, the player, from the character&#x27;s perspective)... you could have it a lot sooner.</text></comment> | <story><title>High-Throughput Generative Inference of Large Language Models with a Single GPU</title><url>https://arxiv.org/abs/2303.06865</url></story><parent_chain><item><author>TaylorAlexander</author><text>I genuinely believe LLMs are not enough on their own to generate the most interesting and humanlike interactions, but it is clear to me that we are on this path.<p>I was watching Westworld the TV series recently, and it made me think of what it would be like to be in a VR space with a bunch of characters that all had their own back story and personalities, which were sufficiently humanlike that it was fun to talk to them for long periods of time. I feel like we will see this in maybe ten years, and it&#x27;s going to be pretty cool! I mean I am sure we will see attempts at this sooner but I think it will take a while to get all the pieces working so that a conversational AI really feels human. Maybe 5 years, who knows.</text></item><item><author>georgehill</author><text>&gt; FlexGen lowers the resource requirements of running
175B-scale models down to a single 16GB GPU and reaches a generation throughput of 1 token&#x2F;s with an effective batch size
of 144.<p>I can&#x27;t imagine what will be happening in LLM space next year this time. Maybe LLM natively integrated into games and browsers.</text></item></parent_chain><comment><author>sebastianconcpt</author><text>Some kind of return to the old days were Sierra games interaction was based on text input?</text></comment> |
24,403,008 | 24,402,904 | 1 | 2 | 24,402,419 | train | <story><title>Optimize Onboarding</title><url>https://staysaasy.com/management/2020/08/28/Optimize-Onboarding.html</url></story><parent_chain></parent_chain><comment><author>pathseeker</author><text>&gt;It takes roughly 2 weeks to form a habit; it takes roughly two weeks to get comfortable in a new environment. A common mistake is to treat a new report’s first couple weeks like college orientation - social, light hearted, get-to-know-you stuff. If your report spends the first two weeks reading C# documentation and having lunch out on the town with the team, guess what, they’ve just normalized that behavior as what the role is.<p>Humans are not dogs. I&#x27;ve worked at companies with both styles of on-boarding (two weeks of doing nothing vs jumping right in). The output in a month was realistically no different.</text></comment> | <story><title>Optimize Onboarding</title><url>https://staysaasy.com/management/2020/08/28/Optimize-Onboarding.html</url></story><parent_chain></parent_chain><comment><author>interrupt_</author><text>I once got told by a manager that they didn&#x27;t expect new hires to make big contributions in the first year. That was after I complained I was feeling very unproductive and wanted some help to speed things up. I quit after a few months because the slowness of everything around me was making me depressed.<p>This was at a top5 website company.</text></comment> |
19,435,055 | 19,434,945 | 1 | 2 | 19,420,212 | train | <story><title>Pixelfed – An alternative to centralized image sharing platforms</title><url>https://pixelfed.social/site/about</url></story><parent_chain><item><author>jedberg</author><text>I clicked the link and... have no idea what this is. It looks like instagram maybe? I can&#x27;t even figure out how to figure out what it is. The only clickable thing I see is &quot;@admin&quot; which just takes me to what looks like an Instagram page. It doesn&#x27;t even mention fediverse on that page, and even if it did, you&#x27;d have to know what the fediverse is.<p>This feels like a huge barrier to adoption for them beyond the nerdiest of nerds.<p>I like Instagram because I get to see what my friends are up to. How will I get my friends on this?</text></item></parent_chain><comment><author>ocdtrekkie</author><text>Most people who might use a service like this won&#x27;t know and maybe won&#x27;t care what the fediverse is and what other services it&#x27;s connected to. The goal should be for that to simply be the underlying standard everything else is built on, regardless of which site&#x2F;client you use to connect to people.<p>But yeah, this is more or less federated Instagram, but the federation is still a work-in-progress, AFAIK.</text></comment> | <story><title>Pixelfed – An alternative to centralized image sharing platforms</title><url>https://pixelfed.social/site/about</url></story><parent_chain><item><author>jedberg</author><text>I clicked the link and... have no idea what this is. It looks like instagram maybe? I can&#x27;t even figure out how to figure out what it is. The only clickable thing I see is &quot;@admin&quot; which just takes me to what looks like an Instagram page. 
It doesn&#x27;t even mention fediverse on that page, and even if it did, you&#x27;d have to know what the fediverse is.<p>This feels like a huge barrier to adoption for them beyond the nerdiest of nerds.<p>I like Instagram because I get to see what my friends are up to. How will I get my friends on this?</text></item></parent_chain><comment><author>deft</author><text>It is an instagram knockoff. I think they are avoiding the fediverse mention on purpose in order to get users by just marketing it as an alternative without any special magic. This about page as a landing page is really confusing but the homepage doesn&#x27;t do much better... Also registration is closed so this is a really strange time to be advertising</text></comment> |
4,672,645 | 4,672,594 | 1 | 2 | 4,672,167 | train | <story><title>Apple says no Java for you, removes plugin from browsers on OS X 10.7 and up</title><url>http://www.engadget.com/2012/10/18/apple-removes-java-from-osx/</url></story><parent_chain><item><author>borlak</author><text>Since I was the lone developer using windows at my workplace, I decided to give Apple/OSX a fair shake. The fact OSX was unix based seemed awesome to me (some of my projects are in C).<p>Then began the hell of trying to do anything "unixy" on OSX. Custom libraries just for the mac (custom libev??), needing to install XCode Dev Tools to get gcc compiler(???). Sigh.<p>In the end I just ended up running a CentOS VM in virtualbox, and I do my development there. The ONLY benefit I currently gain from a Mac is creating iOS apps.</text></item><item><author>rsync</author><text>I switched from FreeBSD as my desktop to OSX about four years ago because:<p>a) I needed a unix based desktop<p>b) I needed end user components (browser, printing, etc.) that just worked<p>Circa 2008, everything in Safari "just worked" - every site, every player, every piece of embedded bullshit on every little web 2.0 site blah blah blah.<p>But now I need to manually futz around with flash every two weeks to keep it working ... and now java as well ?<p>I think OSX still has an edge for me, in terms of getting things done, due to printing and ... ? It's getting awfully close to even, though.</text></item></parent_chain><comment><author>fauigerzigerk</author><text>OS X is indeed a rather nasty Unix platform right now because Apple in the midst of a long drawn out transition away from GCC. They are shipping a completely outdated GCC and an obsolete libstdc++, yet most libraries are still compiled with that ancient libstdc++.<p>I tried to use LLVM's libc++ instead but I had to recompile so many libraries and make sure the dynamic linker actually picks them up that it was just too big of a hassle. 
I don't see light at the end of that tunnel right now.<p>I'm thinking of giving Linux another try as a development platform.</text></comment> | <story><title>Apple says no Java for you, removes plugin from browsers on OS X 10.7 and up</title><url>http://www.engadget.com/2012/10/18/apple-removes-java-from-osx/</url></story><parent_chain><item><author>borlak</author><text>Since I was the lone developer using windows at my workplace, I decided to give Apple/OSX a fair shake. The fact OSX was unix based seemed awesome to me (some of my projects are in C).<p>Then began the hell of trying to do anything "unixy" on OSX. Custom libraries just for the mac (custom libev??), needing to install XCode Dev Tools to get gcc compiler(???). Sigh.<p>In the end I just ended up running a CentOS VM in virtualbox, and I do my development there. The ONLY benefit I currently gain from a Mac is creating iOS apps.</text></item><item><author>rsync</author><text>I switched from FreeBSD as my desktop to OSX about four years ago because:<p>a) I needed a unix based desktop<p>b) I needed end user components (browser, printing, etc.) that just worked<p>Circa 2008, everything in Safari "just worked" - every site, every player, every piece of embedded bullshit on every little web 2.0 site blah blah blah.<p>But now I need to manually futz around with flash every two weeks to keep it working ... and now java as well ?<p>I think OSX still has an edge for me, in terms of getting things done, due to printing and ... ? It's getting awfully close to even, though.</text></item></parent_chain><comment><author>astrodust</author><text>If you were on Windows you'd need to install Visual Studio just to get a compiler. XCode is similar. You haven't seen hell until you try and use Cygwin on Windows to get anything "unixy" done.<p>It sounds like you weren't using MacPorts or Homebrew to build your libraries. They're a lot closer to the distributions you see on Linux and BSD.</text></comment> |
14,513,754 | 14,513,362 | 1 | 3 | 14,511,627 | train | <story><title>A subway-style diagram of the major Roman roads, based on the Empire ca. 125 AD</title><url>http://sashat.me/2017/06/03/roman-roads/</url></story><parent_chain><item><author>johan_larson</author><text>It&#x27;s strange there are so many coastal routes. Shipping virtually anything by sea has been cheaper than moving it over land for a long time, and that probably includes troops. I would have expected roads to connect coastal settlements inland, not along the coast.</text></item></parent_chain><comment><author>azernik</author><text>Going off of much later European experience, this depended on the type of traffic.<p>Despite the speed and cost&#x2F;weight advantages of sea travel, the Habsburg Spanish empire maintained a land route between their possessions in Italy and the Netherlands [1]. Road travel had the advantage of reliability - more resistant to both natural disasters and to enemy action in war. This was very important for moving military forces through a large empire, which was a very important consideration for a polity like Rome which was constantly moving troops around to fight some revolt, war of expansion, or war of defense. There were also civilian applications - for small, expensive, non-time-sensitive cargoes.<p>In general, see [2] if you&#x27;re interested in the transportation network of ancient time - see what combinations of weather, transport preferences (passenger carriage vs. donkey, for example, or safer daylight-only sailing vs. more efficient all-day sailing) push traffic onto coastal roads.<p>[1] <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Spanish_Road" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Spanish_Road</a><p>[2] <a href="http:&#x2F;&#x2F;orbis.stanford.edu" rel="nofollow">http:&#x2F;&#x2F;orbis.stanford.edu</a><p>EDIT: For example, let&#x27;s take the example of travel from Aelia Capitolina (Jerusalem) to Tyrus (Tyre&#x2F;Sur). 
For a military application (travelling at military march speeds (60km&#x2F;day) but restricting yourself to daylight sailing for fear of shipwrecks) relative travel speeds depend on weather - roads are better in winter, sea in summer. If you add in transfer time and cost (finding or rendezvousing with ships, cross-loading cargo, hiring porters) then the relative transfer times and costs change yet again.<p>In general, roads were better for: shorter trips, where transfer times and costs dominate; trips involving faster means of land travel, such as military forced marches, passenger travel by carriage, or in the extreme message passing by horse relay; and trips where protection from weather and enemy action (pirate or military) was paramount. Whereas bulk cargo of relatively low value, such as the massive grain shipments from Egypt to Rome, was only practical by sea.</text></comment> | <story><title>A subway-style diagram of the major Roman roads, based on the Empire ca. 125 AD</title><url>http://sashat.me/2017/06/03/roman-roads/</url></story><parent_chain><item><author>johan_larson</author><text>It&#x27;s strange there are so many coastal routes. Shipping virtually anything by sea has been cheaper than moving it over land for a long time, and that probably includes troops. I would have expected roads to connect coastal settlements inland, not along the coast.</text></item></parent_chain><comment><author>te_chris</author><text>It took hundreds of years for sea voyages at scale across the med to become a thing, as technology improved. 
The reach of the empire in this diagram is mostly due to Roman sea supremacy, but it was still difficult, expensive and prone to piracy&#x2F;uppity city states.<p>If you&#x27;re interested I highly recommend this book: <a href="https:&#x2F;&#x2F;www.amazon.co.uk&#x2F;d&#x2F;cka&#x2F;Sea-Civilization-Maritime-History-Lincoln-P-Paine&#x2F;1782393587" rel="nofollow">https:&#x2F;&#x2F;www.amazon.co.uk&#x2F;d&#x2F;cka&#x2F;Sea-Civilization-Maritime-His...</a></text></comment> |
4,158,498 | 4,158,464 | 1 | 2 | 4,157,777 | train | <story><title>FuckItJS: Runs your javascript code whether your compiler likes it or not.</title><url>https://github.com/mattdiamond/fuckitjs/</url></story><parent_chain><item><author>jackcviers3</author><text>The software license is the best part:<p>"If you are caught in a dire situation wherein you only have enough time to save one person out of a group, and the Author is a member of that group, you must save the Author."</text></item></parent_chain><comment><author>masklinn</author><text>I don't know, I really liked the `FuckIt.moreConflict()` method, especially when I read the source and realized he actually tested it and ensured `window.location` would not get overwritten (or you'd navigate away from the page)</text></comment> | <story><title>FuckItJS: Runs your javascript code whether your compiler likes it or not.</title><url>https://github.com/mattdiamond/fuckitjs/</url></story><parent_chain><item><author>jackcviers3</author><text>The software license is the best part:<p>"If you are caught in a dire situation wherein you only have enough time to save one person out of a group, and the Author is a member of that group, you must save the Author."</text></item></parent_chain><comment><author>51Cards</author><text>Brilliant... This is going in every software license and terms agreement I write from here on out. What a tremendous insurance policy if I perhaps hit on the next Facebook. 1/7th of the planet responsible for saving lil ol' me.</text></comment> |
35,361,364 | 35,361,636 | 1 | 2 | 35,359,271 | train | <story><title>The teen mental illness epidemic is international – Part 1: The Anglosphere</title><url>https://jonathanhaidt.substack.com/p/international-mental-illness-part-one</url></story><parent_chain><item><author>grammers</author><text>The article has a point, but there&#x27;s no proof for cause &amp; effect.<p>A completely different explanation could be that our societies are so advanced by now that we can finally listen to mental illnesses and take them seriously - while in the past people just had to &#x27;function&#x27;, no matter what.</text></item></parent_chain><comment><author>cornholio</author><text>While I agree there is no statistical proof in the article, it&#x27;s a very strong hypothesis: we somehow only &#x27;listen&#x27; to the mental problems of female teens, who see geometric increases in self harm, suicide and various disorders, twice or three times the pre-social media levels. Yet other categories see linear or token increases.<p>Could it be just a coincidence that young females are exactly the demographic that is constrained by a gender role where aesthetic appeal and social interactions are the most valuable assets? 
And those are exactly the type of things that social networks exploited, monetized and massively gamified in the last decade?<p>What&#x27;s more likely: a rapid change of the cultural norms and roles associated with growing up as a woman or of those related to recognizing and treating mental health issues (all in a single decade!); OR: a purely technical revolution that put interactive screens in the hands of each kid and made the former much more effective in harming their development and leading to the latter?<p>At least for the young female demographic, I think there is a massive burden of proof for anyone claiming the epidemics <i>is not social-media induced</i>.</text></comment> | <story><title>The teen mental illness epidemic is international – Part 1: The Anglosphere</title><url>https://jonathanhaidt.substack.com/p/international-mental-illness-part-one</url></story><parent_chain><item><author>grammers</author><text>The article has a point, but there&#x27;s no proof for cause &amp; effect.<p>A completely different explanation could be that our societies are so advanced by now that we can finally listen to mental illnesses and take them seriously - while in the past people just had to &#x27;function&#x27;, no matter what.</text></item></parent_chain><comment><author>paganel</author><text>&gt; A completely different explanation could be that our societies are so advanced by now that we can finally listen to mental illnesses and take them seriously<p>I lived in the Eastern Europe of the 1980s and the 1990s, not the best of times, economically speaking. I used to play football with Rroma children that were walking strange, at least that&#x27;s how it looked to child-me, only later to find out that most probably they had been afflicted by polio in the past. All this to say that things were tough.<p>Even so, as a kid back then I had no acquaintances of my age who were harming themselves or, the worst of all, who were off-ing themselves. 
We would have known, kids used to know this sort of stuff because we were almost always outside, playing together.<p>Suffice is to say that things are now totally different. I&#x27;ve heard of kids harming themselves at 11-12 years of age and I know of a young lady who took her own life (at 16 or 17). Again, that was unimaginable 30 to 40 years ago.</text></comment> |
13,493,614 | 13,491,645 | 1 | 3 | 13,490,866 | train | <story><title>Stanford historian uncovers a grim correlation between violence and inequality</title><url>http://news.stanford.edu/2017/01/24/stanford-historian-uncovers-grim-correlation-violence-inequality-millennia/</url></story><parent_chain><item><author>jasim</author><text>Here&#x27;s another reason for unions to become defunct over time, as explained by Steve Bruce in his book Sociology: A Very Short Introduction.<p>&quot;Any kind of group activity requires organization. But as soon as one starts to organize one creates a division within the movement between the organized and the organizers, between ordinary members and officials. The latter quickly acquire knowledge and expertise that set them apart and give them power over ordinary members. The officials begin to derive personal satisfaction from their place in the organization and seek ways of consolidating it. They acquire an interest in the continued prosperity of the organization. For ordinary trade unionists, their union is just one interest in which they have a small stake. But for the paid officials the union is their employer. Preserving the organization becomes more important than helping it achieve its goals. As radical action may bring government repression, the apparatchiks moderate.&quot;</text></item><item><author>perrick</author><text>Louis Chauvel, a french sociology professor, made a similar argument using a Prey-Predatory model with trade union membership and income inequality over the last 100 years :<p>&quot;This article shows empirically how trade union membership and income inequality are mutually related in twelve countries over more than 100 years. While past research has shown that high income inequality occurs alongside low trade union membership, we show that past income inequality actually increases trade union membership with a time lag, as trade unions recruit more members after inequality has been high. 
But we also show that strengthened trade unions then fight inequality, thereby destroying what helped them to recruit new members in the past. As trade union density decreases, inequality increases and eventually re-incentivises workers to join unions again. By showing this empirically, we reconceptualise the relationship between inequality and union density as a prey and predator model, where predators eat prey – unions destroy inequality, but thereby also destroy their own basis for survival. By empirically showing that trade union density and social inequality influence each other in this way over long periods, this article contributes to a dynamic approach on how social problems and social movements interact.&quot;<p><a href="http:&#x2F;&#x2F;onlinelibrary.wiley.com&#x2F;doi&#x2F;10.1111&#x2F;kykl.12128&#x2F;full" rel="nofollow">http:&#x2F;&#x2F;onlinelibrary.wiley.com&#x2F;doi&#x2F;10.1111&#x2F;kykl.12128&#x2F;full</a></text></item></parent_chain><comment><author>208Gi</author><text>&quot;But as soon as one starts to organize one creates a division within the movement between the organized and the organizers, between ordinary members and officials. The officials begin to derive personal satisfaction from their place in the organization and seek ways of consolidating it.&quot;<p>There have always been unions that organized in a manner that avoids this situation. (The CNT, the IWW, the Zapatistas (1990s) and, more.) In fact, this act of holding on to power has always been one of the major Anarchist critiques of Marxism. And, Anarchists were very present in the unions of the early 1900s.<p>One of Capitalism&#x27;s and Authoritative Communism&#x27;s finest achievements has been burying the history of unions. 
They&#x27;ve done it so well that many current unions are unaware of alternative power structures.</text></comment> | <story><title>Stanford historian uncovers a grim correlation between violence and inequality</title><url>http://news.stanford.edu/2017/01/24/stanford-historian-uncovers-grim-correlation-violence-inequality-millennia/</url></story><parent_chain><item><author>jasim</author><text>Here&#x27;s another reason for unions to become defunct over time, as explained by Steve Bruce in his book Sociology: A Very Short Introduction.<p>&quot;Any kind of group activity requires organization. But as soon as one starts to organize one creates a division within the movement between the organized and the organizers, between ordinary members and officials. The latter quickly acquire knowledge and expertise that set them apart and give them power over ordinary members. The officials begin to derive personal satisfaction from their place in the organization and seek ways of consolidating it. They acquire an interest in the continued prosperity of the organization. For ordinary trade unionists, their union is just one interest in which they have a small stake. But for the paid officials the union is their employer. Preserving the organization becomes more important than helping it achieve its goals. As radical action may bring government repression, the apparatchiks moderate.&quot;</text></item><item><author>perrick</author><text>Louis Chauvel, a french sociology professor, made a similar argument using a Prey-Predatory model with trade union membership and income inequality over the last 100 years :<p>&quot;This article shows empirically how trade union membership and income inequality are mutually related in twelve countries over more than 100 years. 
While past research has shown that high income inequality occurs alongside low trade union membership, we show that past income inequality actually increases trade union membership with a time lag, as trade unions recruit more members after inequality has been high. But we also show that strengthened trade unions then fight inequality, thereby destroying what helped them to recruit new members in the past. As trade union density decreases, inequality increases and eventually re-incentivises workers to join unions again. By showing this empirically, we reconceptualise the relationship between inequality and union density as a prey and predator model, where predators eat prey – unions destroy inequality, but thereby also destroy their own basis for survival. By empirically showing that trade union density and social inequality influence each other in this way over long periods, this article contributes to a dynamic approach on how social problems and social movements interact.&quot;<p><a href="http:&#x2F;&#x2F;onlinelibrary.wiley.com&#x2F;doi&#x2F;10.1111&#x2F;kykl.12128&#x2F;full" rel="nofollow">http:&#x2F;&#x2F;onlinelibrary.wiley.com&#x2F;doi&#x2F;10.1111&#x2F;kykl.12128&#x2F;full</a></text></item></parent_chain><comment><author>germinalphrase</author><text>Of course, many small - and still influential - unions are actively made up of their members. For instance, teachers unions (outside of large metropolitan areas) commonly vote their presidents and other officials directly from the teaching ranks. These leaders often receive little&#x2F;no compensation for their additional duties.</text></comment> |
23,957,257 | 23,956,297 | 1 | 2 | 23,955,596 | train | <story><title>How to effectively evade the GDPR and the reach of the DPA</title><url>https://blog.zoller.lu/2020/05/how-to-effectively-evade-gdpr-and-reach.html</url></story><parent_chain><item><author>garmaine</author><text>Does RocketReach have servers in the EU? Employees? Subsidiaries?<p>I generally don’t know in this case. But in general my European friends seem to think that merely having someone from the EU access a website makes that website’s owner have a presence in the EU, even if the server that handled it isn’t. That seems like overreach to me. If that were the case, I’d block EU access for any of my domains, and I don’t think we want a future where that becomes the norm. The ideals of the Internet are free exchange of ideas and information, no country-specific walled gardens.</text></item><item><author>josefx</author><text>&gt; The EU doesn&#x27;t have such status or power over US companies.<p>US companies operating in the EU are subject to EU law. Worst case the company itself doesn&#x27;t operate in the EU, however that still leaves its customers (Intel, AirBnB, etc. ) potential targets to apply pressure on.</text></item><item><author>oarsinsync</author><text>I&#x27;m not sure how I feel about the screenshot at the end, showing that various policy makers also have their personal information being sold.<p>I guess the information is out there, and doing so also makes it definitively personal for the policy makers &#x2F; enforcers involved.<p>That said, the policy makers &#x2F; enforcers may be genuinely hamstrung. The US imposes its laws globally because of it&#x27;s status as a global reserve currency (trading in USD requires the transaction to route via the US, thus making the entity subject to US law).<p>The EU doesn&#x27;t have such status or power over US companies. 
The most it can do is try to prevent them from operating in the region.<p>As a person who almost certainly has his personal information being sold on this platform, I&#x27;m not pleased, and would love to see something done to prevent this kind of activity. Unfortunately, that depends on the US government to take action, and the last 12 years haven&#x27;t been a flying endorsement of the effectiveness of the current government system. (This is not meant as an statement regarding the effectiveness of either President, but rather a regarding the low output from the system as a whole)</text></item></parent_chain><comment><author>tomxor</author><text>&gt; The ideals of the Internet are free exchange of ideas and information, no country-specific walled gardens.<p>Your argument reduces to &quot;freedom of speech&quot; == &quot;freedom to take and distribute personal information&quot; (They are not equal).<p>Your walled gardens cherry on top only highlights the deficiencies that some countries have to protect personal information - Saying this is making the internet into walled gardens is like promoting tax evasion by using Ireland (in this case the US == Ireland, because it is deficient)</text></comment> | <story><title>How to effectively evade the GDPR and the reach of the DPA</title><url>https://blog.zoller.lu/2020/05/how-to-effectively-evade-gdpr-and-reach.html</url></story><parent_chain><item><author>garmaine</author><text>Does RocketReach have servers in the EU? Employees? Subsidiaries?<p>I generally don’t know in this case. But in general my European friends seem to think that merely having someone from the EU access a website makes that website’s owner have a presence in the EU, even if the server that handled it isn’t. That seems like overreach to me. If that were the case, I’d block EU access for any of my domains, and I don’t think we want a future where that becomes the norm. 
The ideals of the Internet are free exchange of ideas and information, no country-specific walled gardens.</text></item><item><author>josefx</author><text>&gt; The EU doesn&#x27;t have such status or power over US companies.<p>US companies operating in the EU are subject to EU law. Worst case the company itself doesn&#x27;t operate in the EU, however that still leaves its customers (Intel, AirBnB, etc. ) potential targets to apply pressure on.</text></item><item><author>oarsinsync</author><text>I&#x27;m not sure how I feel about the screenshot at the end, showing that various policy makers also have their personal information being sold.<p>I guess the information is out there, and doing so also makes it definitively personal for the policy makers &#x2F; enforcers involved.<p>That said, the policy makers &#x2F; enforcers may be genuinely hamstrung. The US imposes its laws globally because of it&#x27;s status as a global reserve currency (trading in USD requires the transaction to route via the US, thus making the entity subject to US law).<p>The EU doesn&#x27;t have such status or power over US companies. The most it can do is try to prevent them from operating in the region.<p>As a person who almost certainly has his personal information being sold on this platform, I&#x27;m not pleased, and would love to see something done to prevent this kind of activity. Unfortunately, that depends on the US government to take action, and the last 12 years haven&#x27;t been a flying endorsement of the effectiveness of the current government system. (This is not meant as a statement regarding the effectiveness of either President, but rather regarding the low output from the system as a whole)</text></item></parent_chain><comment><author>himinlomax</author><text>My understanding is that merely having a website that can be accessed from the EU may not by itself be enough to be subject to the GDPR. However collecting or processing data on EU citizens or residents certainly is.
And almost all websites track users (even when it&#x27;s not obviously useful to do so), so unless you go the USA Today route and create a site for the EU with no tracking, you have to comply.<p>There&#x27;s also the question of who they sell the data to. It&#x27;s hard to see why they would sell EU citizens&#x2F;residents data to companies who don&#x27;t have any EU presence themselves, so at least some of their customers are bound by the GDPR as far as these are concerned. Informed consent is required at every step, so for example they would need the EU subject&#x27;s consent to buy that data from RocketReach.</text></comment> |
13,397,531 | 13,394,578 | 1 | 2 | 13,392,885 | train | <story><title>VR</title><url>https://blog.ycombinator.com/vr/</url></story><parent_chain><item><author>aphextron</author><text>As someone who owns both headsets from day one and has been developing software for Vive, I&#x27;d honestly say the current generation of tech just isn&#x27;t worth it for most people. In five years when we have wireless headsets with eye tracking and full FOV displays with no discernible pixelation and the library of games are finally here it will be worth it. As it is most people would probably be let down after the initial wow factor wears off.<p>I think VR is at the point smartphones were from 2000-2007 until the iPhone showed up. It&#x27;s going to take another generation of devices that incorporate all of those features in a really well designed package before it goes mainstream.</text></item></parent_chain><comment><author>ensiferum</author><text>I completely agree with this sentiment. I too develop for both Vive and Oculus and they&#x27;re just so unfinished products that I&#x27;d never recommend anyone to buy them.<p>* Vive has huge problems with tracking devices. It needs to track 3 things and usually 1 has a problem and is not tracking properly <i>sigh</i>
* The cords are annoying
* FOV is too narrow
* picture quality is crap (low resolution)
* steam vr (as is steam itself) is low quality software
* the device is heavy on your face, uncomfortable and makes you sweat (in the face.. nasty)
* lack of compelling content that has things <i>just</i> right (i.e. doesn&#x27;t make you sick)<p>From developer point of view:
* openvr library can be confusing, documentation is lacking and it&#x27;s married to steam :(<p>I think the author is overshooting the importance of VR. The next generation hardware will undoubtedly improve much and there&#x27;s definitely potential especially in fields like visualization work and gaming too. But let&#x27;s be honest there&#x27;s a whole bunch of basic &quot;productivity&quot; apps and light user content apps (think your average phone app) that really don&#x27;t have much to gain from VR. Undoubtedly some of these will want to jump on the VR hype (once it comes) and quickly make totally horrible half-assed versions of their software for VR.</text></comment> | <story><title>VR</title><url>https://blog.ycombinator.com/vr/</url></story><parent_chain><item><author>aphextron</author><text>As someone who owns both headsets from day one and has been developing software for Vive, I&#x27;d honestly say the current generation of tech just isn&#x27;t worth it for most people. In five years when we have wireless headsets with eye tracking and full FOV displays with no discernible pixelation and the library of games are finally here it will be worth it. As it is most people would probably be let down after the initial wow factor wears off.<p>I think VR is at the point smartphones were from 2000-2007 until the iPhone showed up. It&#x27;s going to take another generation of devices that incorporate all of those features in a really well designed package before it goes mainstream.</text></item></parent_chain><comment><author>rictic</author><text>I think that the Vive is a little like the release iPhone. It has all of the pieces, but it&#x27;s also clunky and extravagant.<p>You had to have a pretty good imagination to look at any 2004 phone and envision the iPhone, but once you had the 1st gen iPhone it&#x27;s pretty easy to imagine a modern smartphone. It&#x27;s pretty much just the same thing only more so.</text></comment>
15,693,115 | 15,691,084 | 1 | 3 | 15,686,653 | train | <story><title>How Firefox Got Fast Again</title><url>https://hacks.mozilla.org/2017/11/entering-the-quantum-era-how-firefox-got-fast-again-and-where-its-going-to-get-faster/</url></story><parent_chain><item><author>Kluny</author><text>I gave Firefox a try a month or two ago when people started saying it was good again. I think I had a problem with uBlock origin not working well enough - too much crap was getting through. What are people using for adblock?</text></item><item><author>Santosh83</author><text>Indeed this is the main point, and the reason I&#x27;ve stuck with Firefox ever since it launched, despite being rather slow a few years back. This is the only browser engine that is not shaped by major corporate interests. And frankly it has no <i>major</i> downsides as compared to Chrome. The latter only enjoys the market share it does (in my opinion) because of being aggressively pushed on Google.com and being bundled everywhere.</text></item><item><author>tspike</author><text>Thanks for the reminder. I just downloaded the latest FF beta and exiled Chrome to an &quot;only when needed&quot; role.<p>All the back-and-forth about speed and features is understandable, but misses the point that Firefox needs our support if we are to have any real non-proprietary options for what is quickly becoming the base system abstraction layer for most computing.</text></item><item><author>mapgrep</author><text>I switched back to Firefox 54 from Chrome when multiprocess browsing (&quot;Electrolysis&quot;) came out of beta. It&#x27;s been absolutely great. It&#x27;s fast and I trust and like the nonprofit behind it. And all the extensions I care about are available.<p>My main issue with Chrome was the endless nags to sign in to a Google account, and just generally wanting less dependence on Google. I also like that Firefox has a built in tracking protection (not just Do Not Track toggle but actual blocking of trackers). 
That&#x27;s something that&#x27;s just not in Google&#x27;s interest to put in Chrome.<p>Browsers are becoming more and more aggressive in protecting the interests of users. Becoming true &quot;User Agents,&quot; in other words. See also Safari iOS allowing content blockers and now in iOS 11 blocking some popular tracking behaviors by default. It&#x27;s absolutely great. And it&#x27;s not surprising to me that Chrome is not a leader here. It&#x27;s owned by the biggest advertising company on the internet. I predict Chrome will continue to lag on pro-privacy, anti-nag features.</text></item></parent_chain><comment><author>mintplant</author><text>I think I know what you&#x27;re referring to, if you were using one of the unstable (beta, nightly, developer edition) channels with Firefox 57. There was a brief period I noticed during which the uBlock Origin WebExtension edition wasn&#x27;t fully up to par with the old XUL-based version, letting some ads slip through. That&#x27;s been fixed since then.</text></comment> | <story><title>How Firefox Got Fast Again</title><url>https://hacks.mozilla.org/2017/11/entering-the-quantum-era-how-firefox-got-fast-again-and-where-its-going-to-get-faster/</url></story><parent_chain><item><author>Kluny</author><text>I gave Firefox a try a month or two ago when people started saying it was good again. I think I had a problem with uBlock origin not working well enough - too much crap was getting through. What are people using for adblock?</text></item><item><author>Santosh83</author><text>Indeed this is the main point, and the reason I&#x27;ve stuck with Firefox ever since it launched, despite being rather slow a few years back. This is the only browser engine that is not shaped by major corporate interests. And frankly it has no <i>major</i> downsides as compared to Chrome. 
The latter only enjoys the market share it does (in my opinion) because of being aggressively pushed on Google.com and being bundled everywhere.</text></item><item><author>tspike</author><text>Thanks for the reminder. I just downloaded the latest FF beta and exiled Chrome to an &quot;only when needed&quot; role.<p>All the back-and-forth about speed and features is understandable, but misses the point that Firefox needs our support if we are to have any real non-proprietary options for what is quickly becoming the base system abstraction layer for most computing.</text></item><item><author>mapgrep</author><text>I switched back to Firefox 54 from Chrome when multiprocess browsing (&quot;Electrolysis&quot;) came out of beta. It&#x27;s been absolutely great. It&#x27;s fast and I trust and like the nonprofit behind it. And all the extensions I care about are available.<p>My main issue with Chrome was the endless nags to sign in to a Google account, and just generally wanting less dependence on Google. I also like that Firefox has a built in tracking protection (not just Do Not Track toggle but actual blocking of trackers). That&#x27;s something that&#x27;s just not in Google&#x27;s interest to put in Chrome.<p>Browsers are becoming more and more aggressive in protecting the interests of users. Becoming true &quot;User Agents,&quot; in other words. See also Safari iOS allowing content blockers and now in iOS 11 blocking some popular tracking behaviors by default. It&#x27;s absolutely great. And it&#x27;s not surprising to me that Chrome is not a leader here. It&#x27;s owned by the biggest advertising company on the internet. I predict Chrome will continue to lag on pro-privacy, anti-nag features.</text></item></parent_chain><comment><author>SSLy</author><text>uBlock <i>origin</i> with uMatrix</text></comment> |
15,694,369 | 15,694,211 | 1 | 2 | 15,693,906 | train | <story><title>Vim Cheat Sheet (2015)</title><url>http://vimsheet.com</url></story><parent_chain></parent_chain><comment><author>lorenzfx</author><text>To me, the most important thing about learning vim is its grammar (verbs and objects). Check out &quot;Your problem with Vim is that you don&#x27;t grok vi.&quot; [0] (one of stackoverflow&#x27;s best all time answers).<p>[0] <a href="https:&#x2F;&#x2F;stackoverflow.com&#x2F;a&#x2F;1220118&#x2F;2131903" rel="nofollow">https:&#x2F;&#x2F;stackoverflow.com&#x2F;a&#x2F;1220118&#x2F;2131903</a></text></comment> | <story><title>Vim Cheat Sheet (2015)</title><url>http://vimsheet.com</url></story><parent_chain></parent_chain><comment><author>erikbye</author><text><a href="https:&#x2F;&#x2F;i.imgur.com&#x2F;YLInLlY.png" rel="nofollow">https:&#x2F;&#x2F;i.imgur.com&#x2F;YLInLlY.png</a><p>EDIT: Much higher res.<p>Hang it on your (cubicle) wall.</text></comment> |
8,216,441 | 8,216,130 | 1 | 3 | 8,215,787 | train | <story><title>On bananas and string matching algorithms</title><url>http://www.wabbo.org/blog/2014/22aug_on_bananas.html</url></story><parent_chain><item><author>StefanKarpinski</author><text>Really interesting post and good debugging work. A couple of take-aways:<p>1. This is one reason it&#x27;s a good idea to use signed ints for lengths even though they can never be negative. Signed 64-bit ints have plenty of range for any array you&#x27;re actually going to encounter. It may also be evidence that it&#x27;s a good idea for mixed signed&#x2F;unsigned arithmetic to produce signed results rather than unsigned: signed tends to be value-correct for &quot;small&quot; values (less than 2^63), including negative results; unsigned sacrifices value-correctness on all negative values to be correct for <i>very</i> large values, which is a less common case – here it will never happen since there just aren&#x27;t strings that large.<p>2. If you&#x27;re going to use a fancy algorithm like two-way search, you really ought to have a lot of test cases, especially ones that exercise corner cases of the algorithm. 100% coverage of all non-error code paths would be ideal.</text></item></parent_chain><comment><author>haberman</author><text>I&#x27;m not sure I agree with #1. While it&#x27;s true that using signed integers for lengths would have solved this problem, a negative length doesn&#x27;t generally make sense or have any defined meaning. So if you have any logic that is doing case analysis on lengths, you basically have no reasonable action in the case of a negative length: all you can do is complain that the length got corrupted at some point.<p>For example, imagine that instead of testing whether the strings differed by more than 20, you were just testing whether the needle was &gt;= 20 in length:<p><pre><code> impl Searcher {
fn new(haystack: &amp;[u8], needle: &amp;[u8]) -&gt; Searcher {
match needle.len() {
0..20 =&gt; Naive(NaiveSearcher::New),
20..uint::MAX =&gt; TwoWay(TwoWaySearcher::New(needle)),
_ =&gt; fail!(&quot;Negative length, shouldn&#x27;t happen!&quot;)
}
}
}
</code></pre>
Part of the feature of an unsigned type IMO is that it cannot have an out-of-domain value by definition.</text></comment> | <story><title>On bananas and string matching algorithms</title><url>http://www.wabbo.org/blog/2014/22aug_on_bananas.html</url></story><parent_chain><item><author>StefanKarpinski</author><text>Really interesting post and good debugging work. A couple of take-aways:<p>1. This is one reason it&#x27;s a good idea to use signed ints for lengths even though they can never be negative. Signed 64-bit ints have plenty of range for any array you&#x27;re actually going to encounter. It may also be evidence that it&#x27;s a good idea for mixed signed&#x2F;unsigned arithmetic to produce signed results rather than unsigned: signed tends to be value-correct for &quot;small&quot; values (less than 2^63), including negative results; unsigned sacrifices value-correctness on all negative values to be correct for <i>very</i> large values, which is a less common case – here it will never happen since there just aren&#x27;t strings that large.<p>2. If you&#x27;re going to use a fancy algorithm like two-way search, you really ought to have a lot of test cases, especially ones that exercise corner cases of the algorithm. 100% coverage of all non-error code paths would be ideal.</text></item></parent_chain><comment><author>QuantumChaos</author><text>Regarding 2., I&#x27;m going to assume they have at least some unit tests.<p>It&#x27;s not clear that even 100% coverage would have picked up this bug, since it doesn&#x27;t automatically appear as a result of taking a particular code path.<p>I think that comparing to a naive algorithm on a large number of strings, or using a fuzzing tool, might be helpful, but this might also slow down the test suite excessively.<p>I find unit tests are a bit like good form at the gym. Whenever someone gets injured, they think back to the last exercise they did and find some imperfection in their form. 
They then attribute their injury to not using perfect form. Similarly, whenever a bug appears, there is a tendency to find a unit test that would have caught the bug, and therefore attribute it to insufficient testing.<p>The real value of testing (and good form) can only be demonstrated by having a criterion for good testing and seeing if following this reduces the number of bugs.</text></comment>
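The unsigned-underflow failure mode discussed in this thread can be sketched in a few lines of modern Rust. This is an illustrative example with made-up lengths, not the actual two-way search code from the standard library:

```rust
fn main() {
    // Made-up lengths: a 3-byte needle and a 10-byte haystack.
    let needle_len: u64 = 3;
    let haystack_len: u64 = 10;

    // Intended check: "is the needle more than 20 bytes longer than the
    // haystack?" In unsigned arithmetic 3 - 20 cannot go negative; it wraps
    // to a huge value (a debug build would panic, so wrapping_sub makes the
    // wrap explicit), and the comparison is wrongly true.
    let unsigned_check = needle_len.wrapping_sub(20) > haystack_len;

    // The same comparison in signed arithmetic is value-correct for all
    // "small" lengths (below 2^63): -17 > 10 is false.
    let signed_check = (needle_len as i64) - 20 > (haystack_len as i64);

    assert!(unsigned_check); // wrapped comparison takes the wrong branch
    assert!(!signed_check);  // signed comparison behaves as intended
    println!("unsigned: {}, signed: {}", unsigned_check, signed_check);
}
```

This is the same shape of bug as in the post: a subtraction of two lengths that "cannot" be negative silently wraps, and a later comparison takes the wrong branch.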
36,341,109 | 36,341,251 | 1 | 3 | 36,338,529 | train | <story><title>Effective Rust (2021)</title><url>https://www.lurklurk.org/effective-rust/</url></story><parent_chain><item><author>softirq</author><text>&gt; Beauty is a useless concept in a programing language.<p>Maybe not aesthetic beauty, but readability in many contexts matters more than performance. Code that isn&#x27;t readable hides bugs and can&#x27;t be maintained (FYI Rust doesn&#x27;t stop you from writing bugs into your code). A language like C or Go where you can fit most of the syntax and mechanisms into your head are simply easier to read and reason about than a language where you need a PhD in type theory to understand one signature.<p>&gt; If it is not memory safe, does not support static typing with algebraic data types, and does not have null safety it does not meet those minimum requirements and is not suitable for use.<p>You&#x27;d better stop interacting with technology then, because the vast majority of it is still running something compiled from C. We&#x27;re talking about control systems, life saving technology, technology that&#x27;s running in outer space.</text></item><item><author>brigadier132</author><text>&gt; I have to say that I find Rust just as ugly and cumbersome as C++<p>Ok, so you don&#x27;t like it aesthetically. Do you have problems writing unmaintainable software in it? What about incorrect software? Is there any specific feature in the language that you object to that will cause confusion and lead to people writing bugs?<p>C is a very beautiful and &quot;simple&quot; language, people also write a lot of security vulnerabilities with it.<p>Beauty is a useless concept in a programing language. Most of the time it just relates to someone&#x27;s bias towards what they are familiar with. 
A lot of people find Python &quot;beautiful&quot;, I&#x27;m unfortunately very familiar with python and as I&#x27;ve become more and more familiar with it I find it uglier and uglier.<p>Same with C, I remember all the many hours I&#x27;ve spent in valgrind debugging problems other people have made for me and I find it ugly too.<p>When I look at a programming language, I don&#x27;t think about aesthetic &quot;beauty&quot; or even &quot;simplicity&quot;.<p>I think, does this programming language allow me to represent the concepts I want to represent accurately and correctly?<p>If it is not memory safe, does not support static typing with algebraic data types, and does not have null safety it does not meet those minimum requirements and is not suitable for use.<p>Edit: I want to add, it&#x27;s not just accuracy and correctness that are important. Performance is very important too and many functional languages absolutely flounder because of strict immutability and the adoption of patterns that have terrible memory access patterns.</text></item><item><author>softirq</author><text>After writing primarily no standard library C for 15 years, I have to say that I find Rust just as ugly and cumbersome as C++ (not debating its safety guarantees). It seems like languages that add sufficiently advanced type&#x2F;macros systems always spiral into unwieldy beasts where you spend a bunch of your time arguing with the type systems and writing blog posts about some new piece of type theory to solve a problem you would have never have had with C. People just get greedier with deriving code and wanting more &quot;magic&quot; until every program only makes sense to its author.<p>I don&#x27;t think I will ever like kitchen sink languages. Experience has taught me that the most effective tool is the simplest one, which for most use cases today would be Go. 
For systems programming I just shudder to think how convoluted and hard to read things will become when we take already extremely complex code written in the simplest terms in C and port it to Rust.</text></item></parent_chain><comment><author>satvikpendem</author><text>&gt; <i>FYI Rust doesn&#x27;t stop you from writing bugs into your code</i><p>You are committing the perfect solution fallacy [0]. Rust won&#x27;t make you have no bugs in your code but don&#x27;t conflate some number of bugs with a reduced number of bugs, a reduction is still a meaningful outcome, otherwise we&#x27;d still be writing in assembly.<p>&gt; <i>We&#x27;re talking about control systems, life saving technology, technology that&#x27;s running in outer space.</i><p>Indeed, that&#x27;s why we <i>should</i> use more memory safe languages, because we are dealing with critical systems. Just because they were written in C does not mean that they should continue to be. It&#x27;s like digging a hole with a stick, and when someone suggests a shovel or backhoe, you mention that all previous holes were made with a stick. There is no relationship between the previous work and what should be done in the future.<p>[0] <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Nirvana_fallacy#Perfect_solution_fallacy" rel="nofollow noreferrer">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Nirvana_fallacy#Perfect_soluti...</a></text></comment> | <story><title>Effective Rust (2021)</title><url>https://www.lurklurk.org/effective-rust/</url></story><parent_chain><item><author>softirq</author><text>&gt; Beauty is a useless concept in a programing language.<p>Maybe not aesthetic beauty, but readability in many contexts matters more than performance. Code that isn&#x27;t readable hides bugs and can&#x27;t be maintained (FYI Rust doesn&#x27;t stop you from writing bugs into your code). 
A language like C or Go where you can fit most of the syntax and mechanisms into your head are simply easier to read and reason about than a language where you need a PhD in type theory to understand one signature.<p>&gt; If it is not memory safe, does not support static typing with algebraic data types, and does not have null safety it does not meet those minimum requirements and is not suitable for use.<p>You&#x27;d better stop interacting with technology then, because the vast majority of it is still running something compiled from C. We&#x27;re talking about control systems, life saving technology, technology that&#x27;s running in outer space.</text></item><item><author>brigadier132</author><text>&gt; I have to say that I find Rust just as ugly and cumbersome as C++<p>Ok, so you don&#x27;t like it aesthetically. Do you have problems writing unmaintainable software in it? What about incorrect software? Is there any specific feature in the language that you object to that will cause confusion and lead to people writing bugs?<p>C is a very beautiful and &quot;simple&quot; language, people also write a lot of security vulnerabilities with it.<p>Beauty is a useless concept in a programing language. Most of the time it just relates to someone&#x27;s bias towards what they are familiar with. 
A lot of people find Python &quot;beautiful&quot;, I&#x27;m unfortunately very familiar with python and as I&#x27;ve become more and more familiar with it I find it uglier and uglier.<p>Same with C, I remember all the many hours I&#x27;ve spent in valgrind debugging problems other people have made for me and I find it ugly too.<p>When I look at a programming language, I don&#x27;t think about aesthetic &quot;beauty&quot; or even &quot;simplicity&quot;.<p>I think, does this programming language allow me to represent the concepts I want to represent accurately and correctly?<p>If it is not memory safe, does not support static typing with algebraic data types, and does not have null safety it does not meet those minimum requirements and is not suitable for use.<p>Edit: I want to add, it&#x27;s not just accuracy and correctness that are important. Performance is very important too and many functional languages absolutely flounder because of strict immutability and the adoption of patterns that have terrible memory access patterns.</text></item><item><author>softirq</author><text>After writing primarily no standard library C for 15 years, I have to say that I find Rust just as ugly and cumbersome as C++ (not debating its safety guarantees). It seems like languages that add sufficiently advanced type&#x2F;macros systems always spiral into unwieldy beasts where you spend a bunch of your time arguing with the type systems and writing blog posts about some new piece of type theory to solve a problem you would have never have had with C. People just get greedier with deriving code and wanting more &quot;magic&quot; until every program only makes sense to its author.<p>I don&#x27;t think I will ever like kitchen sink languages. Experience has taught me that the most effective tool is the simplest one, which for most use cases today would be Go. 
For systems programming I just shudder to think how convoluted and hard to read things will become when we take already extremely complex code written in the simplest terms in C and port it to Rust.</text></item></parent_chain><comment><author>plandis</author><text>C is not what I would call a very readable language. I don’t think you could drop someone who newly learned C and have them be effective looking at glibc for example.</text></comment> |
33,134,881 | 33,134,107 | 1 | 2 | 33,134,059 | train | <story><title>Show HN: Reflame – Deploy your React web apps in milliseconds</title><url>https://reflame.app/?source=show-hn-launch</url><text>Hi HN! I&#x27;ve been working on Reflame since I quit my job at Brex last year, excited to finally open it up for everybody to try out! Here&#x27;s a demo: <a href="https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=SohUnrjiIxk" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=SohUnrjiIxk</a><p>Reflame deploys client-rendered React web apps instantly, to previews and to production.<p>In concrete wall-clock terms, deploys generally take:<p>- ~50-500ms from our VSCode extension<p>- ~500-3000ms from our GitHub app<p>(Jump to this comment (<a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134082" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134082</a>) for what makes Reflame so fast)<p>The Reflame GitHub App automatically deploys default branches to production, and other branches to previews. If you&#x27;ve used Netlify&#x2F;Vercel&#x27;s GitHub apps, you should feel right at home. The difference is it’s multiple orders of magnitudes faster. Fast enough that <i>you&#x27;ll probably never see an in-progress deploy on GitHub ever again</i>, only ready-to-go preview&#x2F;production links.<p>No more having to babysit builds or having to context switch to and from other tasks before being able to see our changes deployed in previews or production. Previewing, sharing, and even shipping, can now become part of the so-called inner loop, giving us the superpower to stay in flow state for much longer.<p>The Reflame VSCode extension is yet another order of magnitude faster than even the GitHub App. It was designed to offer an experience that can rival local development workflows in both speed and ergonomics, while addressing many of local dev&#x27;s limitations around collaboration and production-parity. Every time we make a change (e.g. 
by saving a file), the extension will deploy that change (in ~50-500ms) to a &quot;Live Preview&quot;, and will immediately update the app in our browsers to reflect that change.<p>Live Previews can operate in one of two modes:<p>- Development mode delivers updates through React Fast Refresh, offering the familiar state-preserving instant feedback loop we know and love from local development workflows.<p>- Production mode delivers updates by triggering a full browser reload on every change, and in exchange for this extra bit of friction, we get to develop against a byte-identical version of the fully optimized production deployment that customers will see once we ship, with a tighter feedback loop than was ever possible before.<p>Live Previews deliver updates over the internet, meaning we can effortlessly test out our changes on multiple devices simultaneously, and show our changes to anyone in the world, just by sharing a Live Preview link, all while having our updates reflected automatically across all connected devices in real-time (with live reload or React Fast Refresh <i>over the internet</i>).<p>Being able to ship quickly is valuable on its own, but Reflame&#x27;s true north star has always been to enable customers to ship quickly <i>with confidence</i>.<p>One way Reflame helps customers ship with more confidence today is by making previews with full production-parity available at every step of the development process. Previews in Reflame are accessible at the exact same URL customers will use to access the production deployment, instead of at a different subdomain for each preview (i.e. every preview is accessed through <a href="https:&#x2F;&#x2F;reflame.app" rel="nofollow">https:&#x2F;&#x2F;reflame.app</a> instead of at <a href="https:&#x2F;&#x2F;some-branch-of-reflamedotapp.reflame-previews.dev" rel="nofollow">https:&#x2F;&#x2F;some-branch-of-reflamedotapp.reflame-previews.dev</a>). 
Behind the scenes, this is implemented using session cookies that our CDN will check to determine which version of the app to serve.<p>This is only the tip of the iceberg. We have some really exciting prototypes around testing and typechecking that we&#x27;ve been exploring that could allow us to ship with even more confidence <i>without ever slowing us down</i>.<p>If any of this sounds interesting for the apps you&#x27;re building or planning to build (taking into account this comment (<a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134092" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134092</a>) below describing what Reflame is not well suited for), please sign up and give it a try!<p>I can&#x27;t wait to see what you’ll build with it! :)</text></story><parent_chain></parent_chain><comment><author>solardev</author><text>This sounds really interesting and is a problem we were trying to solve just this week with Vercel!<p>However, I am having a hard time following what it&#x27;s actually doing under the hood. Aside from a single video, might you be willing to share some more details on the product page about how it works? I can&#x27;t tell what it&#x27;s doing, and I&#x27;m not inclined to randomly sign up for something without even having a high-level understanding...<p>Like when I push to Github, I understand that Vercel pulls the repo and does a yarn install and yarn build or whatever -- the same buildchain I use locally, just on their servers. Same with CircleCI.<p>Is that also what Reflame is doing? If so, what&#x27;s it doing differently that makes it faster? If not, how does it deal with node_modules installation and are there are any caveats there, like how does it deal with resolutions, lockfiles, private repos, etc.?<p>Or am I completely misunderstanding...? Is this some sort of content diffing engine &#x2F; hot reloader instead of a faster build pipeline? 
I&#x27;m not really sure I&#x27;m following, sorry :(<p>Just trying to understand what your product actually is and how it works :)</text></comment> | <story><title>Show HN: Reflame – Deploy your React web apps in milliseconds</title><url>https://reflame.app/?source=show-hn-launch</url><text>Hi HN! I&#x27;ve been working on Reflame since I quit my job at Brex last year, excited to finally open it up for everybody to try out! Here&#x27;s a demo: <a href="https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=SohUnrjiIxk" rel="nofollow">https:&#x2F;&#x2F;www.youtube.com&#x2F;watch?v=SohUnrjiIxk</a><p>Reflame deploys client-rendered React web apps instantly, to previews and to production.<p>In concrete wall-clock terms, deploys generally take:<p>- ~50-500ms from our VSCode extension<p>- ~500-3000ms from our GitHub app<p>(Jump to this comment (<a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134082" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134082</a>) for what makes Reflame so fast)<p>The Reflame GitHub App automatically deploys default branches to production, and other branches to previews. If you&#x27;ve used Netlify&#x2F;Vercel&#x27;s GitHub apps, you should feel right at home. The difference is it’s multiple orders of magnitudes faster. Fast enough that <i>you&#x27;ll probably never see an in-progress deploy on GitHub ever again</i>, only ready-to-go preview&#x2F;production links.<p>No more having to babysit builds or having to context switch to and from other tasks before being able to see our changes deployed in previews or production. Previewing, sharing, and even shipping, can now become part of the so-called inner loop, giving us the superpower to stay in flow state for much longer.<p>The Reflame VSCode extension is yet another order of magnitude faster than even the GitHub App. 
It was designed to offer an experience that can rival local development workflows in both speed and ergonomics, while addressing many of local dev&#x27;s limitations around collaboration and production-parity. Every time we make a change (e.g. by saving a file), the extension will deploy that change (in ~50-500ms) to a &quot;Live Preview&quot;, and will immediately update the app in our browsers to reflect that change.<p>Live Previews can operate in one of two modes:<p>- Development mode delivers updates through React Fast Refresh, offering the familiar state-preserving instant feedback loop we know and love from local development workflows.<p>- Production mode delivers updates by triggering a full browser reload on every change, and in exchange for this extra bit of friction, we get to develop against a byte-identical version of the fully optimized production deployment that customers will see once we ship, with a tighter feedback loop than was ever possible before.<p>Live Previews deliver updates over the internet, meaning we can effortlessly test out our changes on multiple devices simultaneously, and show our changes to anyone in the world, just by sharing a Live Preview link, all while having our updates reflected automatically across all connected devices in real-time (with live reload or React Fast Refresh <i>over the internet</i>).<p>Being able to ship quickly is valuable on its own, but Reflame&#x27;s true north star has always been to enable customers to ship quickly <i>with confidence</i>.<p>One way Reflame helps customers ship with more confidence today is by making previews with full production-parity available at every step of the development process. Previews in Reflame are accessible at the exact same URL customers will use to access the production deployment, instead of at a different subdomain for each preview (i.e. 
every preview is accessed through <a href="https:&#x2F;&#x2F;reflame.app" rel="nofollow">https:&#x2F;&#x2F;reflame.app</a> instead of at <a href="https:&#x2F;&#x2F;some-branch-of-reflamedotapp.reflame-previews.dev" rel="nofollow">https:&#x2F;&#x2F;some-branch-of-reflamedotapp.reflame-previews.dev</a>). Behind the scenes, this is implemented using session cookies that our CDN will check to determine which version of the app to serve.<p>This is only the tip of the iceberg. We have some really exciting prototypes around testing and typechecking that we&#x27;ve been exploring that could allow us to ship with even more confidence <i>without ever slowing us down</i>.<p>If any of this sounds interesting for the apps you&#x27;re building or planning to build (taking into account this comment (<a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134092" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=33134092</a>) below describing what Reflame is not well suited for), please sign up and give it a try!<p>I can&#x27;t wait to see what you’ll build with it! :)</text></story><parent_chain></parent_chain><comment><author>lewisl9029</author><text>(posting some extras as comments since the original post is already way too long)<p>What&#x27;s the pricing going to be like?<p>Pricing is still very much a work in progress. Feedback from early customers will play a major factor here. Here&#x27;s my current thinking:<p>Reflame should be free to use for individuals deploying apps that live on the default *.reflame.dev subdomains. Deploying apps to custom domains will probably require a monthly fee, which will enable custom domains on all apps of the user (i.e. there won&#x27;t be an additional fee for having more than 1 app with custom domains). 
For organizations, there will be a simple monthly fee per member, for any number of apps with custom domains in that org.<p>I&#x27;m planning to avoid usage-based pricing for the deployment side of Reflame if at all possible, because I don’t think it leads to a good alignment of incentives. I don&#x27;t want to financially disincentivize customers from deploying as often as they want (within reasonable limits to protect against DoS), and I really don&#x27;t want to be financially disincentivized to make customers&#x27; deploys faster over time. Every usage-based CI platform out there benefits financially from our builds getting slower, so they have no financial incentive to build anything to make them faster. I believe usage-based pricing for CI and the diametrically opposed incentives it introduced have been a huge driver of stagnation in the industry.</text></comment>
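The session-cookie routing the Reflame post describes ("session cookies that our CDN will check to determine which version of the app to serve") can be sketched roughly as follows. This is a hypothetical illustration, not Reflame's actual implementation — the cookie name and version identifiers are invented:

```python
# Hypothetical sketch of cookie-based deployment routing, as described in the
# Reflame thread above. The cookie name and version ids are made up.

PRODUCTION_VERSION = "v42"  # invented id for the current production deployment
KNOWN_PREVIEWS = {"v43-branch-foo", "v44-branch-bar"}  # invented preview ids


def select_version(cookies: dict) -> str:
    """Pick which deployment of the app to serve for this request.

    If the session carries a valid preview cookie, serve that preview build;
    otherwise fall back to the production deployment. This is how a single
    URL (e.g. reflame.app) can serve both previews and production.
    """
    requested = cookies.get("preview_version")
    if requested in KNOWN_PREVIEWS:
        return requested
    return PRODUCTION_VERSION
```

A real CDN edge function would presumably also verify that the cookie was issued to an authorized collaborator before serving an unreleased preview.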
18,929,751 | 18,929,824 | 1 | 3 | 18,928,083 | train | <story><title>What’s Causing the Rise of Hoarding Disorder?</title><url>https://daily.jstor.org/whats-causing-the-rise-of-hoarding-disorder/</url></story><parent_chain><item><author>im3w1l</author><text>Smaller living space means that you can&#x27;t save the thing you will need once every five year anymore. A previously adaptive behavior becomes poorly adaptive.<p>Cheap disposable items. Where before you would have a few hand made item of high quality, you now get a lot of lower quality items. This is partially related to technological advancement. If things become obsolete fast, it doesn&#x27;t make sense to build with quality. Anyway a consequence is that it&#x27;s easier to acquire a large amount of stuff than before.<p>Anti-landfill propaganda. We are guilt-tripped for throwing stuff in the garbage. We are told to dispose of things in very complicated ways and then it may be easier to just not dispose of it.<p>Breakdown of community and family. Before we might keep things around by giving them away to relatives or friends. There is a satisfaction in passing the torch. But this option isn&#x27;t as available anymore.</text></item></parent_chain><comment><author>jdietrich</author><text><i>&gt;it&#x27;s easier to acquire a large amount of stuff than before</i><p>I suspect this is <i>the</i> factor. A behaviour that is adaptive in an environment of scarcity becomes maladaptive in an environment of abundance.<p>When my grandmother was a child, practically the only thing that genuinely counted as &quot;garbage&quot; was ash from the fire. Pretty much everything else had a meaningful value. Scraps of paper could light the fire. Scraps of cloth could make a quilt or a rug. Vegetable peelings went to the pigs and not a morsel of edible food was wasted. Packaging wasn&#x27;t a word anyone was familiar with, but boxes and tins would be saved for re-use. 
Furniture was repaired and re-repaired until it was only good for firewood.<p>A lot of people were essentially raised to be hoarders, either through direct experience or transmission of those values from their parents. That mindset isn&#x27;t irrational, but it&#x27;s a poor match for the 21st century. It&#x27;s all too easy to lose sight of the purpose of those values and hoard for the sake of hoarding.<p>I think that similar factors explain a significant part of the obesity epidemic. The scarcity-era virtues of clearing your plate and being a generous host become vices in a world of supersized portions and supermarket offers.</text></comment> | <story><title>What’s Causing the Rise of Hoarding Disorder?</title><url>https://daily.jstor.org/whats-causing-the-rise-of-hoarding-disorder/</url></story><parent_chain><item><author>im3w1l</author><text>Smaller living space means that you can&#x27;t save the thing you will need once every five year anymore. A previously adaptive behavior becomes poorly adaptive.<p>Cheap disposable items. Where before you would have a few hand made item of high quality, you now get a lot of lower quality items. This is partially related to technological advancement. If things become obsolete fast, it doesn&#x27;t make sense to build with quality. Anyway a consequence is that it&#x27;s easier to acquire a large amount of stuff than before.<p>Anti-landfill propaganda. We are guilt-tripped for throwing stuff in the garbage. We are told to dispose of things in very complicated ways and then it may be easier to just not dispose of it.<p>Breakdown of community and family. Before we might keep things around by giving them away to relatives or friends. There is a satisfaction in passing the torch. 
But this option isn&#x27;t as available anymore.</text></item></parent_chain><comment><author>eru</author><text>&gt; Where before you would have a few hand made item of high quality, you now get a lot of lower quality items.<p>Hand made items didn&#x27;t use to be of universal higher quality. Just the opposite.</text></comment> |
23,025,765 | 23,025,706 | 1 | 2 | 23,019,199 | train | <story><title>What other coronaviruses tell us about SARS-CoV-2</title><url>https://www.quantamagazine.org/what-can-other-coronaviruses-tell-us-about-sars-cov-2-20200429/</url></story><parent_chain><item><author>mfer</author><text>Strokes, kidney disease, and other problems are happening to people who get this. Looking at death rate alone doesn’t tell of all the consequences</text></item><item><author>hristov</author><text>I think everybody should carefully read the short section that says &quot;Does the amount of exposure affect the severity?&quot;.<p>Some people are looking at antibody studies that show that large parts of the population are somehow affected and then doing back of the napkin math and deciding that this is really not worse than the flu in terms of death rate. They are further encouraged by the fact that people that die tend to be older and with pre-existing conditions. Then they say we should open everything up and get everything &quot;back to normal&quot;, and sometimes they add that this is just a way to winnow out the old and the weak.<p>As problematic as the latter statement is, the above thinking is actually dangerously optimistic. Experience shows us that in specific situations with prolonged&#x2F;repeated exposure, the death rates are much higher. Also young and healthy people may get seriously sick and die due to high level of exposure.<p>This has already been happening in hospitals that did not have sufficient protective equipment&#x2F;procedures.<p>If we open everything up and get into a situation where the virus is all over the place we can get into a horrible spiral of increased exposure and increased death rates etc.</text></item></parent_chain><comment><author>nyhc99</author><text>I read that there was visible lung tissue damage forming the characteristic broken glass pattern on CT scans of 58% of the infected but asymptomatic carriers on the Diamond Princess cruise. 
That to me adds a sobering footnote to the process of developing herd immunity. Most of us won&#x27;t die, but do we walk away with long-term impairment of our lungs and possibly other organs?</text></comment> | <story><title>What other coronaviruses tell us about SARS-CoV-2</title><url>https://www.quantamagazine.org/what-can-other-coronaviruses-tell-us-about-sars-cov-2-20200429/</url></story><parent_chain><item><author>mfer</author><text>Strokes, kidney disease, and other problems are happening to people who get this. Looking at death rate alone doesn’t tell of all the consequences</text></item><item><author>hristov</author><text>I think everybody should carefully read the short section that says &quot;Does the amount of exposure affect the severity?&quot;.<p>Some people are looking at antibody studies that show that large parts of the population are somehow affected and then doing back of the napkin math and deciding that this is really not worse than the flu in terms of death rate. They are further encouraged by the fact that people that die tend to be older and with pre-existing conditions. Then they say we should open everything up and get everything &quot;back to normal&quot;, and sometimes they add that this is just a way to winnow out the old and the weak.<p>As problematic as the latter statement is, the above thinking is actually dangerously optimistic. Experience shows us that in specific situations with prolonged&#x2F;repeated exposure, the death rates are much higher. 
Also young and healthy people may get seriously sick and die due to high level of exposure.<p>This has already been happening in hospitals that did not have sufficient protective equipment&#x2F;procedures.<p>If we open everything up and get into a situation where the virus is all over the place we can get into a horrible spiral of increased exposure and increased death rates etc.</text></item></parent_chain><comment><author>nradov</author><text>And how many strokes, kidney disease, and other problems were caused by other respiratory system viruses before COVID-19? We don&#x27;t know because we&#x27;ve never really looked.</text></comment> |
17,403,094 | 17,401,837 | 1 | 3 | 17,396,444 | train | <story><title>Toward a Mature Science of Consciousness</title><url>https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5986937/</url></story><parent_chain><item><author>incadenza</author><text>I&#x27;ve always felt that Dennett has side stepped the question here. The &#x27;hard problem&#x27; here is to explain how a set of physical processes give rise to consciousness or sensory experience at all. In other words, why the lights are on.<p>As far as I can tell, consciousness is literally the one thing in the universe that can&#x27;t be an illusion. Even if, in the extreme case, we&#x27;re brains in a vat, etc.</text></item><item><author>lisper</author><text>I found that Dan Dennett&#x27;s &quot;Consciousness Explained&quot; really resonated with me. His thesis is (as best I can condense a whole book to fit in an HN comment) that consciousness is essentially an illusion produced by the brain as the best model it can construct of the sensory input it receives. But the model actually lags behind the present, and its history can be rewritten when new information comes in. Dennett describes it much better than I do, and provides a lot of experimental evidence. If nothing else, the experiments he describes will make you see your own consciousness in a new light. I recommend reading the book.</text></item></parent_chain><comment><author>jbattle</author><text>&gt; As far as I can tell, consciousness is literally the one thing in the universe that can&#x27;t be an illusion. Even if, in the extreme case, we&#x27;re brains in a vat, etc.<p>Cogito ergo sum!<p>&quot;I have convinced myself that there is absolutely nothing in the world, no sky, no earth, no minds, no bodies. Does it now follow that I too do not exist? No: if I convinced myself of something then I certainly existed. But there is a deceiver of supreme power and cunning who is deliberately and constantly deceiving me. 
In that case I too undoubtedly exist, if he is deceiving me; and let him deceive me as much as he can, he will never bring it about that I am nothing so long as I think that I am something. So after considering everything very thoroughly, I must finally conclude that this proposition, I am, I exist, is necessarily true whenever it is put forward by me or conceived in my mind.&quot;</text></comment> | <story><title>Toward a Mature Science of Consciousness</title><url>https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5986937/</url></story><parent_chain><item><author>incadenza</author><text>I&#x27;ve always felt that Dennett has side stepped the question here. The &#x27;hard problem&#x27; here is to explain how a set of physical processes give rise to consciousness or sensory experience at all. In other words, why the lights are on.<p>As far as I can tell, consciousness is literally the one thing in the universe that can&#x27;t be an illusion. Even if, in the extreme case, we&#x27;re brains in a vat, etc.</text></item><item><author>lisper</author><text>I found that Dan Dennett&#x27;s &quot;Consciousness Explained&quot; really resonated with me. His thesis is (as best I can condense a whole book to fit in an HN comment) that consciousness is essentially an illusion produced by the brain as the best model it can construct of the sensory input it receives. But the model actually lags behind the present, and its history can be rewritten when new information comes in. Dennett describes it much better than I do, and provides a lot of experimental evidence. If nothing else, the experiments he describes will make you see your own consciousness in a new light. I recommend reading the book.</text></item></parent_chain><comment><author>lisper</author><text>Like I said, trying to condense it down to an HN comment imposes some very severe restrictions. The word &quot;illusion&quot; is not really quite right. 
A better description might be (and these are my words, not Dennett&#x27;s) that consciousness is in its own ontological category [1].<p>[1] <a href="http:&#x2F;&#x2F;blog.rongarret.info&#x2F;2015&#x2F;02&#x2F;31-flavors-of-ontology.html" rel="nofollow">http:&#x2F;&#x2F;blog.rongarret.info&#x2F;2015&#x2F;02&#x2F;31-flavors-of-ontology.ht...</a></text></comment> |
39,645,464 | 39,644,473 | 1 | 3 | 39,641,322 | train | <story><title>Build Initramfs Rootless</title><url>https://blog.izissise.net/posts/initramfs/</url></story><parent_chain></parent_chain><comment><author>bradfa</author><text>If you want devtmpfs mounted automatically by the kernel within an initramfs, then you may need to patch your kernel slightly, like: <a href="https:&#x2F;&#x2F;lore.kernel.org&#x2F;lkml&#x2F;[email protected]&#x2F;" rel="nofollow">https:&#x2F;&#x2F;lore.kernel.org&#x2F;lkml&#x2F;25e7e777-19f9-6280-b456-6c9c782...</a><p>By default the kernel will not mount devtmpfs automatically within an initramfs, even if you&#x27;ve configured your kernel to automatically mount devtmpfs, because it waits until the real rootfs has been mounted to do it. This is fine unless you never plan to actually mount a real rootfs as you&#x27;ve made your initramfs have everything a normal rootfs would have and you want devtmpfs to Just Work.</text></comment> | <story><title>Build Initramfs Rootless</title><url>https://blog.izissise.net/posts/initramfs/</url></story><parent_chain></parent_chain><comment><author>lrvick</author><text>Here is a tiny containerized and deterministic Linux build process with a minimal init system in rust for secure enclaves:<p><a href="https:&#x2F;&#x2F;git.distrust.co&#x2F;public&#x2F;enclaveos&#x2F;src&#x2F;branch&#x2F;master&#x2F;Containerfile" rel="nofollow">https:&#x2F;&#x2F;git.distrust.co&#x2F;public&#x2F;enclaveos&#x2F;src&#x2F;branch&#x2F;master&#x2F;C...</a><p>And another one for a simple busybox based ISO for airgap operations:<p><a href="https:&#x2F;&#x2F;git.distrust.co&#x2F;public&#x2F;airgap&#x2F;src&#x2F;branch&#x2F;stagex-rewrite&#x2F;Containerfile" rel="nofollow">https:&#x2F;&#x2F;git.distrust.co&#x2F;public&#x2F;airgap&#x2F;src&#x2F;branch&#x2F;stagex-rewr...</a><p>Making Linux images for server&#x2F;appliance use cases is always more secure and almost always easier than trying to adapt workstation distros like debian. 
Lots of ways to do it depending on your tooling preferences.</text></comment> |
22,787,220 | 22,786,594 | 1 | 2 | 22,786,438 | train | <story><title>Layoffs.fyi Coronavirus Tracker</title><url>https://layoffs.fyi/tracker/</url></story><parent_chain></parent_chain><comment><author>rjtobin</author><text>Not a lawyer, but to any fellow H1B&#x27;s who are concerned about being laid off, and facing the near-impossible task of leaving the US on short notice during a worldwide lockdown: a 2017 law provides a 60-day grace period for &quot;those whose employment ceases prior to the end of the petition validity period&quot;. Ie. if there is time left on your I-94, you should have 60-day grace period where you will remain in-status while you look for another job.<p>Here is the full document: <a href="https:&#x2F;&#x2F;www.govinfo.gov&#x2F;content&#x2F;pkg&#x2F;FR-2016-11-18&#x2F;pdf&#x2F;2016-27540.pdf" rel="nofollow">https:&#x2F;&#x2F;www.govinfo.gov&#x2F;content&#x2F;pkg&#x2F;FR-2016-11-18&#x2F;pdf&#x2F;2016-2...</a> (search for &quot;60-day nonimmigrant&quot;).<p>Note that during these 60-days you can then file for a change of status, and then you will still remain in-status while that request is pending (the change you file must be &quot;non-frivolous&quot; though!).<p>If you follow this route and try to change status, USCIS has the right to revoke the grace period retroactively (in cases of abuse, for example). Documentation that you were actively seeking new employment should help avoid this.<p>Anyway, concerned people should certainly speak to an immigration attorney (again, I am <i>not</i> a lawyer), but it is comforting to know that most likely we have 60 days to get our affairs in order...</text></comment> | <story><title>Layoffs.fyi Coronavirus Tracker</title><url>https://layoffs.fyi/tracker/</url></story><parent_chain></parent_chain><comment><author>danso</author><text>Another great example of how taking the time to track things and fill out a spreadsheet is itself a good useful service. 
My main suggestion is to emphasize in the headline&#x2F;title that this is focused on tech startup layoffs. I know it&#x27;s alluded to elsewhere on the site text, it&#x27;s just that many people who see &quot;layoffs.fyi&quot; will think it&#x27;s a general layoffs tracker.<p>And of course it&#x27;s fine just to focus on tech startups – tracking all company layoffs (nevermind local businesses) would be a huge undertaking.</text></comment> |
19,986,807 | 19,986,805 | 1 | 2 | 19,986,106 | train | <story><title>Playdate – A New Handheld Gaming System</title><url>https://play.date/</url></story><parent_chain><item><author>bullfightonmars</author><text>It would be so awesome if this worked with Pico 8 or a similar fantasy console. I will be much more interested if the barrier to entry for making my own games is as close to zero as possible.<p>I want to make my own games for fun, but also to expose my son to the creativity and exploration of programming.<p>A physical console would make this experience so much more real.<p><a href="https:&#x2F;&#x2F;www.lexaloffle.com&#x2F;pico-8.php" rel="nofollow">https:&#x2F;&#x2F;www.lexaloffle.com&#x2F;pico-8.php</a></text></item></parent_chain><comment><author>darzu</author><text>Have you seen <a href="https:&#x2F;&#x2F;arcade.makecode.com" rel="nofollow">https:&#x2F;&#x2F;arcade.makecode.com</a> ?
It&#x27;s a free, open source, web-based editor for making games, and you can download games to a number of hardware boards, or play on your phone or any web browser. The cheapest hardware right now is $25 from Adafruit [0], but more hardware is coming out all the time.<p>Also this audience might be interested to know that soon there&#x27;ll be Python support in MakeCode. More details on our language toolchain here [1]<p>Disclosure: I work for MakeCode :)<p>[0] <a href="https:&#x2F;&#x2F;www.adafruit.com&#x2F;product&#x2F;3939" rel="nofollow">https:&#x2F;&#x2F;www.adafruit.com&#x2F;product&#x2F;3939</a>
[1] <a href="https:&#x2F;&#x2F;makecode.com&#x2F;language" rel="nofollow">https:&#x2F;&#x2F;makecode.com&#x2F;language</a></text></comment> | <story><title>Playdate – A New Handheld Gaming System</title><url>https://play.date/</url></story><parent_chain><item><author>bullfightonmars</author><text>It would be so awesome if this worked with Pico 8 or a similar fantasy console. I will be much more interested if the barrier to entry for making my own games is as close to zero as possible.<p>I want to make my own games for fun, but also to expose my son to the creativity and exploration of programming.<p>A physical console would make this experience so much more real.<p><a href="https:&#x2F;&#x2F;www.lexaloffle.com&#x2F;pico-8.php" rel="nofollow">https:&#x2F;&#x2F;www.lexaloffle.com&#x2F;pico-8.php</a></text></item></parent_chain><comment><author>RodgerTheGreat</author><text>Go on ebay and dig up a PocketCHIP; it&#x27;s as close to a first-class mass-produced physical Pico-8 as anyone is likely to see.</text></comment> |
40,243,280 | 40,243,343 | 1 | 3 | 40,240,737 | train | <story><title>Got an old Raspberry Pi spare? Try RISC OS. It is, something else</title><url>https://www.theregister.com/2024/05/02/rool_530_is_here/</url></story><parent_chain><item><author>ajb</author><text>It was ahead of its time in UX, but rather behind in the foundations. It&#x27;s a single user system with no real security, and there was no system of shared libraries - to share code between applications, it was usual to put the shared code in a kernel module and call the kernel. Even the standard C library worked this way.<p>Amusingly, when you invoked the system console -which was at a lower level than the gui system, effectively pausing it - the command line appeared at the bottom of the screen and the frozen gui scrolled up as you entered more commands; until you exit the system console. (It was also possible to get a command line in a window, which could do slightly less - I forget exactly what)</text></item></parent_chain><comment><author>qwerty456127</author><text>&gt; rather behind in the foundations. It&#x27;s a single user system with no real security<p>I believe multi-user systems are actually an ancient, outdated rather than a &quot;modern&quot; concept. It made sense when computers were huge, expensive and many users shared one even at work, let alone at home. Nowadays computers are almost never shared. Even when people used to have just one home PC per family (during pre-Win7 days) they mostly preferred to disable the sign-in screen and share the whole environment.<p>Nowadays multi-user OS facilities definitely help to build security but they were not designed just for this. Modern security can be done better without an OS-level concept of a user.</text></comment> | <story><title>Got an old Raspberry Pi spare? Try RISC OS. 
It is, something else</title><url>https://www.theregister.com/2024/05/02/rool_530_is_here/</url></story><parent_chain><item><author>ajb</author><text>It was ahead of its time in UX, but rather behind in the foundations. It&#x27;s a single user system with no real security, and there was no system of shared libraries - to share code between applications, it was usual to put the shared code in a kernel module and call the kernel. Even the standard C library worked this way.<p>Amusingly, when you invoked the system console -which was at a lower level than the gui system, effectively pausing it - the command line appeared at the bottom of the screen and the frozen gui scrolled up as you entered more commands; until you exit the system console. (It was also possible to get a command line in a window, which could do slightly less - I forget exactly what)</text></item></parent_chain><comment><author>dajtxx</author><text>Home computers didn&#x27;t need multi-user capability or much in the way of security (other than anti-virus) back then. I&#x27;d argue they still don&#x27;t. I don&#x27;t think these two things were the problem.<p>I can take or leave shared libraries. They seem to cause a lot of trouble, but so do statics, so I&#x27;m on the fence there. But in the context of when this was released it&#x27;s a non-issue.<p>I&#x27;ll give you the CLI thing though. If the CLI couldn&#x27;t be full-featured in a window that was an oversight.</text></comment> |
24,071,649 | 24,071,507 | 1 | 3 | 24,070,531 | train | <story><title>Pulse oximeters give biased results for people with darker skin</title><url>https://bostonreview.net/science-nature-race/amy-moran-thomas-how-popular-medical-device-encodes-racial-bias</url></story><parent_chain><item><author>treis</author><text>This really isn&#x27;t a big deal. Doctors know that these pulse ox meters aren&#x27;t very accurate and use them accordingly. Nobody&#x27;s changing their plan because you have a pulse ox of 93 instead of 95. They&#x27;re there to quickly measure big changes that indicate a problem. In other words, they&#x27;re looking for patients going from 98 to 75, not 98 to 95.</text></item></parent_chain><comment><author>tashi</author><text>My insurance company didn&#x27;t authorize a full sleep study because my overnight pulse oximeter results were normal, so it took me an extra couple of years to find out I needed a CPAP machine. It might have helped to know that my skin color could cause inaccurate results.</text></comment> | <story><title>Pulse oximeters give biased results for people with darker skin</title><url>https://bostonreview.net/science-nature-race/amy-moran-thomas-how-popular-medical-device-encodes-racial-bias</url></story><parent_chain><item><author>treis</author><text>This really isn&#x27;t a big deal. Doctors know that these pulse ox meters aren&#x27;t very accurate and use them accordingly. Nobody&#x27;s changing their plan because you have a pulse ox of 93 instead of 95. They&#x27;re there to quickly measure big changes that indicate a problem. In other words, they&#x27;re looking for patients going from 98 to 75, not 98 to 95.</text></item></parent_chain><comment><author>in_cahoots</author><text>That hasn’t been true at all in my experience. When my son got pneumonia, we were told he would be hospitalized at 92 when he was reading 94. He’s dark-skinned so who knows what the actual reading should have been.</text></comment> |
5,240,283 | 5,240,161 | 1 | 3 | 5,239,588 | train | <story><title>Wikipedia processing. PyPy vs CPython benchmark</title><url>http://rz.scale-it.pl/2013/02/18/wikipedia_processing._PyPy_vs_CPython_benchmark.html</url><text>What PyPy can do for Wikipedia processing tasks? Speedup!</text></story><parent_chain></parent_chain><comment><author>JulianWasTaken</author><text>Can I tangentially point out without much connection to this benchmark more than any of the others recently, that one of the great things about PyPy is that if you have a thing and you run it on PyPy, you can usually pop in the IRC channel and often get even <i>more</i> tips on how to tune it to be even faster?<p>There are the simple tips like "write everything in Python where possible, don't use C extensions" like the OP noticed, but even after you've made the decision on using PyPy there are often specific performance characteristics of the PyPy implementation that can be really helpful to keep in mind, and it's a great resource to try and take advantage of (human interaction with PyPy developers like fijal who care about making things fast).</text></comment> | <story><title>Wikipedia processing. PyPy vs CPython benchmark</title><url>http://rz.scale-it.pl/2013/02/18/wikipedia_processing._PyPy_vs_CPython_benchmark.html</url><text>What PyPy can do for Wikipedia processing tasks? Speedup!</text></story><parent_chain></parent_chain><comment><author>tworats</author><text>Great to see real world use cases, and very encouraging to see PyPy performing so well. I'll definitely be trying it on my future compute-intensive projects.</text></comment> |
38,897,693 | 38,897,797 | 1 | 3 | 38,897,475 | train | <story><title>US unemployment has been under 4% for the longest streak since the Vietnam War</title><url>https://www.npr.org/2024/01/05/1222714145/jobs-report-december-labor-wages</url></story><parent_chain><item><author>bhpm</author><text>The largest generation in American history is retiring.</text></item></parent_chain><comment><author>toomuchtodo</author><text>10k per day, 3.65M per year.</text></comment> | <story><title>US unemployment has been under 4% for the longest streak since the Vietnam War</title><url>https://www.npr.org/2024/01/05/1222714145/jobs-report-december-labor-wages</url></story><parent_chain><item><author>bhpm</author><text>The largest generation in American history is retiring.</text></item></parent_chain><comment><author>throwaway5959</author><text>Exactly. People act surprised when labor force participation has declined since 2008, at least until they’re told when the first baby boomer hit retirement age.</text></comment> |
6,726,785 | 6,726,221 | 1 | 2 | 6,725,291 | train | <story><title>Motorola Makes The Moto G Official, A “Premium” Phone Starting At $179 Unlocked</title><url>http://techcrunch.com/2013/11/13/motorola-makes-the-moto-g-official-a-premium-phone-at-a-price-more-can-afford/</url></story><parent_chain></parent_chain><comment><author>cs702</author><text>This phone looks like a game changer to me, because it has the specs of a <i>high-end</i> smartphone but is priced like a crappy <i>low-end</i> one.<p>The cost of an unlocked unit is $300 to $600 LESS than that of other devices with comparable specs, so mobile carriers should be able to offer the Moto G to the masses for <i>hundreds of dollars less</i> than any iPhone or high-end Android device by Samsung, LG, etc.<p>Mobile carriers could offer the Moto G profitably at a <i>negative price</i> -- for example, zero money upfront plus an instant $300 coupon rebate if one commits to a two-year plan. Or they could offer it with much cheaper monthly bills than economically possible with other comparable phones -- for example, 25% off one&#x27;s monthly bill if one commits to a two-year plan.<p>--<p>Edits: added context and examples.</text></comment> | <story><title>Motorola Makes The Moto G Official, A “Premium” Phone Starting At $179 Unlocked</title><url>http://techcrunch.com/2013/11/13/motorola-makes-the-moto-g-official-a-premium-phone-at-a-price-more-can-afford/</url></story><parent_chain></parent_chain><comment><author>Derbasti</author><text>This is terrific news! Smartphones are things that drop, that break, that fall into water and that get lost. Shelling out $500 for something this... ephemeral... always seemed absurd to me.<p>Or maybe I am just clumsy.</text></comment>
12,927,597 | 12,927,363 | 1 | 3 | 12,926,678 | train | <story><title>Peter Thiel To Join Trump Transition Team</title><url>https://www.linkedin.com/pulse/peter-thiel-join-trump-transition-team-dan-primack</url></story><parent_chain><item><author>jbpetersen</author><text>Any specific highlights?</text></item><item><author>kafkaesq</author><text><i>Thiel has some extremish views, but I generally regard him as intelligent and thoughtful.</i><p>&quot;He seems intelligent and thoughtful, but has scarily extremist views&quot; might be a better angle to take.</text></item><item><author>darawk</author><text>This is good news, I think? Thiel has some extremish views, but I generally regard him as intelligent and thoughtful. And he&#x27;s in close contact with other smart, influential people (e.g. Musk, Sam Altman, etc.), which gives them some indirect influence.<p>Not that this makes things substantially better. But maybe a little bit? Hopefully?</text></item></parent_chain><comment><author>kafkaesq</author><text>In addition to what others have said:<p>His head-over-heels support for a decidedly anti-libertarian candidate like Trump is in itself quite extremist. 
This has always been one of the standard tools for (crypto-)fascists and closet authoritarians of all ilks to get into positions of influence -- by using people of nominally &quot;opposing&quot; viewpoints as, in effect, wedge instruments to disrupt the system as violently as possible -- and then step into the emergent vacuum.<p>And somewhat secondarily, there are the antics of his pals at the Seasteading Institute in blatantly courting partnership with the government in Honduras (basically the most murderous government left in the hemisphere) in order to set up a libertarian microstate in the heart of certain piece of &quot;jungle&quot; that just so happens to also be claimed by one of the region&#x27;s poorest and most ill-treated ethnic groups.<p>Thiel&#x27;s own role in this venture (if any) is unclear -- but his association with the SSI (and the principal actors in this &quot;venture&quot;) is well-established.</text></comment> | <story><title>Peter Thiel To Join Trump Transition Team</title><url>https://www.linkedin.com/pulse/peter-thiel-join-trump-transition-team-dan-primack</url></story><parent_chain><item><author>jbpetersen</author><text>Any specific highlights?</text></item><item><author>kafkaesq</author><text><i>Thiel has some extremish views, but I generally regard him as intelligent and thoughtful.</i><p>&quot;He seems intelligent and thoughtful, but has scarily extremist views&quot; might be a better angle to take.</text></item><item><author>darawk</author><text>This is good news, I think? Thiel has some extremish views, but I generally regard him as intelligent and thoughtful. And he&#x27;s in close contact with other smart, influential people (e.g. Musk, Sam Altman, etc.), which gives them some indirect influence.<p>Not that this makes things substantially better. But maybe a little bit? Hopefully?</text></item></parent_chain><comment><author>randycupertino</author><text>He doesn&#x27;t believe in global warming, for one.</text></comment>
6,339,824 | 6,339,328 | 1 | 3 | 6,338,899 | train | <story><title>All LinkedIn with Nowhere to Go</title><url>https://www.thebaffler.com/past/all_linkedin_with_nowhere_to_go</url></story><parent_chain></parent_chain><comment><author>shubb</author><text>Linkedin, to me, is the most honest social network.<p>Social networks, Facebook or Google Plus, exist to create a base of knowledge about people, and then sell use of it, usually to advertisers.<p>Users don&#x27;t use Facebook in order to be advertised to. They use it to talk to friends, store photos, or possibly to create a low effort website for a business. Yet their value to Facebook is as a target for advertising, and maybe one day their information will be used to provide metrics for insurers and employers.<p>The highest cost keywords on Google are for payday loans and insurance. In both cases, the money in the industry is in making users buy a product that costs more than it should - if you make money selling payday loans, you are making users with a lower risk profile than the interest reflects buy your loans, rather than go to the bank.<p>Linkedin users know they are creating a public face that employers, employees, and business partners can use to evaluate them. Linkedin sells access to this data, for the purpose that the users put it up there.<p>No one posts drunk photos on Linkedin, and if they are complaining about &#x27;spam&#x27; from recruiters, they are clearly not looking for a job at the moment (which is what linkedin is for).<p>Google and Facebook are private surveillance systems that bribe users using unrelated functionality, to document themselves for purposes that are against their interests (making them buy things they otherwise wouldn&#x27;t, etc). Linkedin provides a publishing platform for users to document themselves and broadcast that documentation. 
Which one is less evil?</text></comment> | <story><title>All LinkedIn with Nowhere to Go</title><url>https://www.thebaffler.com/past/all_linkedin_with_nowhere_to_go</url></story><parent_chain></parent_chain><comment><author>praptak</author><text>To me LinkedIn is useful as a self-updating list of contacts. I actually got a job this way: spotted a decent low-buzzword job ad, banged the company name into LI, a friend popped up who I didn&#x27;t know worked there.<p>I got the job, he got the referral bonus. Double win. Maybe we&#x27;d still get it without LinkedIn but one sure thing is that LI made it very fast and convenient.</text></comment> |
11,505,933 | 11,505,905 | 1 | 3 | 11,503,934 | train | <story><title>The Doom Movement Bible</title><url>https://www.doomworld.com/vb/post/1586811</url></story><parent_chain><item><author>zanny</author><text>Yet for some reason <i>extremely</i> few games have followed id&#x27;s example. Besides Duke 3d, what other games ever saw source release?</text></item><item><author>haddr</author><text>This is just amazing. And what is more suprising is that Doom 1 and 2 are still thriving after &gt;20 years. If you download zandronum you will learn that there are still people playing it online. of course those are some mods, but still very much resembling the original atmosphere and graphics of doom 1 and 2.<p>Edit: of course this probably would be different if the doom source code wasn&#x27;t released at some point of time...</text></item></parent_chain><comment><author>llasram</author><text>Wikipedia has a list: <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;List_of_commercial_video_games_with_available_source_code" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;List_of_commercial_video_games...</a></text></comment> | <story><title>The Doom Movement Bible</title><url>https://www.doomworld.com/vb/post/1586811</url></story><parent_chain><item><author>zanny</author><text>Yet for some reason <i>extremely</i> few games have followed id&#x27;s example. Besides Duke 3d, what other games ever saw source release?</text></item><item><author>haddr</author><text>This is just amazing. And what is more suprising is that Doom 1 and 2 are still thriving after &gt;20 years. If you download zandronum you will learn that there are still people playing it online. 
of course those are some mods, but still very much resembling the original atmosphere and graphics of doom 1 and 2.<p>Edit: of course this probably would be different if the doom source code wasn&#x27;t released at some point of time...</text></item></parent_chain><comment><author>mazatta</author><text>The only one off the top of my head is Serious Sam: <a href="https:&#x2F;&#x2F;github.com&#x2F;Croteam-official&#x2F;Serious-Engine" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;Croteam-official&#x2F;Serious-Engine</a></text></comment> |
25,493,098 | 25,492,830 | 1 | 3 | 25,489,879 | train | <story><title>More challenging projects every programmer should try</title><url>https://web.eecs.utk.edu/~azh/blog/morechallengingprojects.html</url></story><parent_chain><item><author>pjc50</author><text>I worked on a unique HFT system which was capable of starting the response packet before the incoming packet&#x27;s final byte had arrived. Negative latency. If it mispredicted the future, it would simply scramble the trailer and cause the packet to fail UDP checksum.<p>It didn&#x27;t make money in the real world because the quality of decision that could be made at that speed wasn&#x27;t good enough.</text></item><item><author>henning</author><text>As someone who has played with writing trading bots but never traded them with real money, some advice: if your results seem too good to be true, they probably are. Your trading bot may be doing unrealistic things or its results may not be reliable if the following are true:<p>- You are trading in a market with low liquidity or one that is controlled by a small number of market participants. I&#x27;m not an expert but I think this would apply more to markets like penny stocks and less to big markets like forex for major currency pairs<p>- You are not taking transaction costs into account or not doing so properly<p>- Your bot makes a low number of trades, making the results close or equivalent to lucky coin flips<p>- Your bot is simply making trades that cannot be executed, or may be doing simulated trades of something that is not actually tradable. This applies to a large number of research papers that assume you can just buy and trade the S&amp;P 500 itself. You can trade ETFs that are tied to an index but an index is not a tradable instrument in of itself. 
Once you realize this, a lot of papers seem very weird<p>- You are not modelling other aspects of the trading process realistically, such as assuming the bot has infinite funds to trade, allowing it to take unlimited losses and continue trading when in reality you&#x27;d be hit with a margin call and your trading would be stopped<p>- Your code is committing any number of data snooping errors where the bot is asked to trade at time A (say the open of a trading session) but has access to future data (say the closing price of that day, future data that would not actually exist in a live environment)<p>- Depending on what you believe about how market conditions change over time, your bot may have worked in the past but would not work if used today. I.e., the market may have adapted to whatever edge your bot may have discovered<p>There are probably lots more pitfalls I don&#x27;t even know about since I&#x27;m not an actual trader.<p>I&#x27;m not discouraging anyone from playing around or trying things, of course. I think it&#x27;s great fun, which is why I do it.<p>Here&#x27;s the good news: if you realize you don&#x27;t actually have an edge and avoid risking your hard-earned money, you come out ahead of almost all people who ever trade.</text></item></parent_chain><comment><author>djoldman</author><text>This was an open secret.
You can also send the start of a datagram on every incoming packet and then abort if you don&#x27;t want to add&#x2F;modify&#x2F;delete an order by splitting your datagram into multiple packets. This worked on some exchanges because they ordered add&#x2F;modify&#x2F;deletes by when the beginning of the packet was received as opposed to the end.<p>If you do too much of this you will get angry exchange network people yelling at you as they may consider it spam&#x2F;ddos. Some exchanges explicitly limit this.<p>It&#x27;s also worth considering the possible gain. If you&#x27;re in the fpga (0-150ns) or cpu(300ns-3us) space, the math can come out differently.</text></comment> | <story><title>More challenging projects every programmer should try</title><url>https://web.eecs.utk.edu/~azh/blog/morechallengingprojects.html</url></story><parent_chain><item><author>pjc50</author><text>I worked on a unique HFT system which was capable of starting the response packet before the incoming packet&#x27;s final byte had arrived. Negative latency. If it mispredicted the future, it would simply scramble the trailer and cause the packet to fail UDP checksum.<p>It didn&#x27;t make money in the real world because the quality of decision that could be made at that speed wasn&#x27;t good enough.</text></item><item><author>henning</author><text>As someone who has played with writing trading bots but never traded them with real money, some advice: if your results seem too good to be true, they probably are. Your trading bot may be doing unrealistic things or its results may not be reliable if the following are true:<p>- You are trading in a market with low liquidity or one that is controlled by a small number of market participants. 
I&#x27;m not an expert but I think this would apply more to markets like penny stocks and less to big markets like forex for major currency pairs<p>- You are not taking transaction costs into account or not doing so properly<p>- Your bot makes a low number of trades, making the results close or equivalent to lucky coin flips<p>- Your bot is simply making trades that cannot be executed, or may be doing simulated trades of something that is not actually tradable. This applies to a large number of research papers that assume you can just buy and trade the S&amp;P 500 itself. You can trade ETFs that are tied to an index but an index is not a tradable instrument in of itself. Once you realize this, a lot of papers seem very weird<p>- You are not modelling other aspects of the trading process realistically, such as assuming the bot has infinite funds to trade, allowing it to take unlimited losses and continue trading when in reality you&#x27;d be hit with a margin call and your trading would be stopped<p>- Your code is committing any number of data snooping errors where the bot is asked to trade at time A (say the open of a trading session) but has access to future data (say the closing price of that day, future data that would not actually exist in a live environment)<p>- Depending on what you believe about how market conditions change over time, your bot may have worked in the past but would not work if used today. I.e., the market may have adapted to whatever edge your bot may have discovered<p>There are probably lots more pitfalls I don&#x27;t even know about since I&#x27;m not an actual trader.<p>I&#x27;m not discouraging anyone from playing around or trying things, of course. 
I think it&#x27;s great fun, which is why I do it.<p>Here&#x27;s the good news: if you realize you don&#x27;t actually have an edge and avoid risking your hard-earned money, you come out ahead of almost all people who ever trade.</text></item></parent_chain><comment><author>inopinatus</author><text>From time to time there comes a remark that is the absolute epitome of Hacker News, and I mean that earnestly and without any backhanded rancour, and I thank you for this flawless specimen.</text></comment> |
10,426,504 | 10,425,108 | 1 | 2 | 10,424,850 | train | <story><title>BMW i8 in WebGL</title><url>http://car.playcanvas.com</url></story><parent_chain><item><author>jasonkester</author><text>To whoever it was a few weeks back who said it looks like the i8 was &quot;giving birth to a Porsche&quot;, you were right: Once you&#x27;ve seen it, you can&#x27;t unsee it.<p>As this visualization demonstrates in nice 3d form.</text></item></parent_chain><comment><author>uniclaude</author><text>To anyone looking why: <a href="http:&#x2F;&#x2F;img-9gag-fun.9cache.com&#x2F;photo&#x2F;aBKqR1D_700b.jpg" rel="nofollow">http:&#x2F;&#x2F;img-9gag-fun.9cache.com&#x2F;photo&#x2F;aBKqR1D_700b.jpg</a></text></comment> | <story><title>BMW i8 in WebGL</title><url>http://car.playcanvas.com</url></story><parent_chain><item><author>jasonkester</author><text>To whoever it was a few weeks back who said it looks like the i8 was &quot;giving birth to a Porsche&quot;, you were right: Once you&#x27;ve seen it, you can&#x27;t unsee it.<p>As this visualization demonstrates in nice 3d form.</text></item></parent_chain><comment><author>Jack000</author><text>I think the design has a few too many un-necessary flourishes, and it probably won&#x27;t age very well.<p>It definitely grabs your attention, but in a few years when electric cars are less rare I bet it will look kind of gimmicky.</text></comment> |
37,896,063 | 37,895,964 | 1 | 2 | 37,894,390 | train | <story><title>Why decline in generalists leads to disjointed games and harms tool quality</title><url>https://gameworldobserver.com/2023/10/09/generalists-vs-specialists-gamedev-tool-quality-tim-cain</url></story><parent_chain><item><author>nitwit005</author><text>The generalist&#x27;s resume looks worse. They have 2 years experience with (insert role here), compared to the other guy who has 8, because they&#x27;ve worked in multiple roles.<p>Some industries have realized they want some people who have multiple skills, such as the &quot;full stack engineer&quot; roles, but it seems rather more common to prefer someone who precisely fits into the specialist role they defined.</text></item></parent_chain><comment><author>vk6flab</author><text>I&#x27;ve been in the workforce for almost 40 years as an ICT professional. In my experience there&#x27;s lots of other things going on in addition to the point you make.<p>People flat-out don&#x27;t believe that the skills enumerated on my resume could be held by a single person, and are.<p>Apparently, I&#x27;m clearly &quot;over the hill&quot;, having been at this for nearly 40 years. I should state that I have 20+ years experience instead, as-if having life experience doesn&#x27;t actually make you better qualified to understand the contexts of any role in an organisation.<p>My skill-set is &quot;threatening&quot; to others, because, apparently, knowing things about a great many technologies means that I know more than the person hiring me, which somehow means that I&#x27;m a threat. As-if surrounding yourself by people who are smarter than you is suddenly undesirable.<p>I&#x27;ve been told to &quot;dumb down&quot; my resume because there&#x27;s too much there, even if there&#x27;s 99% chance that a computer will be the device actually reading it and selecting me. 
Apparently I&#x27;m supposed to &quot;infiltrate&quot; a company at a lower level and then &quot;show&quot; them that I can do more, despite 40 years experience showing that people only see the skills that they want to see.<p>According to some automated job technologies I have 104 years of experience. Not sure how that works.<p>Then there are those who want to know which of the experience is &quot;professional&quot; and which isn&#x27;t. (I&#x27;ve been self-employed for the past 23+ years)<p>In the end, I liken it to this:<p>&quot;How much experience do you have in eating food?&quot;<p>&quot;Do you mean, adding together each of the 15 minute meals, or how long have I been eating food for since I was born?&quot;<p>&quot;Okay, how much professional experience do you have eating food?&quot;<p>&quot;...&quot;</text></comment> | <story><title>Why decline in generalists leads to disjointed games and harms tool quality</title><url>https://gameworldobserver.com/2023/10/09/generalists-vs-specialists-gamedev-tool-quality-tim-cain</url></story><parent_chain><item><author>nitwit005</author><text>The generalist&#x27;s resume looks worse. They have 2 years experience with (insert role here), compared to the other guy who has 8, because they&#x27;ve worked in multiple roles.<p>Some industries have realized they want some people who have multiple skills, such as the &quot;full stack engineer&quot; roles, but it seems rather more common to prefer someone who precisely fits into the specialist role they defined.</text></item></parent_chain><comment><author>Justsignedup</author><text>Hell, I experience this a lot. My experience is doing a lot of things, and being a force multiplier. However you go to a larger company, and suddenly they&#x27;re looking for very specific experience doing specific scaling that I never got to do in all my roles, boom, it is terrible for my resume. 
And many young people see this and just decide that let&#x27;s just focus on that, and make a great resume, rather than try to be useful.<p>I could have come into a job and say &quot;okay, I know back end, let me focus on that&quot; but instead I spent a day installing a shitty data pipeline that more or less took us through the next 4 years of data analytical needs for almost no cost. Etc.<p>It is just how it is. I even read a post on HN about a decade ago about how a guy was hire #1, was instrumental in the growth of a company, and when it was purchased was pushed out because they didn&#x27;t need a generalist anymore, they needed a specialist. And so he decided to drop the generalist role and focus entirely on one technical aspect to the benefit of his career.<p>So while I agree with Tim, I also have to say, as someone in the thick of it, there&#x27;s no good way out. Unless you want to keep working for Seed &#x2F; Series A startups, fingers crossed they sell or you won&#x27;t make that okay payoff.</text></comment> |
7,372,297 | 7,372,061 | 1 | 3 | 7,371,725 | train | <story><title>Sony and Panasonic announce the Archival Disc format</title><url>http://www.sony.net/SonyInfo/News/Press/201403/14-0310E/index.html</url></story><parent_chain></parent_chain><comment><author>dsr_</author><text>Important information that will drastically affect actual usage not provided:
- expected and guaranteed lifetime of discs
- minimum undamaged read&#x2F;write speeds
- recoverable-error read speeds
- bit error rate for writing and reading<p>Let&#x27;s take a typical small business system&#x27;s requirements. We have a 2U database with 12 3.5&quot; hot-swap disks, 2 200GB SSDs for caching and 10 3TB disks in a RAID10. We have up to 15TB that we want to archive.<p>Optimism: we get 100MB&#x2F;s write speed and a write BER of 10^-15. We swap 30 500GB disks, each taking about an hour and a half to write, and about one in a thousand disks has an unrecoverable error.<p>More likely: we get 75MB&#x2F;s sustained write speed and a write BER of 10^-14. We swap 30 500GB disks, each taking close to two hours, and we can expect one in three complete sets to have an unrecoverable error.<p>Press releases: not quite useful.</text></comment> | <story><title>Sony and Panasonic announce the Archival Disc format</title><url>http://www.sony.net/SonyInfo/News/Press/201403/14-0310E/index.html</url></story><parent_chain></parent_chain><comment><author>Pitarou</author><text>This makes perfect sense.<p>Sony and Panasonic need to do <i>something</i> with their lead in optical disk technology, but there&#x27;s no demand for a Blu-ray successor. Heck, there isn&#x27;t even much demand for Blu-ray. So archiving is the way to go.<p>I just hope they make those things to last.</text></comment> |
3,985,807 | 3,985,503 | 1 | 3 | 3,985,262 | train | <story><title>Pinterest Raises $100 MM at $1.5Bn Valuation</title><url>http://allthingsd.com/20120516/exclusive-japans-rakuten-wins-the-heart-of-pinterest-founder-in-funding-race/?mod=tweet</url></story><parent_chain><item><author>cletus</author><text>Well I'm going to head off the predictable and boring "bubble" comments and say this:<p>Pinterest has real value and a clear path to monetization through affiliate and/or advertising revenue. It's really a stupidly simple idea (essentially scrapbooking on the Web) executed incredibly well.<p>Is it worth $1.5B? I don't know. I would say it's definitely worth more than Instagram for whatever that's worth (not a lot). I guess we'd need stats on number of active users, engagement and revenue to make that determination--something we're not likely to get.<p>The only concerning point to me is that that it's foreign money, only because foreign money seems to be less discerning, at least based on DST and similar investments.<p>Anyway, congrats to the team. They've done exceptionally well.</text></item></parent_chain><comment><author>stroboskop</author><text><i>Well I'm going to head off the predictable and boring "bubble" comments and say this</i><p>Saying that predictably gets you upvotes for sure. But it's not about "bubble" comments being "boring" or not. The question is whether bubble claims have any substance. If they have, the bubble won't simply go away just because some people wish it would.<p><i>Is it worth $1.5B? I don't know. I would say it's definitely worth more than</i>...<p>Comparisons like this are inflationary. It seems you really don't care about bubbles. 
But a bubble would affect many people here at HN.<p>It doesn't matter whether bubble comments are boring or not, it's about the importance of a bubble.</text></comment> | <story><title>Pinterest Raises $100 MM at $1.5Bn Valuation</title><url>http://allthingsd.com/20120516/exclusive-japans-rakuten-wins-the-heart-of-pinterest-founder-in-funding-race/?mod=tweet</url></story><parent_chain><item><author>cletus</author><text>Well I'm going to head off the predictable and boring "bubble" comments and say this:<p>Pinterest has real value and a clear path to monetization through affiliate and/or advertising revenue. It's really a stupidly simple idea (essentially scrapbooking on the Web) executed incredibly well.<p>Is it worth $1.5B? I don't know. I would say it's definitely worth more than Instagram for whatever that's worth (not a lot). I guess we'd need stats on number of active users, engagement and revenue to make that determination--something we're not likely to get.<p>The only concerning point to me is that that it's foreign money, only because foreign money seems to be less discerning, at least based on DST and similar investments.<p>Anyway, congrats to the team. They've done exceptionally well.</text></item></parent_chain><comment><author>Alex3917</author><text>"Essentially scrapbooking on the Web"<p>A better analogy is probably delicious for women, at least in terms of having a mental framework to estimate the value.</text></comment> |
32,814,704 | 32,814,193 | 1 | 3 | 32,812,328 | train | <story><title>Bikes, not self driving cars, are the technological gateway to urban progress</title><url>https://nextcity.org/urbanist-news/bikes-not-self-driving-cars-are-the-technological-gateway-to-progress</url></story><parent_chain><item><author>Eji1700</author><text>I do think this is a more complex problem than people give credit.<p>I&#x27;ve seen bike lanes added to roads, and almost no one is happy. There&#x27;s always idiot drivers and bikers and every now and then someone gets hurt or killed.<p>Bike paths make infinitely more sense (especially since they dont have to follow roads and can take more direct paths), but cities seem loathe to adopt them vs just painting some lines on a current street.<p>Finally climate plays a huge part as well. I live where it gets over 100f in the summer routinely. Even a 3 mile bike ride at that point means you&#x27;re drenched in sweat, which is just not acceptable in a majority of environments. Showers can be added but water is already a resource we&#x27;re flippant with when we really shouldn&#x27;t be.<p>Finally its not a great solution for the elderly and has some risks. Yes they can ride an e bike, but when you screw up at 20 mph in a sedan you wind up with a very expensive bill and an insurance premium hikes. When you crash on a bike, even with safety gear, you can wind up pretty seriously injured.<p>None of this isnt to say we should not build more bike paths&#x2F;trains&#x2F;subways&#x2F;busses, but I dont think its a one solution fits all sort of thing.</text></item><item><author>LucasBrandt</author><text>Riding a bike doesn&#x27;t need to be the only or primary way that people get around in every city, all the time, for it to drastically improve quality of life for people living in cities the world over. Most people are capable of riding a bike - especially an e-bike - to get to where they&#x27;re going. 
It&#x27;s cheaper than a car, it can be faster than driving a car, and it is obviously better for both the global climate and the local environment.<p>More than half of all daily trips in the US are less than 3 miles[1]. If cities give people the option to ride a bike for those shorter trips without feeling unsafe, a lot of people will ride bikes to complete those trips. That&#x27;s shown over and over again. Change is hard, but a future with less cars and more trips by bike and public transportation is better, and possible.<p>[1] <a href="https:&#x2F;&#x2F;www.energy.gov&#x2F;eere&#x2F;vehicles&#x2F;articles&#x2F;fotw-1230-march-21-2022-more-half-all-daily-trips-were-less-three-miles-2021" rel="nofollow">https:&#x2F;&#x2F;www.energy.gov&#x2F;eere&#x2F;vehicles&#x2F;articles&#x2F;fotw-1230-marc...</a></text></item></parent_chain><comment><author>atchoo</author><text>&gt; Even a 3 mile bike ride at that point means you&#x27;re drenched in sweat<p>I can&#x27;t say that is my experience. Cycling at a relaxed speed is less effort than walking and you get a nice a breeze. I&#x27;ve cycled in 40C heat and it&#x27;s lovely - more comfortable than being stationary.<p>It&#x27;s hills (no matter the temperature) that make me break a sweat because of the exertion and a hardwired instinct to sprint up them.<p>&gt; Finally its not a great solution for the elderly<p>Trikes are a nice solution for those feeling a bit wobbly on two wheels.</text></comment> | <story><title>Bikes, not self driving cars, are the technological gateway to urban progress</title><url>https://nextcity.org/urbanist-news/bikes-not-self-driving-cars-are-the-technological-gateway-to-progress</url></story><parent_chain><item><author>Eji1700</author><text>I do think this is a more complex problem than people give credit.<p>I&#x27;ve seen bike lanes added to roads, and almost no one is happy. 
There&#x27;s always idiot drivers and bikers and every now and then someone gets hurt or killed.<p>Bike paths make infinitely more sense (especially since they don&#x27;t have to follow roads and can take more direct paths), but cities seem loath to adopt them vs just painting some lines on a current street.<p>Finally climate plays a huge part as well. I live where it gets over 100f in the summer routinely. Even a 3 mile bike ride at that point means you&#x27;re drenched in sweat, which is just not acceptable in a majority of environments. Showers can be added but water is already a resource we&#x27;re flippant with when we really shouldn&#x27;t be.<p>Finally it&#x27;s not a great solution for the elderly and has some risks. Yes they can ride an e-bike, but when you screw up at 20 mph in a sedan you wind up with a very expensive bill and an insurance premium hike. When you crash on a bike, even with safety gear, you can wind up pretty seriously injured.<p>None of this is to say we should not build more bike paths&#x2F;trains&#x2F;subways&#x2F;buses, but I don&#x27;t think it&#x27;s a one solution fits all sort of thing.</text></item><item><author>LucasBrandt</author><text>Riding a bike doesn&#x27;t need to be the only or primary way that people get around in every city, all the time, for it to drastically improve quality of life for people living in cities the world over. Most people are capable of riding a bike - especially an e-bike - to get to where they&#x27;re going. It&#x27;s cheaper than a car, it can be faster than driving a car, and it is obviously better for both the global climate and the local environment.<p>More than half of all daily trips in the US are less than 3 miles[1]. If cities give people the option to ride a bike for those shorter trips without feeling unsafe, a lot of people will ride bikes to complete those trips. That&#x27;s shown over and over again.
Change is hard, but a future with less cars and more trips by bike and public transportation is better, and possible.<p>[1] <a href="https:&#x2F;&#x2F;www.energy.gov&#x2F;eere&#x2F;vehicles&#x2F;articles&#x2F;fotw-1230-march-21-2022-more-half-all-daily-trips-were-less-three-miles-2021" rel="nofollow">https:&#x2F;&#x2F;www.energy.gov&#x2F;eere&#x2F;vehicles&#x2F;articles&#x2F;fotw-1230-marc...</a></text></item></parent_chain><comment><author>jonas21</author><text>I think that&#x27;s their point. Even if bicycles aren&#x27;t a solution for all, everyone can still benefit from more trips happening by bicycle. The elderly or those with a commute that would leave them unacceptably sweaty can still drive, and they will enjoy less traffic, cleaner air, and an easier time finding parking thanks to the cyclists.</text></comment> |
15,600,665 | 15,600,790 | 1 | 2 | 15,600,031 | train | <story><title>WhatTheFont – Shazam for Fonts</title><url>http://www.myfonts.com/WhatTheFont/</url></story><parent_chain><item><author>pbhjpbhj</author><text>AFAIK it&#x27;s the best of its sort, but I&#x27;ve used it a few times (not for a few years though) and I don&#x27;t think I&#x27;ve ever found the actual font. Though of course it finds similar fonts which sometimes can be enough.</text></item></parent_chain><comment><author>lbotos</author><text>My secret used to be <a href="http:&#x2F;&#x2F;www.typophile.com&#x2F;" rel="nofollow">http:&#x2F;&#x2F;www.typophile.com&#x2F;</a> which had a font id forum and you&#x27;d get an answer from someone usually within a few hours. It was a really cool place.<p>They were down for a while so I don&#x27;t know if that community re-formed or not.</text></comment> | <story><title>WhatTheFont – Shazam for Fonts</title><url>http://www.myfonts.com/WhatTheFont/</url></story><parent_chain><item><author>pbhjpbhj</author><text>AFAIK it&#x27;s the best of its sort, but I&#x27;ve used it a few times (not for a few years though) and I don&#x27;t think I&#x27;ve ever found the actual font. Though of course it finds similar fonts which sometimes can be enough.</text></item></parent_chain><comment><author>weinzierl</author><text>&gt; I don&#x27;t think I&#x27;ve ever found the actual font.<p>I did, sometimes. When it works, it works great, but often it doesn&#x27;t. There are also <i>Matcherator</i>[1] and <i>What Font is</i>[2], but they are not better or worse in my admittedly limited experience.<p>[1] <a href="https:&#x2F;&#x2F;www.fontspring.com&#x2F;matcherator" rel="nofollow">https:&#x2F;&#x2F;www.fontspring.com&#x2F;matcherator</a><p>[2] <a href="https:&#x2F;&#x2F;www.whatfontis.com&#x2F;" rel="nofollow">https:&#x2F;&#x2F;www.whatfontis.com&#x2F;</a></text></comment> |
29,131,719 | 29,131,693 | 1 | 2 | 29,131,112 | train | <story><title>American trains need more than railfan nostalgia</title><url>https://www.slowboring.com/p/amtrak-review</url></story><parent_chain><item><author>kumarsw</author><text>Somewhat tangentially, I watched a recent Not Just Bikes YouTube video that discusses the merits of commuter rail [1]. The video is obviously trying to discourage infrastructure that supports car-based suburbs, but another point came out that struck me: commuter rail is very expensive infrastructure with very low utilization, say 4 incoming trains in the morning and 4 outgoing trains in the evening. Any public transportation option needs to be compared against the default choice of &quot;run more buses,&quot; which it turns out is pretty hard to beat in terms of cost-effectiveness.<p>[1] <a href="https:&#x2F;&#x2F;youtu.be&#x2F;vxWjtpzCIfA" rel="nofollow">https:&#x2F;&#x2F;youtu.be&#x2F;vxWjtpzCIfA</a></text></item></parent_chain><comment><author>TulliusCicero</author><text>It depends on the commuter rail. The S-Bahn in Germany is sort of &#x27;commuter rail&#x27; but also gets regular use for other types of trips. And actual commuter-specific train lines tend to go on just general tracks that are used for other lines as well, so it&#x27;s not a huge deal.<p>It&#x27;s just that US-style development -- more or less mandated by the government -- makes public transit in general infeasible. For example, in Munich where I lived, there are suburbs nearby surrounded completely by farmland, where the whole town is still a handful of minutes by bike to the train station. That means that even in these sleepy little towns, it&#x27;s dense enough to where a car is unnecessary. 
In the US, that kind of development is a rarity, because the local zoning makes it literally illegal, and the roads have been designed to be hostile to anything that&#x27;s not a car.</text></comment> | <story><title>American trains need more than railfan nostalgia</title><url>https://www.slowboring.com/p/amtrak-review</url></story><parent_chain><item><author>kumarsw</author><text>Somewhat tangentially, I watched a recent Not Just Bikes YouTube video that discusses the merits of commuter rail [1]. The video is obviously trying to discourage infrastructure that supports car-based suburbs, but another point came out that struck me: commuter rail is very expensive infrastructure with very low utilization, say 4 incoming trains in the morning and 4 outgoing trains in the evening. Any public transportation option needs to be compared against the default choice of &quot;run more buses,&quot; which it turns out is pretty hard to beat in terms of cost-effectiveness.<p>[1] <a href="https:&#x2F;&#x2F;youtu.be&#x2F;vxWjtpzCIfA" rel="nofollow">https:&#x2F;&#x2F;youtu.be&#x2F;vxWjtpzCIfA</a></text></item></parent_chain><comment><author>jeffbee</author><text>That sounds like the worst rail service imaginable. Very American.<p>The Swiss town of Chur, which has the same population as Menlo Park, California, and which is a good distance from Zurich, as Menlo Park is from San Francisco, has five S-Bahn trains every hour, all day long, plus 9 other regional, interregional, and intercity trains per hour.<p>There&#x27;s nothing fundamental about trains that says you have to have insulting levels of service. It&#x27;s a choice.</text></comment> |
26,623,274 | 26,622,695 | 1 | 2 | 26,621,351 | train | <story><title>The makers of Eleuther hope it will be an open source alternative to GPT-3</title><url>https://www.wired.com/story/ai-generate-convincing-text-anyone-use-it/</url></story><parent_chain></parent_chain><comment><author>minimaxir</author><text>As someone who works on a Python library solely devoted to making AI text generation more accessible to the normal person (<a href="https:&#x2F;&#x2F;github.com&#x2F;minimaxir&#x2F;aitextgen" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;minimaxir&#x2F;aitextgen</a> ) I think the headline is misleading.<p>Although the article focuses on the release of GPT-Neo, even GPT-2 released in 2019 was good at generating text, it just spat out a lot of garbage requiring curation, which GPT-3&#x2F;GPT-Neo still requires albeit with a better signal-to-noise ratio. Most GPT-3 demos on social media are survivorship bias. (in fact OpenAI&#x27;s rules for the GPT-3 API strongly encourage curating such output)<p>GPT-Neo, meanwhile, is such a big model that it requires a bit of data engineering work to get operating and generating text (see the README: <a href="https:&#x2F;&#x2F;github.com&#x2F;EleutherAI&#x2F;gpt-neo" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;EleutherAI&#x2F;gpt-neo</a> ), and it&#x27;s unclear currently if it&#x27;s as good as GPT-3, even when comparing models apples-to-apples (i.e. 
the 2.7B GPT-Neo with the &quot;ada&quot; GPT-3 via OpenAI&#x27;s API).<p>That said, Hugging Face is adding support for GPT-Neo to Transformers (<a href="https:&#x2F;&#x2F;github.com&#x2F;huggingface&#x2F;transformers&#x2F;pull&#x2F;10848" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;huggingface&#x2F;transformers&#x2F;pull&#x2F;10848</a> ) which will help make playing with the model easier, and I&#x27;ll add support to aitextgen if it pans out.</text></comment> | <story><title>The makers of Eleuther hope it will be an open source alternative to GPT-3</title><url>https://www.wired.com/story/ai-generate-convincing-text-anyone-use-it/</url></story><parent_chain></parent_chain><comment><author>everdrive</author><text>People already believe garbage at a pretty alarming rate. It&#x27;s easy to guess at a number of possible outcomes here:<p>- More junk text moves the public to doubt legitimate information even further than they currently do.<p>- There is so much human-generated junk text that adding more of it via AI actually doesn&#x27;t have much of an effect.<p>- People return to lean on experts, perhaps even more than before. (just as a number of tech-literate folks have now returned to relying on brand name.)<p>Speculation is easy of course, so who knows what will actually happen.</text></comment> |
29,464,824 | 29,463,794 | 1 | 2 | 29,460,074 | train | <story><title>Popular Family Safety App Life360 Selling Precise Location Data on Its Users</title><url>https://themarkup.org/privacy/2021/12/06/the-popular-family-safety-app-life360-is-selling-precise-location-data-on-its-tens-of-millions-of-user</url></story><parent_chain><item><author>Syonyk</author><text>Of <i>course</i> they are.<p><i>Everyone</i> is double dipping anymore. Pay for the app, but you&#x27;re really the product being sold. Any app that can get location access is probably doing something you don&#x27;t like with it. Your &quot;Smart TV&quot; is hoovering up as much as it can to send upstream (including using content recognition algorithms on the DVD and game console inputs), your &quot;connected car&quot; is almost certainly doing the same thing, your OSes are probably doing it... the whole of the modern tech industry is rotten, and nowhere is this more clear than the cell phone data mining industry.<p>There&#x27;s big money in this sort of backend data analytics... deception? deceit? Whatever you want to call it.<p>&gt; <i>He did confirm that X-Mode buys data from Life360 and that it is one of “approximately one dozen data partners.” Hulls added that the company would be supportive of legislation that would require public disclosure of such partners.</i><p>You don&#x27;t &quot;accidentally&quot; end up with a dozen data partners unless your goal is data collection and sale.<p>I&#x27;ll offer a guiding principle that&#x27;s been serving me well: If you can imagine how the data is being abused, it certainly is. If you can&#x27;t come up with a possible way for some bit of data to be abused... 
someone else is more creative than you and has found a way to monetize it.</text></item></parent_chain><comment><author>x0x0</author><text>If I were cynical, I&#x27;d suspect Life 360 (and similar companies) were created first with the goal of selling location data, and from there worked backwards to figure out what sort of app&#x2F;company would require a continuous stream of location data. And worked from there. Which is becoming more valuable as ios and android give users more granular control over apps getting location data...</text></comment> | <story><title>Popular Family Safety App Life360 Selling Precise Location Data on Its Users</title><url>https://themarkup.org/privacy/2021/12/06/the-popular-family-safety-app-life360-is-selling-precise-location-data-on-its-tens-of-millions-of-user</url></story><parent_chain><item><author>Syonyk</author><text>Of <i>course</i> they are.<p><i>Everyone</i> is double dipping anymore. Pay for the app, but you&#x27;re really the product being sold. Any app that can get location access is probably doing something you don&#x27;t like with it. Your &quot;Smart TV&quot; is hoovering up as much as it can to send upstream (including using content recognition algorithms on the DVD and game console inputs), your &quot;connected car&quot; is almost certainly doing the same thing, your OSes are probably doing it... the whole of the modern tech industry is rotten, and nowhere is this more clear than the cell phone data mining industry.<p>There&#x27;s big money in this sort of backend data analytics... deception? deceit? 
Whatever you want to call it.<p>&gt; <i>He did confirm that X-Mode buys data from Life360 and that it is one of “approximately one dozen data partners.” Hulls added that the company would be supportive of legislation that would require public disclosure of such partners.</i><p>You don&#x27;t &quot;accidentally&quot; end up with a dozen data partners unless your goal is data collection and sale.<p>I&#x27;ll offer a guiding principle that&#x27;s been serving me well: If you can imagine how the data is being abused, it certainly is. If you can&#x27;t come up with a possible way for some bit of data to be abused... someone else is more creative than you and has found a way to monetize it.</text></item></parent_chain><comment><author>ljm</author><text>Would not surprise me if there were people in these businesses who thought, &quot;why would we leave FREE money on the table?&quot; A promotion or two later after palming off their customer&#x27;s data to whoever had the cash handy, they&#x27;re off to do the same at another business.<p>I think they&#x27;re called MBAs but I don&#x27;t think it&#x27;s unique to that style of professional.</text></comment> |
18,646,847 | 18,646,905 | 1 | 3 | 18,646,485 | train | <story><title>Open Location Code</title><url>https://github.com/google/open-location-code</url></story><parent_chain><item><author>nabla9</author><text>Reinventing the wheel without any clear advantage.<p>Plus codes don&#x27;t seem to have any advantage over widely used MGRS (Military Grid Reference System). <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Military_Grid_Reference_System" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Military_Grid_Reference_System</a></text></item></parent_chain><comment><author>chronial</author><text>I find them a lot more human-friendly, but I guess that&#x27;s subjective. The example from the wiki page you linked:<p>MGRS: 4QFJ12345678<p>Plus Code: 835M+Q7 Honolulu</text></comment> | <story><title>Open Location Code</title><url>https://github.com/google/open-location-code</url></story><parent_chain><item><author>nabla9</author><text>Reinventing the wheel without any clear advantage.<p>Plus codes don&#x27;t seem to have any advantage over widely used MGRS (Military Grid Reference System). <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Military_Grid_Reference_System" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Military_Grid_Reference_System</a></text></item></parent_chain><comment><author>phicoh</author><text>In my opinion, a big disavantage of MGRS is that it is based on UTM. These days nobody uses UTM and UTM is hard to grasp if all you are used to is lat&#x2F;lon coordinates.<p>So the clear advantage of Plus Codes over MGRS is that it is based on lat&#x2F;lon</text></comment> |
15,258,940 | 15,258,587 | 1 | 3 | 15,254,483 | train | <story><title>“There Is No Reason for Any Individual to Have a Computer in Their Home”</title><url>https://quoteinvestigator.com/2017/09/14/home-computer/</url></story><parent_chain><item><author>simonh</author><text>&gt;Gaming is maybe the one mainstream application where local compute is important.<p>There are a ton of things our smartphones and tablets do that rely on very high levels of local computing power.<p>Biometric authentication such as face and fingerprint recognition is a heavily compute intensive application. It needs to be done locally in order to be secure. Face ID is just nuts.<p>Modern smartphone photography is heavily compute-intensive, from HDR to optical zoom to image editing and markup. Just look at what Apple is doing with portrait mode and lighting effects. I love using time-lapse and slow-motion video modes. What would instagram and snapchat be without locally applied filters?<p>Note taking, address book and time management. The notes apps on modern mobile devices are multi-media wonders with integrated text, image and handwriting recognition built-in. Calendar management, alarms and notifications all rely on local processing even if they do make use of cloud services. Without local smarts those services would lose a huge chunk of their utility.<p>Document and media management. I read eBooks and listen to podcasts. My ebook reader has dynamic font selection, text size adjustment and will even read my books to me. Managing media on the device is essential as contemporary networks are still nowhere near good enough to stream everything all the time. My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air &#x27;vocal pause&#x27; elimination in real-time, all on-device and adjustable at my fingertips. 
That&#x27;s serious sound studio level stuff in my pocket.<p>Playing digital video files. Even ones downloaded or streamed. So what if we&#x27;ve had it since the 90s. It&#x27;s still proper computation, especially with advanced modern codecs. VOIP and video calling even between continents have also become everyday and absolutely rely on powerful local number crunching.<p>These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.</text></item><item><author>gregmac</author><text>&gt; To understand the mindset of this period it is important to recognize the distinction between a computer terminal and a free-standing computer. Some experts believed that individuals would have terminals at home that communicated with powerful remote computers providing utility-like services for information and interaction. These experts believed that an isolated computer at home would be too under-powered to be worthwhile.<p>Considering how the majority of the general population&#x27;s use of &quot;computers&quot; is really interacting with remote and cloud services (Facebook, Netflix, etc) this is a valid argument today. While today&#x27;s equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals. 
Gaming is maybe the one mainstream application where local compute is important.<p>The other quote that stuck out to me was this:<p>&gt; “The personal computer will fall flat on its face in business because users want to share files and want more than one user on the system,”</text></item></parent_chain><comment><author>com2kid</author><text>&gt; My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air &#x27;vocal pause&#x27; elimination in real-time, all on-device and adjustable at my fingertips.<p>Everything short of the vocal pause stuff an old Sansa will do for you, running a tiny embedded C based runtime.<p>&gt; These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.<p>The modern photo and video stuff is super cool and uses a ton of processing power. 
Likewise for high quality video streaming, decompressing is hard work, though often offloaded to HW these days.<p>Everything else my desktop of 15 years ago handled just fine, with 256MB of RAM.<p>Heck my smartphone in 2006 with 64MB of RAM did most of this stuff.<p>The one thing that has remained constant across a decade of smart phone development is the Pandora app getting forced closed all the time.</text></comment> | <story><title>“There Is No Reason for Any Individual to Have a Computer in Their Home”</title><url>https://quoteinvestigator.com/2017/09/14/home-computer/</url></story><parent_chain><item><author>simonh</author><text>&gt;Gaming is maybe the one mainstream application where local compute is important.<p>There are a ton of things our smartphones and tablets do that rely on very high levels of local computing power.<p>Biometric authentication such as face and fingerprint recognition is a heavily compute intensive application. It needs to be done locally in order to be secure. Face ID is just nuts.<p>Modern smartphone photography is heavily compute-intensive, from HDR to optical zoom to image editing and markup. Just look at what Apple is doing with portrait mode and lighting effects. I love using time-lapse and slow-motion video modes. What would instagram and snapchat be without locally applied filters?<p>Note taking, address book and time management. The notes apps on modern mobile devices are multi-media wonders with integrated text, image and handwriting recognition built-in. Calendar management, alarms and notifications all rely on local processing even if they do make use of cloud services. Without local smarts those services would lose a huge chunk of their utility.<p>Document and media management. I read eBooks and listen to podcasts. My ebook reader has dynamic font selection, text size adjustment and will even read my books to me. 
Managing media on the device is essential as contemporary networks are still nowhere near good enough to stream everything all the time. My podcast app has sound modulation and processing options built in to tailor the sound to my tastes and needs, including speed up, voice level adjustment and dead-air &#x27;vocal pause&#x27; elimination in real-time, all on-device and adjustable at my fingertips. That&#x27;s serious sound studio level stuff in my pocket.<p>Playing digital video files. Even ones downloaded or streamed. So what if we&#x27;ve had it since the 90s. It&#x27;s still proper computation, especially with advanced modern codecs. VOIP and video calling even between continents have also become everyday and absolutely rely on powerful local number crunching.<p>These things have become so everyday that we hardly notice most of them, but without serious on-device processing power, some of which would have been beyond $10k workstations just 15 years ago, none of this would be possible.</text></item><item><author>gregmac</author><text>&gt; To understand the mindset of this period it is important to recognize the distinction between a computer terminal and a free-standing computer. Some experts believed that individuals would have terminals at home that communicated with powerful remote computers providing utility-like services for information and interaction. These experts believed that an isolated computer at home would be too under-powered to be worthwhile.<p>Considering how the majority of the general population&#x27;s use of &quot;computers&quot; is really interacting with remote and cloud services (Facebook, Netflix, etc) this is a valid argument today. While today&#x27;s equivalent of terminals (phones, tablets, and even PCs) have a significant amount of local computing ability, for the most part they really are just used as fancy terminals. 
Gaming is maybe the one mainstream application where local compute is important.<p>The other quote that stuck out to me was this:<p>&gt; “The personal computer will fall flat on its face in business because users want to share files and want more than one user on the system,”</text></item></parent_chain><comment><author>Theodores</author><text>Years ago when I had a VAX to work with, the terminal was happy with user name and password. Your supercomputer phone might have classy iris scanning stuff but it is merely unlocking the terminal. Okay there is no remote authentication to it but it is the same conceptually.<p>Everything else apart from things like the clock are client server.</text></comment> |
30,875,987 | 30,875,671 | 1 | 3 | 30,875,259 | train | <story><title>Infinite Mac: An Instant-Booting Quadra in the Browser</title><url>https://blog.persistent.info/2022/03/blog-post.html</url></story><parent_chain><item><author>ilrwbwrkhv</author><text>I swear. What happened here? Like why do things feel more sluggish than software over 20 years ago? I just can&#x27;t wrap my mind around it. We have SSDs and infinite CPU and RAM now compared to those days, yet there is always this feeling the computer is doing something more than I asked for behind the scenes. Even on websites, you can feel when you click on a link, it is doing something more. It&#x27;s a very uncomfortable feeling.</text></item><item><author>andai</author><text>When I want to get work done quickly, I boot up Windows XP in a VM. It&#x27;s just so damn snappy it&#x27;s unbelievable.<p>What Moore giveth, Gates taketh away...</text></item><item><author>ilrwbwrkhv</author><text>It feels so fast. Even in the browser. Modern OSes just feel so sluggish by comparison.</text></item></parent_chain><comment><author>edwcross</author><text>We transitioned from a &quot;high-trust society&quot; (where software, Internet, etc, did not need to take into account so many security considerations, protect itself from enemies everywhere) into a &quot;low-trust society&quot; (encryption everywhere, hardware mitigations for Spectre-kind attacks, Javascript-based tracking, etc).<p>The overhead is similar to the one people face in third-world countries: you pay once for the government (taxes), then again for basic services (private education, private security, private health, private transportation), and you still end up in a worse situation than in a (high-trust) first-world country.<p>I tried using a Core 2 Duo laptop recently, but the simple fact that HTTPS is mandatory nowadays, and Core 2 doesn&#x27;t have AES-NI, makes the laptop spin its fan much more than it should, even if I try to use an old
Linux&#x2F;Windows.</text></comment> | <story><title>Infinite Mac: An Instant-Booting Quadra in the Browser</title><url>https://blog.persistent.info/2022/03/blog-post.html</url></story><parent_chain><item><author>ilrwbwrkhv</author><text>I swear. What happened here? Like why do things feel more sluggish than software over 20 years ago? I just can&#x27;t wrap my mind around it. We have SSDs and infinite CPU and RAM now compared to those days, yet there is always this feeling the computer is doing something more than I asked for behind the scenes. Even on websites, you can feel when you click on a link, it is doing something more. It&#x27;s a very uncomfortable feeling.</text></item><item><author>andai</author><text>When I want to get work done quickly, I boot up Windows XP in a VM. It&#x27;s just so damn snappy it&#x27;s unbelievable.<p>What Moore giveth, Gates taketh away...</text></item><item><author>ilrwbwrkhv</author><text>It feels so fast. Even in the browser. Modern OSes just feel so sluggish by comparison.</text></item></parent_chain><comment><author>naikrovek</author><text>&gt; What happened here?<p>three things, I think:<p>1) people saying (and believing) that CPU, RAM, and disk are fast enough and cheap enough to be effectively infinite, and no longer worthy of concern. put another way: developers feeling that their target platforms are no longer constrained.<p>2) many software developers simply stopped caring about performance and began implementing all kinds of things extremely naively. new developers do this continually for many years early in their careers.<p>3) JavaScript. the places this language is used today, where it should never, ever be used. excruciatingly slow language. and JavaScript is EVERYWHERE, even on extremely constrained platforms, doing work, when almost any other language would have been a better choice.</text></comment>
24,708,438 | 24,708,164 | 1 | 2 | 24,704,386 | train | <story><title>A socialite who hated washing dishes invented the automated dishwasher</title><url>https://spectrum.ieee.org/the-institute/ieee-history/this-socialite-hated-washing-dishes-so-much-that-she-invented-the-automated-dishwasher</url></story><parent_chain><item><author>prawn</author><text>It amazes me that after widespread adoption of dishwashers, we still build very customised bathrooms into houses that require manual labour to clean each surface - tiles&#x2F;grout, basins, benchtops, toilets, floors, etc.<p>A bathroom that could be sealed and cleaned as though it was a dishwasher would be great. For everything but the toilet, lightly pressured spray of hot water would probably be sufficient. Throw the towels in the wash, close the waterproofed door, press the cleaning button and head off to work. Then the door&#x2F;vents open to air it out after the cleaning process. There are public toilets that do it so surely we could make an attractive home version of that?<p>And if not the entire bathroom, at least have self-cleaning toilets rather than almost every single model being a slightly different shape so there&#x27;s no standardised solution better than a toilet brush.<p>The lifetime cost of paid cleaners or just cleaning the bathroom yourself has to be worth something that could be invested into a bathroom upgrade when building.</text></item></parent_chain><comment><author>vel0city</author><text>It seems a bit counter-intuitive, but dishwashers are not necessarily self-cleaning. Dishwasher owners really should spend some time every now and then to scrub and clean out their dishwashers, along with running some kind of cleaner through their dishwasher. Mold, scale, and food debris will eventually build up throughout the dishwasher, even if you try to let it air out completely. Proper cleaning and maintenance of your dishwasher will help it last a lot longer than assuming it is self cleaning. 
I should know, I recently came into ownership of a 12 year old dishwasher which &quot;didn&#x27;t work&quot;. Scrubbing and cleaning the machine led to the machine working for quite a while after.</text></comment> | <story><title>A socialite who hated washing dishes invented the automated dishwasher</title><url>https://spectrum.ieee.org/the-institute/ieee-history/this-socialite-hated-washing-dishes-so-much-that-she-invented-the-automated-dishwasher</url></story><parent_chain><item><author>prawn</author><text>It amazes me that after widespread adoption of dishwashers, we still build very customised bathrooms into houses that require manual labour to clean each surface - tiles&#x2F;grout, basins, benchtops, toilets, floors, etc.<p>A bathroom that could be sealed and cleaned as though it was a dishwasher would be great. For everything but the toilet, lightly pressured spray of hot water would probably be sufficient. Throw the towels in the wash, close the waterproofed door, press the cleaning button and head off to work. Then the door&#x2F;vents open to air it out after the cleaning process. 
There are public toilets that do it so surely we could make an attractive home version of that?<p>And if not the entire bathroom, at least have self-cleaning toilets rather than almost every single model being a slightly different shape so there&#x27;s no standardised solution better than a toilet brush.<p>The lifetime cost of paid cleaners or just cleaning the bathroom yourself has to be worth something that could be invested into a bathroom upgrade when building.</text></item></parent_chain><comment><author>rsynnott</author><text>As you say, this is already a thing for public toilets: <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Sanisette" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Sanisette</a><p>For a whole bathroom, it would be quite expensive; you&#x27;d need a fully watertight room (most bathrooms, even wetrooms, are not) and you&#x27;d risk contamination. Would also be difficult to retrofit unless you had a wetroom already. Also the failure modes are quite bad; what happens when the drains clog? You&#x27;d need the whole thing sunken to prevent water egress to the rest of the house.<p>Definitely not saying it&#x27;s impossible, or even impractical, but it wouldn&#x27;t be cheap.</text></comment> |
41,573,300 | 41,572,887 | 1 | 2 | 41,571,911 | train | <story><title>The Double Irish Dutch Sandwich: End of a Tax Evasion Strategy</title><url>https://conversableeconomist.com/2024/09/12/the-double-irish-dutch-sandwich-end-of-a-tax-evasion-strategy/</url></story><parent_chain></parent_chain><comment><author>gwd</author><text>&gt; The obvious loss is to government revenue, but the more subtle and still very real loss is the diversion of high-powered talent from what could have been gains in efficiency and productivity to focus instead on corporate reorganizations and tax evasion games.<p>It&#x27;s also a loss to all of us who find ourselves owning a &quot;foreign&quot; corporation for legitimate reasons, and having to do a lot of work proving that we&#x27;re not playing shell games.<p>I&#x27;m an American living in the UK; I&#x27;ve started what&#x27;s now a one-man bootstrapped company and it just makes sense for it to be incorporated in the UK. I don&#x27;t mind paying taxes, but I don&#x27;t want to have to pay them twice. UK taxes are generally higher than US taxes, and there are tax treaties to avoid double taxation, so no problem, right?<p>Wrong. First, if a US citizen owns more than 10% of a foreign corporation, there&#x27;s an insanely complicated form you have to fill out, two pages of which are incomprehensible questions (&quot;Is your company considered an X corporation under sections Y and Z of the Blablablah act?&quot;) that simply require an expert to fill out.<p>Secondly, if a US citizens owns more than <i>50%</i> of a foreign company, by default, income from that company counts as personal income, <i>but taxes paid by the company to another government don&#x27;t count as personal taxes</i>. 
You can get around this, apparently, but it&#x27;s even more complicated.<p>So because of [REDACTED] lawyers and accountants that have come up with these schemes, I have to spend precious starting capital to a bunch of accountants to prove I&#x27;m not evading taxes, rather than actually doing something useful with that money (and that accountant doing something more useful with their time).</text></comment> | <story><title>The Double Irish Dutch Sandwich: End of a Tax Evasion Strategy</title><url>https://conversableeconomist.com/2024/09/12/the-double-irish-dutch-sandwich-end-of-a-tax-evasion-strategy/</url></story><parent_chain></parent_chain><comment><author>sanj</author><text>“ The obvious loss is to government revenue, but the more subtle and still very real loss is the diversion of high-powered talent from what could have been gains in efficiency and productivity to focus instead on corporate reorganizations and tax evasion games.”<p>I’d expect very little overlap between those talent pools.</text></comment> |
28,908,035 | 28,907,948 | 1 | 3 | 28,907,154 | train | <story><title>Tests aren’t enough: Case study after adding type hints to urllib3</title><url>https://sethmlarson.dev/blog/2021-10-18/tests-arent-enough-case-study-after-adding-types-to-urllib3</url></story><parent_chain><item><author>m12k</author><text>I&#x27;ve only been in the industry for ~15 years, but it still feels like every year, some ecosystem discovers the value of something that another ecosystem has taken for granted for decades - type-checking, immutability, unidirectional data-flow, AOT-compilation, closures, pure functions, you name it. I&#x27;m glad we seem to be converging on a set of best practices as an industry, but sometimes I wish we were spending less time rediscovering the wheel and more time building on top of and adding to the actual state of the art.</text></item></parent_chain><comment><author>taeric</author><text>I&#x27;ve been alive long enough to see that most things are useful, and all things are oversold.<p>More, the nice easy things to build with major restrictions pretty much gets thrown out the window for complicated things that have constraints that most efforts don&#x27;t have. This isn&#x27;t just a software thing. Building a little shed outside? Would be silly to use the same rigor that goes into a high rise. Which would be crazy to use the same materials engineering that goes into a little shed.</text></comment> | <story><title>Tests aren’t enough: Case study after adding type hints to urllib3</title><url>https://sethmlarson.dev/blog/2021-10-18/tests-arent-enough-case-study-after-adding-types-to-urllib3</url></story><parent_chain><item><author>m12k</author><text>I&#x27;ve only been in the industry for ~15 years, but it still feels like every year, some ecosystem discovers the value of something that another ecosystem has taken for granted for decades - type-checking, immutability, unidirectional data-flow, AOT-compilation, closures, pure functions, you name it. 
I&#x27;m glad we seem to be converging on a set of best practices as an industry, but sometimes I wish we were spending less time rediscovering the wheel and more time building on top of and adding to the actual state of the art.</text></item></parent_chain><comment><author>axiosgunnar</author><text>I think the problem is to figure out what the best practices actually are.<p>What we are observing here is „the market fixing it“.<p>The process is messy and redundant, but effective.</text></comment> |
19,921,625 | 19,921,376 | 1 | 2 | 19,920,539 | train | <story><title>FCC Chairman Proposes Robocall Blocking by Default</title><url>https://www.fcc.gov/document/chairman-pai-proposes-robocall-blocking-default</url></story><parent_chain><item><author>hedvig</author><text>The more I look at things as I age, the more I see market based solutions as only benefiting the supplier and always hurting the consumer.</text></item><item><author>duxup</author><text>According to Ars they want to leave it up to the carriers if they want to charge for it, if that is accurate and the robocall problems escalate then you could end up paying for it by default ... or just get robocalls all day.<p><a href="https:&#x2F;&#x2F;arstechnica.com&#x2F;tech-policy&#x2F;2019&#x2F;05&#x2F;ajit-pais-robocall-plan-lets-carriers-charge-for-new-call-blocking-tools&#x2F;" rel="nofollow">https:&#x2F;&#x2F;arstechnica.com&#x2F;tech-policy&#x2F;2019&#x2F;05&#x2F;ajit-pais-roboca...</a><p>&quot;It will cost $X a month for service, and $Y more if you want to be able to use it...&quot;<p>And as a revenue stream I worry this could create a perverse disincentive for carriers, why deal with a problem that makes people give us more money?</text></item></parent_chain><comment><author>wpietri</author><text>Honestly, it&#x27;s not clear to me that most &quot;pro-market&quot; advocates even understand markets. They work best when purchasers have many options, clear information, easy switching, and equal power to sellers. 
But at least in the US, most of the &quot;pro-market&quot; voices in practice seem to be in favor of oligopoly and the absolute right of businesses to exploit information, wealth, and power asymmetries.<p>It makes me a bit bonkers, because well-designed markets can do an amazing job solving optimization problems.</text></comment> | <story><title>FCC Chairman Proposes Robocall Blocking by Default</title><url>https://www.fcc.gov/document/chairman-pai-proposes-robocall-blocking-default</url></story><parent_chain><item><author>hedvig</author><text>The more I look at things as I age, the more I see market based solutions as only benefiting the supplier and always hurting the consumer.</text></item><item><author>duxup</author><text>According to Ars they want to leave it up to the carriers if they want to charge for it, if that is accurate and the robocall problems escalate then you could end up paying for it by default ... or just get robocalls all day.<p><a href="https:&#x2F;&#x2F;arstechnica.com&#x2F;tech-policy&#x2F;2019&#x2F;05&#x2F;ajit-pais-robocall-plan-lets-carriers-charge-for-new-call-blocking-tools&#x2F;" rel="nofollow">https:&#x2F;&#x2F;arstechnica.com&#x2F;tech-policy&#x2F;2019&#x2F;05&#x2F;ajit-pais-roboca...</a><p>&quot;It will cost $X a month for service, and $Y more if you want to be able to use it...&quot;<p>And as a revenue stream I worry this could create a perverse disincentive for carriers, why deal with a problem that makes people give us more money?</text></item></parent_chain><comment><author>duxup</author><text>I really wish there w as a party interested in free markets and fostering competition.<p>Rather the GoP whose rhetoric talks about free markets, is really just all about picking winners themselves and legislating profits for their allies.</text></comment> |
5,348,392 | 5,347,795 | 1 | 2 | 5,347,374 | train | <story><title>Extreme debugging - a tale of microcode and an oven</title><url>http://alanwinfield.blogspot.com.br/2013/03/extreme-debugging-tale-of-microcode-and.html</url></story><parent_chain></parent_chain><comment><author>xradionut</author><text>I've worked at a couple of jobs (telecom, semiconductors) with environment chambers and equipment of that ilk. Heat guns were a common bench testing tool.<p>Fast forward ten years and I'm doing contract work for a local company. Of course the contractor got the shittiest workstation, which dies every afternoon like clockwork. So I move the POS white box away from the window and out of the sunlight. "Magically" it only crashes once week. So I send a ticket into the queue, stating the obvious, then the IT folks quibbled and finally admit the motherboard needs to be replaced.</text></comment> | <story><title>Extreme debugging - a tale of microcode and an oven</title><url>http://alanwinfield.blogspot.com.br/2013/03/extreme-debugging-tale-of-microcode-and.html</url></story><parent_chain></parent_chain><comment><author>konstruktor</author><text>Beautiful example of great engineering. I like how they, at least from how the story is told, they first formed a hypothesis based on their knowledge, found a systematic way to test/particularise it and then built a fix for production in a separate step.</text></comment> |
24,601,520 | 24,601,122 | 1 | 3 | 24,599,642 | train | <story><title>Stop Asking Me to “Sign Up” (2014)</title><url>https://www.gkogan.co/blog/stop-asking-me-to-sign-up/</url></story><parent_chain><item><author>GekkePrutser</author><text>This article is from 2014, right now my annoyance is more all these apps with low self esteem.<p>Always this stupid question &quot;do you like this app&quot; and then redirecting me to the play store to vote if I say yes or the support site when I say no. And to make matters worse they keep doing it every few days. Once is passable, over and over is NOT.<p>I&#x27;ve started to give 1-star reviews to these apps. So sick of this behaviour. Especially many of Microsoft&#x27;s corporate apps do this (like outlook or teams) and most people download these because they need them, not because of some rating.<p>edit: I mean I do the 1-star thing with the apps which keep asking. If they ask once I just don&#x27;t do the review (If I really <i>love</i> the app I would have left a 5-star review anyway). But once it comes up again I do this.</text></item></parent_chain><comment><author>jakub_g</author><text>It&#x27;s a <i>really</i> important thing for the apps because by default only unhappy people go to Play Store and give 1 star. Then it looks like your app is crap and the competitors are way better.<p>In prev job, we had few ratings of our app, and avg. of 3.5. After adding the question, we got thousands of 5 stars and went to avg. 
4.7 in a few months.<p>Of course the app should give you an &quot;I don&#x27;t care&quot; option, and store your answer so it doesn&#x27;t repeat the question.</text></comment>
6,961,955 | 6,961,951 | 1 | 3 | 6,961,824 | train | <story><title>Google sues to protect Android device makers from Apple-backed patent hell</title><url>http://gigaom.com/2013/12/24/google-sues-to-protect-android-device-makers-from-apple-backed-patent-hell/</url></story><parent_chain></parent_chain><comment><author>namespace</author><text>Interesting times. There is no doubt that the legal punches thrown at Android have been anti-competitive and kills innovation instead of what there were meant to protect. For example reportedly Microsoft earns more from Android than selling Windows phone: <a href="http://www.zdnet.com/microsoft-is-making-2bn-a-year-on-android-licensing-five-times-more-than-windows-phone-7000022936/" rel="nofollow">http:&#x2F;&#x2F;www.zdnet.com&#x2F;microsoft-is-making-2bn-a-year-on-andro...</a>. If the patents were that useful for innovation, Microsoft would have been ahead of Google by miles.</text></comment> | <story><title>Google sues to protect Android device makers from Apple-backed patent hell</title><url>http://gigaom.com/2013/12/24/google-sues-to-protect-android-device-makers-from-apple-backed-patent-hell/</url></story><parent_chain></parent_chain><comment><author>mikhailt</author><text>Except Nortel did use many of those patents. They decided to sell it to Rockstar, full of owners who are likely licensing the patents to themselves for use in their products. Yes, technically Rockster is an individual company that doesn&#x27;t make&#x2F;sell any products but it&#x27;s not a troll just because of that. It&#x27;s a troll if the owners didn&#x27;t use it in their products.<p>By this logic, any standard bodies would be trolls as well if they decide to sue any companies that infringe without paying the fees.</text></comment> |
17,880,281 | 17,880,002 | 1 | 2 | 17,876,723 | train | <story><title>F-35 Program Cutting Corners to “Complete” Development</title><url>https://www.pogo.org/investigation/2018/08/f-35-program-cutting-corners-to-complete-development</url></story><parent_chain><item><author>Tobba_</author><text>The idea of drone swarms doesn&#x27;t go well together with aerodynamics and basic physical intuition. If you shrink an aircraft down, the aerodynamic cross-section (i.e the drag force) scales with the area (scale^2), but your engine thrust is going to drop roughly by the decrease in volume (scale^3).<p>So you end up losing maximum airspeed <i>and</i> fuel efficiency (in terms of the mass you&#x27;re moving) the smaller you go. Unless the drones in your swarm were really big, it doesn&#x27;t work out.<p>Although, I imagine we&#x27;ll see some smaller, unmanned jet fighters in the future (assuming someone figures out how to control something like that remotely, or autonomously). A smaller aircraft has the advantage of a smaller radar cross-section and being more difficult to hit. Doing away with the pilot cuts out a lot of weight and frees up room for a larger engine and fuel tank, offseting the downsides of the smaller size somewhat. There should be a sweet spot where that works out.</text></item><item><author>bertjk</author><text>I&#x27;ve thought that drone swarms are an &quot;obviously&quot; good idea, but that assumed that the communications&#x2F;navigation were not easily spoofed. Then Iran captured one of the US&#x27;s most advanced drone, pretty much intact. Now I&#x27;m not so sure.</text></item><item><author>Someone1234</author><text>Absolutely. And the depressing part is that SOME parts still could have been shared between the different aircraft even in a more modest proposal (e.g. warfare systems). 
They just decided to go that extra mile, and wind up with an aircraft that likely won&#x27;t ever save money and is too complex for its own good.<p>I hope we&#x27;re done with manned aircraft after the F-35, and it is all drone swarms from here on out.</text></item><item><author>sevensor</author><text>The biggest design problem is a failure of conception:<p>&gt; The Pentagon sold Congress on the F-35 in part with the idea of creating a common aircraft for three services, alleging it would save money—despite the well-documented and glaring failure of the tri-service F-111 program 25 years before Congress signed off on the very same plan with the F-35 in 2001<p>The idea that the STOVL variant would have significant parts commonality with the others despite having a giant fan in the middle was pretty laughable right from the start. In general, three aircraft with three very different design mission profiles were not going to be able to share much without making bad compromises.</text></item></parent_chain><comment><author>sawjet</author><text>Shape is the major determining factor for radar return, not size. From Ben Rich&#x27;s Skunk Works-<p>&gt;I really wanted a photographer around for historical purposes to capture the expression on Kelly’s big, brooding moon-shaped mug when I showed him the electromagnetic chamber results. Hopeless Diamond was exactly as Denys had predicted: a thousand times stealthier than the twelve-year-old drone. The fact that the test results matched Denys’s computer calculations was the first proof that we actually knew what in hell we were doing. Still, Kelly reacted about as graciously as a cop realizing he had collared the wrong suspect. He grudgingly flipped me the quarter and said, “Don’t spend it until you see the damned thing fly.” But then he sent for Denys Overholser and grilled the poor guy past the point of well-done on the whys and hows of stealth technology. 
He told me later that he was surprised to learn that with flat surfaces the amount of radar energy returning to the sender is independent of the target’s size. A small airplane, a bomber, an aircraft carrier, all with the same shape, will have identical radar cross sections. “By God, I never would have believed that,” he confessed. I had the feeling that maybe he still didn’t.</text></comment> | <story><title>F-35 Program Cutting Corners to “Complete” Development</title><url>https://www.pogo.org/investigation/2018/08/f-35-program-cutting-corners-to-complete-development</url></story><parent_chain><item><author>Tobba_</author><text>The idea of drone swarms doesn&#x27;t go well together with aerodynamics and basic physical intuition. If you shrink an aircraft down, the aerodynamic cross-section (i.e the drag force) scales with the area (scale^2), but your engine thrust is going to drop roughly by the decrease in volume (scale^3).<p>So you end up losing maximum airspeed <i>and</i> fuel efficiency (in terms of the mass you&#x27;re moving) the smaller you go. Unless the drones in your swarm were really big, it doesn&#x27;t work out.<p>Although, I imagine we&#x27;ll see some smaller, unmanned jet fighters in the future (assuming someone figures out how to control something like that remotely, or autonomously). A smaller aircraft has the advantage of a smaller radar cross-section and being more difficult to hit. Doing away with the pilot cuts out a lot of weight and frees up room for a larger engine and fuel tank, offseting the downsides of the smaller size somewhat. There should be a sweet spot where that works out.</text></item><item><author>bertjk</author><text>I&#x27;ve thought that drone swarms are an &quot;obviously&quot; good idea, but that assumed that the communications&#x2F;navigation were not easily spoofed. Then Iran captured one of the US&#x27;s most advanced drone, pretty much intact. 
Now I&#x27;m not so sure.</text></item><item><author>Someone1234</author><text>Absolutely. And the depressing part is that SOME parts still could have been shared between the different aircraft even in a more modest proposal (e.g. warfare systems). They just decided to go that extra mile, and wind up with an aircraft that likely won&#x27;t ever save money and is too complex for its own good.<p>I hope we&#x27;re done with manned aircraft after the F-35, and it is all drone swarms from here on out.</text></item><item><author>sevensor</author><text>The biggest design problem is a failure of conception:<p>&gt; The Pentagon sold Congress on the F-35 in part with the idea of creating a common aircraft for three services, alleging it would save money—despite the well-documented and glaring failure of the tri-service F-111 program 25 years before Congress signed off on the very same plan with the F-35 in 2001<p>The idea that the STOVL variant would have significant parts commonality with the others despite having a giant fan in the middle was pretty laughable right from the start. In general, three aircraft with three very different design mission profiles were not going to be able to share much without making bad compromises.</text></item></parent_chain><comment><author>Someone1234</author><text>&gt; If you shrink an aircraft down, the aerodynamic cross-section (i.e the drag force) scales with the area (scale^2), but your engine thrust is going to drop roughly by the decrease in volume (scale^3).<p>Why would we shrink drones down? You read drone swarm and assumed small, but most swarm proposals are using drones of a similar size as today or even large in some cases. They&#x27;re swarms because of the way they interact with one another, and overwhelm enemy defensive systems, not because they&#x27;re small.</text></comment> |
17,964,390 | 17,964,365 | 1 | 2 | 17,961,820 | train | <story><title>Project Python – Free Interactive Book That Introduces Python Programming and CS</title><url>http://projectpython.net/chapter00/</url></story><parent_chain></parent_chain><comment><author>znpy</author><text>In my humble opinion there really is no need for ANOTHER introductory book on python.<p>But there is a great, great need for REALLY well done books on intermediate python and advanced python.</text></comment> | <story><title>Project Python – Free Interactive Book That Introduces Python Programming and CS</title><url>http://projectpython.net/chapter00/</url></story><parent_chain></parent_chain><comment><author>treyfitty</author><text>I&#x27;ve been trying to learn python for a while now, and I have a grasp of it in a similar way that I have a grasp of golf- I know what to do, but I&#x27;m still missing a few fundamentals.<p>All these intro to X things seem to gloss over the high level stuff, but I&#x27;ve hit a few walls in certain concepts that I just can&#x27;t possibly Google. Maybe HN could help me out:<p>take a pandas dataframe. There&#x27;s a .describe() method that I can chain to get the mean of the columns in the data frame grouped by var1:<p>df.groupby(df.var1).describe().mean<p>But then, there&#x27;s also:<p>df.describe(df.var1).mean()<p>They largely do the same thing, but one gives me the mean across all variables, and the other one doesn&#x27;t.<p>Now, how do I possibly Google this question? One method has parentheses, the other doesn&#x27;t.... but It&#x27;s clear I&#x27;m missing something fundamental here that I thought an intro online resource should address... but none do.</text></comment> |
37,751,458 | 37,750,293 | 1 | 2 | 37,749,753 | train | <story><title>Nobel Prize in Physics Awarded to Agostini, Krausz, and L’Huillier</title><url>https://www.nobelprize.org/prizes/physics/2023/summary/</url></story><parent_chain><item><author>daoboy</author><text>An attosecond is to a second what a second is to the age of the universe.</text></item></parent_chain><comment><author>boringg</author><text>I thought you were being glib but no:
&quot;An attosecond is so short that the number of them in one second is the same as the number of seconds that have elapsed since the universe came into existence, 13.8 billion years ago. On a more relatable scale, we can imagine a flash of light being sent from one end of a room to the opposite wall – this takes ten billion attoseconds.&quot;<p>That&#x27;s truly amazing that we can measure at that detail. Mind blowing actually.</text></comment>
18,308,882 | 18,308,878 | 1 | 2 | 18,307,708 | train | <story><title>Google drops plans for Berlin campus after protests</title><url>https://www.bbc.com/news/world-europe-45971538</url></story><parent_chain><item><author>bko</author><text>I think arbitrary restrictions on groups that can purchase or lease space in a neighborhood is bad for a free society. I guess people say it depends on what group you&#x27;re imposing. Now it&#x27;s Google which you may or may not like, but other groups can be harmed in the future. This is part of the whole NIMBY movement.<p>As long as the group is following laws and zoning restrictions, there should be nothing preventing them from joining a community</text></item><item><author>keiferski</author><text>I’m honestly kind of surprised at Google’s complete lack of cultural or diplomatic knowledge about Berlin. Kreuzberg is&#x2F;was the center of artistic culture in the city, and choosing to have a presence there, no matter how small, would clearly be viewed as a corporate American attack on local countercultural values. As another commenter stated, they could have avoided this entire issue by picking somewhere in Mitte or a western suburb.</text></item></parent_chain><comment><author>eropple</author><text>Nobody is restricting Google from doing anything. They are expressing--in a way that is part of, and frankly should be expected of being a member of, one of those &quot;free societies&quot; you mention--that Google is <i>unwelcome</i>, and no, nobody is obligated to make you feel <i>welcome</i>. 
There are cases where not making somebody welcome kinda makes you an asshole, but that does not, frankly, apply to anybody who&#x27;s got a stock ticker symbol.<p>This is &quot;you piss in the pool wherever you go, please stay away from our pool.&quot; This isn&#x27;t NIMBY in any meaningful way and the comparison is downright mendacious.</text></comment> | <story><title>Google drops plans for Berlin campus after protests</title><url>https://www.bbc.com/news/world-europe-45971538</url></story><parent_chain><item><author>bko</author><text>I think arbitrary restrictions on groups that can purchase or lease space in a neighborhood is bad for a free society. I guess people say it depends on what group you&#x27;re imposing. Now it&#x27;s Google which you may or may not like, but other groups can be harmed in the future. This is part of the whole NIMBY movement.<p>As long as the group is following laws and zoning restrictions, there should be nothing preventing them from joining a community</text></item><item><author>keiferski</author><text>I’m honestly kind of surprised at Google’s complete lack of cultural or diplomatic knowledge about Berlin. Kreuzberg is&#x2F;was the center of artistic culture in the city, and choosing to have a presence there, no matter how small, would clearly be viewed as a corporate American attack on local countercultural values. As another commenter stated, they could have avoided this entire issue by picking somewhere in Mitte or a western suburb.</text></item></parent_chain><comment><author>keiferski</author><text>I used to agree with you, but considering the fact that San Francisco has gone from “semi-affordable haven for weirdos and off-beat culture” to “single most expensive city in America filled primarily with tech workers and the homeless” in less than 30 years, I don’t any more. Semi-similar situations for lower Manhattan, Paris, East London, ad infinitum.<p>A place can have a unique culture that’s worth maintaining. 
How to maintain it - I don’t know. But converting everything to MegaCo offices probably isn’t the solution.</text></comment> |
17,429,500 | 17,429,147 | 1 | 2 | 17,427,560 | train | <story><title>Darpa invests $100M in a silicon compiler</title><url>https://www.eetimes.com/document.asp?doc_id=1333422</url></story><parent_chain><item><author>WalterBright</author><text>I designed the ABEL language back in the 80&#x27;s for compiling designs targeted at programmable logic arrays and gate arrays. It was very successful, but it died after a decade or so.<p>It&#x27;d probably be around today and up to date if it was open source. A shame it isn&#x27;t. I don&#x27;t even know who owns the rights to it these days, or if whoever owns it even knows they have the rights to it, due to spinoffs and mergers.</text></item><item><author>ur-whale</author><text>&quot;Most importantly, we have to change the culture of hardware design. Today, we don’t have open sharing … &quot;<p>This, to the 100th power.<p>The culture in the EDA industry is stuck in the 1950&#x27;s when it comes to collaboration and sharing, it&#x27;s very frustrating for newcomers and people who want to learn the trade.<p>As was pointed out by someone in another hardware related HN thread, what can you expect from an industry that is still stuck calling a component &quot;Intellectual Property&quot;?<p>The un-sharing is built into the very <i>names</i> used to describe things.</text></item></parent_chain><comment><author>seattleeng</author><text>I think you&#x27;d be surprised. I went to Caltech (graduated 2014), which is a fairly well known university for their Electrical Engineering program, and I learned ABEL in my Sophomore&#x2F;Junior year. My instructor, an admittedly old school hardware engineer, was in love with the language and had it as part of our upper level digital design curriculum for a few labs. FWIW, I think it was super intuitive and a hugely valuable learning tool. 
I suppose that doesn&#x27;t mean it isn&#x27;t &quot;dead&quot; for professional purposes, though.</text></comment> | <story><title>Darpa invests $100M in a silicon compiler</title><url>https://www.eetimes.com/document.asp?doc_id=1333422</url></story><parent_chain><item><author>WalterBright</author><text>I designed the ABEL language back in the 80&#x27;s for compiling designs targeted at programmable logic arrays and gate arrays. It was very successful, but it died after a decade or so.<p>It&#x27;d probably be around today and up to date if it was open source. A shame it isn&#x27;t. I don&#x27;t even know who owns the rights to it these days, or if whoever owns it even knows they have the rights to it, due to spinoffs and mergers.</text></item><item><author>ur-whale</author><text>&quot;Most importantly, we have to change the culture of hardware design. Today, we don’t have open sharing … &quot;<p>This, to the 100th power.<p>The culture in the EDA industry is stuck in the 1950&#x27;s when it comes to collaboration and sharing, it&#x27;s very frustrating for newcomers and people who want to learn the trade.<p>As was pointed out by someone in another hardware related HN thread, what can you expect from an industry that is still stuck calling a component &quot;Intellectual Property&quot;?<p>The un-sharing is built into the very <i>names</i> used to describe things.</text></item></parent_chain><comment><author>Tade0</author><text><i>It was very successful, but it died after a decade or so.</i><p>Perhaps not entirely. We had one lab session dedicated to it during my junior year in college. 
That was ten years ago but apparently they haven&#x27;t changed that[0] (course description in English at the bottom of the page).<p>[0] <a href="http:&#x2F;&#x2F;studia.elka.pw.edu.pl&#x2F;pl&#x2F;18L&#x2F;s&#x2F;eres&#x2F;eres&#x2F;wwersje$.startup?Z_ID_PRZEDMIOTU=PTCY" rel="nofollow">http:&#x2F;&#x2F;studia.elka.pw.edu.pl&#x2F;pl&#x2F;18L&#x2F;s&#x2F;eres&#x2F;eres&#x2F;wwersje$.sta...</a></text></comment>
21,737,328 | 21,737,225 | 1 | 2 | 21,737,021 | train | <story><title>Hospital superbugs are evolving to survive hand sanitizers (2018)</title><url>https:&#x2F;&#x2F;arstechnica.com&#x2F;science&#x2F;2018&#x2F;08&#x2F;hospital-superbugs-are-evolving-to-survive-hand-sanitizers&#x2F;</url></story><parent_chain><item><author>umvi</author><text>What&#x27;s to stop a mutation from making bacteria copper-immune?</text></item><item><author>godzillabrennus</author><text>We’ve known how to combat this for a while: <a href="http:&#x2F;&#x2F;theconversation.com&#x2F;copper-is-great-at-killing-superbugs-so-why-dont-hospitals-use-it-73103" rel="nofollow">http:&#x2F;&#x2F;theconversation.com&#x2F;copper-is-great-at-killing-superb...</a><p>Copper on the surface of hospital furniture should help a lot.</text></item></parent_chain><comment><author>IAmEveryone</author><text>The basic principle is that hand sanitizers are applied externally. They can be far more aggressive than, say, antibiotics because they do not enter the body (or only in minimal quantities). The skin is a specialized, multi-layered protective structure that single-celled organisms just cannot replicate within their constraints (ability to absorb food, energy investment, etc). It&#x27;s not <i>quite</i> good enough to allow the sort of methods completely immune against evolved resistance (fire, sufficient radiation, etc.). But it&#x27;s far enough ahead to make it manageable.<p>So I wouldn&#x27;t worry too much. The worst outcome is likely to be hand-washing becoming a somewhat larger nuisance, a higher risk of skin irritation or some combination thereof.
The whole procedure is probably ripe for some purely mechanical innovation such as high(ish) pressure water or air, considering that part hasn&#x27;t much changed in the last 200 years (although training and frequency have increased)</text></comment> | <story><title>Hospital superbugs are evolving to survive hand sanitizers (2018)</title><url>https://arstechnica.com/science/2018/08/hospital-superbugs-are-evolving-to-survive-hand-sanitizers/</url></story><parent_chain><item><author>umvi</author><text>What&#x27;s to stop a mutation from making bacteria copper-immune?</text></item><item><author>godzillabrennus</author><text>We’ve known how to combat this for a while: <a href="http:&#x2F;&#x2F;theconversation.com&#x2F;copper-is-great-at-killing-superbugs-so-why-dont-hospitals-use-it-73103" rel="nofollow">http:&#x2F;&#x2F;theconversation.com&#x2F;copper-is-great-at-killing-superb...</a><p>Copper on the surface of hospital furniture should help a lot.</text></item></parent_chain><comment><author>pizza234</author><text>Answer in the article:<p>&gt; [...] copper and its alloys exhibit these impressive properties and the processes involved. The process involves the release of copper ions (electrically charged particles) when microbes, transferred by touching, sneezing or vomiting, land on the copper surface. The ions prevent cell respiration, punch holes in the bacterial cell membrane or disrupt the viral coat, and destroy the DNA and RNA inside.<p>&gt; This latter property is important as it means that no mutation can occur – preventing the microbe from developing resistance to copper.</text></comment> |
31,327,062 | 31,326,997 | 1 | 2 | 31,324,917 | train | <story><title>No Dislikes has officially ruined YouTube for me</title><text>Spoiler: rant.<p>I don&#x27;t know what happened exactly but I&#x27;m pretty sure it&#x27;s the lack of dislike stats, that now my suggestions and home page of youtube is filled, and I mean FILLEDDD!, with videos that have 4k stock clips, catchy title, but completely lacking in content. Misleading 100%. Not 1, not 2, but like 8&#x2F;10 videos are now garbage stock footage with bs commentary over nothing.<p>Example:<p>Nasa just discovered truth about solar system!?!?!?!<p>Science has progressed a lot in last 100 years....<p>So and so first discovered pluto in 1xxx<p>Mayans used to think balbala...<p>Some historians think....<p>Now scientist finally have answered....<p>New evidence (2014 research) shows there might be a planet ...<p>No explanation of study because you know it actually requires some comprehension...<p>Insert failed attempt at humor...<p>Leave a comment on your thoughts..<p>===========<p>Same script, like 8th grade essay you didn&#x27;t study for, but multiplied by 100x.<p>We knew it was gonna ruin youtube, people told youtube it was gonna ruin it, and now exactly that happened. Click baity videos with nice stock footage that is barely relevant and half assed &#x27;answers&#x27;.</text></story><parent_chain><item><author>miniwark</author><text>I have done it a lot for around a year, but I think that the block list is very limited, because after a while the same &quot;blocked&quot; channels come back again in the list. I have now stopped using this fake feature. I am sure of it because I regularly block official news or music channels, but they appear again anyway after a while.<p>It&#x27;s the same for the &quot;not interested&quot; feature. I have stopped telling them, when asked, why I am not interested (mainly because I have already seen the video).
The same already &quot;not interested&quot; videos, already viewed and already liked videos show up again after a while anyway...</text></item><item><author>FartyMcFarter</author><text>There&#x27;s a &quot;don&#x27;t recommend channel&quot; option if you click the 3 dots button on a video.<p>I use this liberally for all sorts of reasons and it makes my recommendations much better.</text></item><item><author>philliphaydon</author><text>You&#x27;ve probably ruined my recommendations forever now. :(<p>I pretty much only stick to subscriber channels now except for stuff people send me. Since there&#x27;s no dislikes I just don&#x27;t browse the stuff available anymore.<p>YouTube is definitely worse off without the dislike counter.</text></item><item><author>alexb_</author><text>I&#x27;ve recently heard about this YouTube account named Roel Van De Paar. If you&#x27;ve looked up any error message on YouTube recently, you&#x27;ve probably run into him. Because of the &quot;no dislikes&quot; change, he&#x27;s everywhere now.<p>And by everywhere, I mean EVERYWHERE. The account has over 2 million videos. Approximately 0.2% of ALL videos on the ENTIRE platform of YouTube can be attributed to this account, and if you check now they&#x27;ve probably uploaded a few videos in the past couple minutes. The videos are generated crap, ripped from tech forums. Normally you wouldn&#x27;t see it anywhere due to dislikes being easy to spot, but now they pop up all the time in search results due to dislikes becoming a sort of &quot;hidden feature&quot;.<p>Hiding dislikes = less people press dislike (no feedback) = low quality videos are much harder to get rid of in search results.
It&#x27;s really, really bad.</text></item></parent_chain><comment><author>rasz</author><text>Sounds familiar, my post from 7 months ago:<p>&quot;I would love to see someone look deep into Twitch recommendation system - last time I tested the thing they call &quot;Feedback&quot; is a rolling buffer and won&#x27;t let you exclude more than ~100 things, adding more simply removed oldest entries and starts spamming you with things you already excluded in the past. This looked like performance optimization (less things to track per user).&quot;</text></comment> | <story><title>No Dislikes has officially ruined YouTube for me</title><text>Spoiler: rant.<p>I don&#x27;t know what happened exactly but I&#x27;m pretty sure it&#x27;s the lack of dislike stats, that now my suggestions and home page of youtube is filled, and I mean FILLEDDD!, with videos that have 4k stock clips, catchy title, but completely lacking in content. Misleading 100%. Not 1, not 2, but like 8&#x2F;10 videos are now garbage stock footage with bs commentary over nothing.<p>Example:<p>Nasa just discovered truth about solar system!?!?!?!<p>Science has progressed a lot in last 100 years....<p>So and so first discovered pluto in 1xxx<p>Mayans used to think balbala...<p>Some historians think....<p>Now scientist finally have answered....<p>New evidence (2014 research) shows there might be a planet ...<p>No explanation of study because you know it actually requires some comprehension...<p>Insert failed attempt at humor...<p>Leave a comment on your thoughts..<p>===========<p>Same script, like 8th grade essay you didn&#x27;t study for, but multiplied by 100x.<p>We knew it was gonna ruin youtube, people told youtube it was gonna ruin it, and now exactly that happened.
Click baity videos with nice stock footage that is barely relevant and half assed &#x27;answers&#x27;.</text></story><parent_chain><item><author>miniwark</author><text>I have done it a lot for around a year, but I think that the block list is very limited, because after a while the same &quot;blocked&quot; channels come back again in the list. I have now stopped using this fake feature. I am sure of it because I regularly block official news or music channels, but they appear again anyway after a while.<p>It&#x27;s the same for the &quot;not interested&quot; feature. I have stopped telling them, when asked, why I am not interested (mainly because I have already seen the video). The same already &quot;not interested&quot; videos, already viewed and already liked videos show up again after a while anyway...</text></item><item><author>FartyMcFarter</author><text>There&#x27;s a &quot;don&#x27;t recommend channel&quot; option if you click the 3 dots button on a video.<p>I use this liberally for all sorts of reasons and it makes my recommendations much better.</text></item><item><author>philliphaydon</author><text>You&#x27;ve probably ruined my recommendations forever now. :(<p>I pretty much only stick to subscriber channels now except for stuff people send me. Since there&#x27;s no dislikes I just don&#x27;t browse the stuff available anymore.<p>YouTube is definitely worse off without the dislike counter.</text></item><item><author>alexb_</author><text>I&#x27;ve recently heard about this YouTube account named Roel Van De Paar. If you&#x27;ve looked up any error message on YouTube recently, you&#x27;ve probably run into him. Because of the &quot;no dislikes&quot; change, he&#x27;s everywhere now.<p>And by everywhere, I mean EVERYWHERE. The account has over 2 million videos. Approximately 0.2% of ALL videos on the ENTIRE platform of YouTube can be attributed to this account, and if you check now they&#x27;ve probably uploaded a few videos in the past couple minutes.
The videos are generated crap, ripped from tech forums. Normally you wouldn&#x27;t see it anywhere due to dislikes being easy to spot, but now they pop up all the time in search results due to dislikes becoming a sort of &quot;hidden feature&quot;.<p>Hiding dislikes = less people press dislike (no feedback) = low quality videos are much harder to get rid of in search results. It&#x27;s really, really bad.</text></item></parent_chain><comment><author>smaryjerry</author><text>I am convinced the not interested button does absolutely nothing.</text></comment> |
27,235,950 | 27,235,966 | 1 | 3 | 27,235,531 | train | <story><title>France's 18-year-olds given €300 culture pass</title><url>https://www.bbc.com/news/world-europe-57198737</url></story><parent_chain><item><author>joshuaheard</author><text>I lived in France for several years, and I find it somewhat amusing how protective France is of their culture. They limit the number of foreign restaurant chains like McDonalds and Chipotle. They are protective of their language. The French dinner meal (&quot;repas&quot;) is a UNESCO World Heritage Cultural Landmark. This &quot;culture pass&quot; is consistent with the French people&#x27;s love of all things French.</text></item></parent_chain><comment><author>olivermarks</author><text>I think it is a good thing overall. I remember the furor when McDonalds opened on the Champs Elysee. Let&#x27;s be honest, identical globalist fast food plastic frontages ruin the uniqueness of places. Airports are the ultimate example, you can fly 1000&#x27;s of mile and get off a plane to an airport that is virtually identical to the one you left. This is culturally banal and slowly visually homogenizing the world. The result is insane amounts of tourists heading to the few unique places left in search of differentiation</text></comment> | <story><title>France's 18-year-olds given €300 culture pass</title><url>https://www.bbc.com/news/world-europe-57198737</url></story><parent_chain><item><author>joshuaheard</author><text>I lived in France for several years, and I find it somewhat amusing how protective France is of their culture. They limit the number of foreign restaurant chains like McDonalds and Chipotle. They are protective of their language. The French dinner meal (&quot;repas&quot;) is a UNESCO World Heritage Cultural Landmark. 
This &quot;culture pass&quot; is consistent with the French people&#x27;s love of all things French.</text></item></parent_chain><comment><author>guggle</author><text>All good things if you ask me.<p>Also, this measure is not really about French culture. It is about culture in general. If you visit an art museum, most of the time it&#x27;s filled with art from everywhere. The quai Branly museum comes to my mind here, but really most museums have exhibits of foreign works.</text></comment>
20,576,089 | 20,575,810 | 1 | 2 | 20,575,502 | train | <story><title>Graph database reinvented: Dgraph gets $11.5M to pursue unique, opinionated path</title><url>https://www.zdnet.com/article/you-can-go-your-own-graph-database-way-dgraph-secures-115m-to-pursue-its-opinionated-path/</url></story><parent_chain></parent_chain><comment><author>motohagiography</author><text>Congrats on funding round, we need more options in this space. I&#x27;m a Neo user today, and I&#x27;ve said before that while I don&#x27;t think most people will switch to graph databases, at some point I do think most new projects will use them. I chose Neo precisely because of py2neo.<p>The decision to adapt GraphQL instead of supporting Cypher, what would you say were the big trade offs?<p>As a startup, your price point is in the 20k&#x2F;y ballpark, but GrapheneDB&#x2F;Neo comes in a lot lower. What would I get for paying that much more?<p>I find that the audience for something that operates at the level of abstraction that a graph does tends to be different than the user who makes decisions based on lower-level features. e.g. design level problems vs. 
optimization problems.<p>Would you say more that you can get Mongo and Redis (+redisgraph) users to switch to Dgraph, or instead more that the main customers for graph databases are still in school right now and deciding what tech skills will underpin their careers?<p>Fascinating space and looking forward to trying the product.</text></comment> | <story><title>Graph database reinvented: Dgraph gets $11.5M to pursue unique, opinionated path</title><url>https://www.zdnet.com/article/you-can-go-your-own-graph-database-way-dgraph-secures-115m-to-pursue-its-opinionated-path/</url></story><parent_chain></parent_chain><comment><author>campoy</author><text>Hi there, I lead product at Dgraph and I&#x27;ll be happy to answer any questions you might have.<p>We&#x27;re very happy we&#x27;re finally able to share our fundraise, and this is just the beginning of many more features and improvements!<p>PS: we&#x27;re hiring ;) <a href="https:&#x2F;&#x2F;dgraph.io&#x2F;careers" rel="nofollow">https:&#x2F;&#x2F;dgraph.io&#x2F;careers</a></text></comment> |
14,498,070 | 14,498,069 | 1 | 3 | 14,497,237 | train | <story><title>How some people stay motivated at work when they don’t love their jobs</title><url>https:&#x2F;&#x2F;qz.com&#x2F;999209&#x2F;how-some-people-stay-motivated-and-energized-at-work-even-when-they-dont-love-their-jobs&#x2F;</url></story><parent_chain><item><author>creepydata</author><text>I&#x27;m absolutely dumbfounded this is some sort of revelation to the author. It must be a new thing? Perhaps a product of the &quot;self esteem&#x2F;entitled generation?&quot; Whenever I hear people (including job ads) talk about &quot;passion&quot; at work I have to roll my eyes. I <i>never</i> thought of work as a means to some sort of personal fulfillment, it&#x27;s a job, its purpose is to provide you money. That is it! If you get personal satisfaction out of your work that&#x27;s a nice bonus. Those &quot;passion&quot; people are going to become very dissatisfied with work and life - there&#x27;s very, very few jobs that have the ability to provide that. We should probably be looking at things in a more practical manner to avoid this.<p>I always thought jobs advertising that they are looking for someone with &quot;passion&quot; as a ploy to trick the bright eyed and naïve young into overworking before they become disillusioned.<p>I&#x27;ve actually enjoyed most of the &quot;menial&quot; jobs I&#x27;ve worked. The only unenjoyable job I worked was only unenjoyable due to piss poor management. Now I enjoy working as a software engineer, I like what I do, but it&#x27;s still just a job, just a way to earn a living.<p>I&#x27;m reminded of Office Space on career advice: &quot;[The question from a guidance counselor of what you would do if you had a million dollars] is bullshit to begin with.
If everyone listened to her there would be no janitors because nobody would clean up shit if they had a million dollars.&quot;</text></item></parent_chain><comment><author>test1235</author><text>The most fun job I ever had was at a shitty nightclub over summers in-between uni.<p>The pay was crap, and hours were 9pm to 5am, but I loved the people I worked with, and the physical&#x2F;manual labour was more satisfying than cranking out some requirements for an app no one cares about.<p>Back in my single days, I actually applied to do some warehouse work outside of my 9-5 programming day job &#x27;cos I found it so boring and unsatisfying. (Turns out you&#x27;re not allowed to work so many hours for health and safety reasons)<p>I don&#x27;t mind coding for a living - it&#x27;s easy: I sit at a desk and browse the internet for the most part, but it&#x27;s hardly the peak of job satisfaction. I think if I was mad rich, I wouldn&#x27;t mind doing something poorly paid but outdoorsy and invigorating.</text></comment> | <story><title>How some people stay motivated at work when they don’t love their jobs</title><url>https:&#x2F;&#x2F;qz.com&#x2F;999209&#x2F;how-some-people-stay-motivated-and-energized-at-work-even-when-they-dont-love-their-jobs&#x2F;</url></story><parent_chain><item><author>creepydata</author><text>I&#x27;m absolutely dumbfounded this is some sort of revelation to the author. It must be a new thing? Perhaps a product of the &quot;self esteem&#x2F;entitled generation?&quot; Whenever I hear people (including job ads) talk about &quot;passion&quot; at work I have to roll my eyes. I <i>never</i> thought of work as a means to some sort of personal fulfillment, it&#x27;s a job, its purpose is to provide you money. That is it! If you get personal satisfaction out of your work that&#x27;s a nice bonus. Those &quot;passion&quot; people are going to become very dissatisfied with work and life - there&#x27;s very, very few jobs that have the ability to provide that.
We should probably be looking at things in a more practical manner to avoid this.<p>I always thought jobs advertising that they are looking for someone with &quot;passion&quot; as a ploy to trick the bright eyed and naïve young into overworking before they become disillusioned.<p>I&#x27;ve actually enjoyed most of the &quot;menial&quot; jobs I&#x27;ve worked. The only unenjoyable job I worked was only unenjoyable due to piss poor management. Now I enjoy working as a software engineer, I like what I do, but it&#x27;s still just a job, just a way to earn a living.<p>I&#x27;m reminded of Office Space on career advice: &quot;[The question from a guidance counselor of what you would do if you had a million dollars] is bullshit to begin with. If everyone listened to her there would be no janitors because nobody would clean up shit if they had a million dollars.&quot;</text></item></parent_chain><comment><author>k-mcgrady</author><text>&gt;&gt; Those &quot;passion&quot; people are going to become very dissatisfied with work and life - there&#x27;s very, very few jobs that have the ability to provide that.<p>Maybe that&#x27;s a good thing. If there&#x27;s enough of them the entire concept will have to be reexamined (UBI for example or digital nomads, etc.). I think a lot more people are just starting to realise that owning a house and a car and having a few kids and a couple of ex-wives and a mountain of &#x27;stuff&#x27; isn&#x27;t necessary.<p>There are lots of jobs that will provide fulfilment. Most of them, of course, won&#x27;t pay anywhere near as well as your average office work - they pay enough though for someone who doesn&#x27;t want the things I mentioned above.<p>Edit: Claiming having lots of money and &#x27;stuff&#x27; isn&#x27;t important apparently gets you lots of down votes now...strange.</text></comment>
30,859,400 | 30,857,704 | 1 | 2 | 30,855,419 | train | <story><title>Your computer is a distributed system</title><url>http:&#x2F;&#x2F;catern.com&#x2F;compdist.html</url></story><parent_chain><item><author>wwalexander</author><text>Plan 9 was designed in this way, but never took off.<p>Rob Pike:<p>&gt; This is 2012 and we&#x27;re still stitching together little microcomputers with HTTPS and ssh and calling it revolutionary. I sorely miss the unified system view of the world we had at Bell Labs, and the way things are going that seems unlikely to come back any time soon.</text></item><item><author>throwaway787544</author><text>The thing we are missing still is the distributed OS. Kubernetes only exists because of the missing abstractions in Linux to be able to do computation, discovery, message passing&#x2F;IO, instrumentation over multiple nodes. If you could do <i>ps -A</i> and see all processes on all nodes, or run a program and have it automatically execute on a random node, or if (<i>grumble grumble</i>) Systemd unit files would schedule a minimum of X processes on N nodes, most of the K8s ecosystem would become redundant. A lot of other components like unified AuthZ for linux already exist, as well as networking (WireGuard anyone?).</text></item></parent_chain><comment><author>jasonwatkinspdx</author><text>I think Rob is right to call out the problem, but is being a bit rose-colored about Plan 9.<p>Plan 9 was definitely ahead of its time, but it&#x27;s also a far cry from the sort of distributed OS we need today. &quot;Everything is a remote posix file&quot; ends up being a really bad abstraction for distributed computing. What people are doing today with warehouse scale clusters indeed has a ton of layers of crap in there, and I think it&#x27;s obvious to yearn for sweeping that away.
But there&#x27;s no chance you could do that with P9 as it was designed.</text></comment> | <story><title>Your computer is a distributed system</title><url>http:&#x2F;&#x2F;catern.com&#x2F;compdist.html</url></story><parent_chain><item><author>wwalexander</author><text>Plan 9 was designed in this way, but never took off.<p>Rob Pike:<p>&gt; This is 2012 and we&#x27;re still stitching together little microcomputers with HTTPS and ssh and calling it revolutionary. I sorely miss the unified system view of the world we had at Bell Labs, and the way things are going that seems unlikely to come back any time soon.</text></item><item><author>throwaway787544</author><text>The thing we are missing still is the distributed OS. Kubernetes only exists because of the missing abstractions in Linux to be able to do computation, discovery, message passing&#x2F;IO, instrumentation over multiple nodes. If you could do <i>ps -A</i> and see all processes on all nodes, or run a program and have it automatically execute on a random node, or if (<i>grumble grumble</i>) Systemd unit files would schedule a minimum of X processes on N nodes, most of the K8s ecosystem would become redundant. A lot of other components like unified AuthZ for linux already exist, as well as networking (WireGuard anyone?).</text></item></parent_chain><comment><author>jlpom</author><text>This describes more a Single System Image [0] to me (WPD includes Plan 9 as one, but considering it does not support process migration I find it moot).
LinuxPMI [1] seems to be a good idea but it seems to be based on Linux 2.6, so you would have to heavily patch a newer kernel.
The only things that seem to support process migration with current software &#x2F; are still active are CRIU [2] (which doesn&#x27;t support graphical&#x2F;wayland programs) and DragonflyBSD [3] (in their own words, very basic).<p>[0]: <a href="https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Single_system_image" rel="nofollow">https:&#x2F;&#x2F;en.wikipedia.org&#x2F;wiki&#x2F;Single_system_image</a>
[1]: <a href="http:&#x2F;&#x2F;linuxpmi.org" rel="nofollow">http:&#x2F;&#x2F;linuxpmi.org</a>
[2]: criu.org
[3]: <a href="https:&#x2F;&#x2F;man.dragonflybsd.org&#x2F;?command=sys_checkpoint&amp;section=2" rel="nofollow">https:&#x2F;&#x2F;man.dragonflybsd.org&#x2F;?command=sys_checkpoint&amp;section...</a></text></comment> |
7,987,391 | 7,987,156 | 1 | 2 | 7,986,764 | train | <story><title>Stop The JerkTech</title><url>http:&#x2F;&#x2F;techcrunch.com&#x2F;2014&#x2F;07&#x2F;03&#x2F;go-disrupt-yourself&#x2F;</url></story><parent_chain><item><author>eevilspock</author><text><i>&quot;It is difficult to get a man to understand something when his salary depends upon his not understanding it.&quot;</i><p>Because nearly every internet business gets it&#x27;s revenue from advertising, a lot of good people jump through hoops to convince themselves that they have not made a deal with the devil. The downvotes[1] come fast when I point out this moral and cognitive dissonance, usually without any counter argument since they just want me to go away.<p>When I talk to anyone whose livelihood doesn&#x27;t rely on advertising or marketing, there is near unanimous agreement that advertising is fundamentally manipulative and dishonest.<p>Besides the social cost of Advertising, there are many other costs[2]. The idea that advertising gives us free web content and services is utter bullshit.<p>-<p>[1] The cowardly downvotes with no counter argument have already started. Upton Sinclair is the actual source of the quote at the top. A wise man who got a bunch of downvotes in his time.<p>[2] <a href="https://news.ycombinator.com/item?id=7767811" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=7767811</a></text></item><item><author>wpietri</author><text>I mentor at startup events, so I know exactly what kind of people they are.<p>The good ones are people who just haven&#x27;t thought it through. They&#x27;ve found some sort of demand, and they&#x27;re doing the standard hack-the-system thing that the startup culture trains them to do.
Once it&#x27;s rolling, it&#x27;s like Mencken said: &quot;It is difficult to get a man to understand something when his salary depends upon his not understanding it.&quot;<p>The bad ones are the sort of entitled douchebags that would have previously gone into investment banking or some other industry where the money&#x27;s great and ethics aren&#x27;t emphasized. But now startups are the high-status, high-money thing, so that&#x27;s what they&#x27;re after.</text></item><item><author>bunkat</author><text>Whenever I hear about these I always wonder what the people that found them are thinking. I mean, what type of person would you have to be to decide to try and screw over the restaurant industry (which already has historically low margins) just to make a buck? I can&#x27;t even imagine how many fake reservations they would need to create to make that idea worthwhile for them. And what investors would give them money to do so?</text></item></parent_chain><comment><author>jamesisaac</author><text>As someone who worked in an advertising-industry startup for 2 years before questioning what I was doing, I agree with everything you&#x27;ve said (in your linked post too), and sorry to see you&#x27;re getting downvoted.<p>The project I subsequently started operates on a very simple $5&#x2F;mo pausable subscription with a 30-day free trial. I&#x27;m _much_ happier with this model than anything I&#x27;ve ever worked on which is ad-based - it feels like a completely fair and fulfilling exchange of value.<p>I, too, feel like if the web was free from advertising, and instead was home to services supported by paying customers and&#x2F;or donations, it would be a much better place and have far-reaching benefits. 
This is something I genuinely want to make a difference on, so feel free to get in touch if you want to see if we can combine our efforts in some way (contact details in profile).</text></comment> | <story><title>Stop The JerkTech</title><url>http:&#x2F;&#x2F;techcrunch.com&#x2F;2014&#x2F;07&#x2F;03&#x2F;go-disrupt-yourself&#x2F;</url></story><parent_chain><item><author>eevilspock</author><text><i>&quot;It is difficult to get a man to understand something when his salary depends upon his not understanding it.&quot;</i><p>Because nearly every internet business gets it&#x27;s revenue from advertising, a lot of good people jump through hoops to convince themselves that they have not made a deal with the devil. The downvotes[1] come fast when I point out this moral and cognitive dissonance, usually without any counter argument since they just want me to go away.<p>When I talk to anyone whose livelihood doesn&#x27;t rely on advertising or marketing, there is near unanimous agreement that advertising is fundamentally manipulative and dishonest.<p>Besides the social cost of Advertising, there are many other costs[2]. The idea that advertising gives us free web content and services is utter bullshit.<p>-<p>[1] The cowardly downvotes with no counter argument have already started. Upton Sinclair is the actual source of the quote at the top. A wise man who got a bunch of downvotes in his time.<p>[2] <a href="https://news.ycombinator.com/item?id=7767811" rel="nofollow">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=7767811</a></text></item><item><author>wpietri</author><text>I mentor at startup events, so I know exactly what kind of people they are.<p>The good ones are people who just haven&#x27;t thought it through. They&#x27;ve found some sort of demand, and they&#x27;re doing the standard hack-the-system thing that the startup culture trains them to do.
Once it&#x27;s rolling, it&#x27;s like Mencken said: &quot;It is difficult to get a man to understand something when his salary depends upon his not understanding it.&quot;<p>The bad ones are the sort of entitled douchebags that would have previously gone into investment banking or some other industry where the money&#x27;s great and ethics aren&#x27;t emphasized. But now startups are the high-status, high-money thing, so that&#x27;s what they&#x27;re after.</text></item><item><author>bunkat</author><text>Whenever I hear about these I always wonder what the people that found them are thinking. I mean, what type of person would you have to be to decide to try and screw over the restaurant industry (which already has historically low margins) just to make a buck? I can&#x27;t even imagine how many fake reservations they would need to create to make that idea worthwhile for them. And what investors would give them money to do so?</text></item></parent_chain><comment><author>profquail</author><text>&gt; &quot;Because nearly every internet business gets it&#x27;s revenue from advertising&quot;<p>Citation please? There are many &quot;internet businesses&quot; that don&#x27;t derive their revenue from advertising, e.g., AWS, Dropbox, Netflix. Spotify derives some revenue from advertising, but their core business is subscriptions. <i>ad nauseum</i></text></comment> |
11,028,129 | 11,028,266 | 1 | 2 | 11,027,203 | train | <story><title>State law: You can't say bad things about a business</title><url>http://www.nytimes.com/2016/02/01/opinion/no-more-exposes-in-north-carolina.html?_r=0</url></story><parent_chain><item><author>Someone1234</author><text>I definitely agree that this law violates the First Amendment and therefore suspect if a case came to the SCOTUS they would rule against it (assuming it even needed to get that high).<p>But my question is this: From my understanding, the First Amendment establishes limits on what government can and cannot do, it doesn&#x27;t establish what a private entity can and cannot do. In this case the law allows a private business to sue a private individual, and while the law itself is established by government, the use of that law is not.<p>So maybe someone with more legal expertise than myself can answer: Is it possible to bring this issue before SCOTUS if it is only used by non-government? Have they effectively found a loophole by keeping government out of the actual lawsuits? Can you sue a law itself (would you have standing)?</text></item><item><author>otterley</author><text>I imagine these laws eventually will be overturned by the Supreme Court as contrary to the First Amendment. It involves punishment of core political speech, so it should be a relatively open-and-shut case.<p>The strongest argument in favor of the Constitutionality of these laws is that they are outlawing speech that could not have been made but for the violation of another law (i.e. common trespass and&#x2F;or contract breach), but I&#x27;d be very surprised if the laws survived on those grounds. Newsgathering is often performed by trespassing, even newsgathering that&#x27;s been previously upheld by the Court (e.g. 
Pentagon Papers, in which the author had unlawful access to classified materials).</text></item></parent_chain><comment><author>matthewmcg</author><text>The First Amendment is still implicated even in private disputes when one party requests that a court take some action that restricts the speech of the other party. The court can&#x27;t take the requested action (e.g. an injunction stopping publication) if it violates the first amendment.<p>This is a bit of a simplification but the name in legal circles for the concept you are talking about is the &quot;state action&quot; requirement.</text></comment> | <story><title>State law: You can't say bad things about a business</title><url>http://www.nytimes.com/2016/02/01/opinion/no-more-exposes-in-north-carolina.html?_r=0</url></story><parent_chain><item><author>Someone1234</author><text>I definitely agree that this law violates the First Amendment and therefore suspect if a case came to the SCOTUS they would rule against it (assuming it even needed to get that high).<p>But my question is this: From my understanding, the First Amendment establishes limits on what government can and cannot do, it doesn&#x27;t establish what a private entity can and cannot do. In this case the law allows a private business to sue a private individual, and while the law itself is established by government, the use of that law is not.<p>So maybe someone with more legal expertise than myself can answer: Is it possible to bring this issue before SCOTUS if it is only used by non-government? Have they effectively found a loophole by keeping government out of the actual lawsuits? Can you sue a law itself (would you have standing)?</text></item><item><author>otterley</author><text>I imagine these laws eventually will be overturned by the Supreme Court as contrary to the First Amendment. 
It involves punishment of core political speech, so it should be a relatively open-and-shut case.<p>The strongest argument in favor of the Constitutionality of these laws is that they are outlawing speech that could not have been made but for the violation of another law (i.e. common trespass and&#x2F;or contract breach), but I&#x27;d be very surprised if the laws survived on those grounds. Newsgathering is often performed by trespassing, even newsgathering that&#x27;s been previously upheld by the Court (e.g. Pentagon Papers, in which the author had unlawful access to classified materials).</text></item></parent_chain><comment><author>cmurf</author><text>It&#x27;s the worst example of government where businesses have more of a voice than people, and use their political influence to compel the government to basically act like a gang or the mafia to shut up dissenters.<p>It&#x27;s really a tiny handful of jerks who say, &quot;yes, it&#x27;s a good thing that we bash little baby pigs heads in, by grabbing them by the hind legs, and throwing them head first into the ground, when we determine they are non-viable. But we desperately don&#x27;t want the public to know that&#x27;s how we get rid of non-viable pigs, because we know the public actually has something of a moral conscience, unlike us, and would stop eating our pigs if they knew this is how it worked. And that&#x27;s just the beginning.&quot;<p>So the question is whether it&#x27;s the private business that&#x27;s abridging speech, or if it&#x27;s the government. If this an extension of defamation law, then it&#x27;s the former. If it&#x27;s asking the government to do their own dirty work, then it&#x27;s the latter. And I think it&#x27;s the latter.</text></comment> |
15,508,262 | 15,507,586 | 1 | 2 | 15,506,703 | train | <story><title>How the Frightful Five Put Startups in a Lose-Lose Situation</title><url>https://www.nytimes.com/2017/10/18/technology/frightful-five-start-ups.html</url></story><parent_chain><item><author>pyrale</author><text>People seem to think that Microsoft is a company that at some point sucked at innovation. In fact, the reason Microsoft let a generation of startups blossom was mostly because they got hit by antitrust.<p>Google, FB and Amazon are probably ripe for antitrust, too, but the government won&#x27;t move.</text></item></parent_chain><comment><author>TheRealDunkirk</author><text>Because the rest of them watched and learned from Microsoft&#x27;s mistake. When the antitrust campaign started, Microsoft was giving a paltry $10K to political campaigns. By the time it ended, they were giving $1M a year to BOTH sides, and wound up with a hand-slap. The rest have cozied up to the federal government since the start, and have no fear of antitrust efforts. What they really need now is to be categorized as public services so that they can enjoy some of that sweet, sweet regulatory capture that Verizon, AT&amp;T, and Comcast benefit from.</text></comment> | <story><title>How the Frightful Five Put Startups in a Lose-Lose Situation</title><url>https://www.nytimes.com/2017/10/18/technology/frightful-five-start-ups.html</url></story><parent_chain><item><author>pyrale</author><text>People seem to think that Microsoft is a company that at some point sucked at innovation. 
In fact, the reason Microsoft let a generation of startups blossom was mostly because they got hit by antitrust.<p>Google, FB and Amazon are probably ripe for antitrust, too, but the government won&#x27;t move.</text></item></parent_chain><comment><author>empath75</author><text>It&#x27;s hard to go after the FAANG companies for anti-trust when they all compete with each other in so many ways -- facebook and google have multiple competing services, apple, google and amazon all build consumer hardware, and facebook is dipping it&#x27;s toes in, they all have music services.. Netflix has tons of competitors outside of the big five, but amazon and google both have streaming movie services. Amazon and Google both have cloud services.<p>Really the only two that have a genuine monopoly are Google in search and Facebook in social networks, and I think perhaps Amazon in online retail.</text></comment> |
29,356,111 | 29,355,374 | 1 | 3 | 29,353,076 | train | <story><title>The internet is held together with spit and baling wire</title><url>https://krebsonsecurity.com/2021/11/the-internet-is-held-together-with-spit-baling-wire/</url></story><parent_chain><item><author>toast0</author><text>Back in the 90s when I got on, so much of traffic was exchanged at MAE-West or MAE-East, and a backhoe in Iowa could make nearly all the cross-US traffic go through Europe and Asia instead.<p>These days, there are lively public internet exchanges up and down both coasts, in texas and chicago and elsewhere. A well placed backhoe can still make a big mess, many &#x27;redundant fibers&#x27; are in the same strand, and last mile is fragile, but if my ISP network is up to the local internet exchange, there are many distinct routes to the other coast and a fiber cut is unlikely to route my traffic around the world.</text></item><item><author>fredophile</author><text>&quot;The internet routes around failure&quot; hasn&#x27;t been true for a long time. It refers to the original topography which has been replaced with a hub and spoke model. Remove a few hubs and you have disabled a large portion of the internet.</text></item><item><author>h2odragon</author><text>It&#x27;s Anti-Fragile. If it breaks all the time, everybody is highly experienced at patching together new workarounds, mechanisms for fail over are in place and regularly tested, and there&#x27;s whole classes of corner case bugs that get flushed out to be stomped (or nurtured as cherished pets) instead of breeding in the dark and jumping out at you all at once.<p>How can the &quot;Internet routes around failure&quot; be trusted without testing? 
Everything needs regular exercise or it atrophies.</text></item></parent_chain><comment><author>fragmede</author><text>I wonder if I can nerdsnipe anyone into figuring out the fewest BGP hijacks it would take to force traffic on the Internet from the West Coast of the US to take the long way around to the East Coast of the US. Never mind the latency, that pipe isn&#x27;t big enough for all of that traffic, so it&#x27;ll effectively be a netsplit.<p>I bet it&#x27;s a lower number than anyone&#x27;s actually comfortable with, though it would be rather difficult to pull of.</text></comment> | <story><title>The internet is held together with spit and baling wire</title><url>https://krebsonsecurity.com/2021/11/the-internet-is-held-together-with-spit-baling-wire/</url></story><parent_chain><item><author>toast0</author><text>Back in the 90s when I got on, so much of traffic was exchanged at MAE-West or MAE-East, and a backhoe in Iowa could make nearly all the cross-US traffic go through Europe and Asia instead.<p>These days, there are lively public internet exchanges up and down both coasts, in texas and chicago and elsewhere. A well placed backhoe can still make a big mess, many &#x27;redundant fibers&#x27; are in the same strand, and last mile is fragile, but if my ISP network is up to the local internet exchange, there are many distinct routes to the other coast and a fiber cut is unlikely to route my traffic around the world.</text></item><item><author>fredophile</author><text>&quot;The internet routes around failure&quot; hasn&#x27;t been true for a long time. It refers to the original topography which has been replaced with a hub and spoke model. Remove a few hubs and you have disabled a large portion of the internet.</text></item><item><author>h2odragon</author><text>It&#x27;s Anti-Fragile. 
If it breaks all the time, everybody is highly experienced at patching together new workarounds, mechanisms for fail over are in place and regularly tested, and there&#x27;s whole classes of corner case bugs that get flushed out to be stomped (or nurtured as cherished pets) instead of breeding in the dark and jumping out at you all at once.<p>How can the &quot;Internet routes around failure&quot; be trusted without testing? Everything needs regular exercise or it atrophies.</text></item></parent_chain><comment><author>xattt</author><text>This happened with regular phone service as recent as 2017 in Canada (1).<p>(1) <a href="https:&#x2F;&#x2F;atlantic.ctvnews.ca&#x2F;mobile&#x2F;many-atlantic-canadians-lose-cellphone-internet-service-in-widespread-outage-1.3533539" rel="nofollow">https:&#x2F;&#x2F;atlantic.ctvnews.ca&#x2F;mobile&#x2F;many-atlantic-canadians-l...</a></text></comment> |
16,227,274 | 16,227,241 | 1 | 2 | 16,226,495 | train | <story><title>Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence</title><url>https://www.theverge.com/2018/1/24/16929148/fake-celebrity-porn-ai-deepfake-face-swapping-artificial-intelligence-reddit</url></story><parent_chain><item><author>bduerst</author><text>What&#x27;s to keep you from adding the key to a camera after falsely generating the media? Or using the key of a known camera to generate false footage?</text></item><item><author>BoiledCabbage</author><text>&gt; What do you mean &quot;authentication technology&quot;?<p>Cryptographic signatures. Ie every frame in a video is signed with a 512-bit key that states authoritatively what camera was the source of the video and when it was taken. In order to change any pixels in the video you&#x27;d break this key and need to resign it. An attacker would be unable to do this unless they had physical access to the original camera.<p>But it&#x27;ll be at least 3 decades before this technology is commercialized, people see the demand for it, and the majority of all cameras in the world are replaced by it.
Even if this new tech is on the market in a decade (simple tech, no demand &#x2F; ecosystem yet), but 90% of existing &#x2F; installed cameras don&#x27;t have the feature then fake videos still get created with them. Only once ~80% of videos are authenticated, and a significant portion of the remaining 20% are fakes will people be able to dismiss non authenticated video. Up until then it&#x27;s fakes non-stop.<p>We&#x27;ve got some ugly decades coming up.</text></item><item><author>unethical_ban</author><text>What do you mean &quot;authentication technology&quot;? Tamper detection? The ability to see that a tool was used? It may slow things down, but this is an arms race.</text></item><item><author>Chaebixi</author><text>I wouldn&#x27;t be that pessimistic just yet. It&#x27;s yet possible that new advances in authentication technology might counteract some of these trends.</text></item><item><author>hirundo</author><text>Technology is degrading the value of photo and video evidence (and probably audio too) asymptotically toward that of famously unreliable testimony from memory. Criminality becomes less risky and&#x2F;or innocence becomes less protective. Law becomes less effective. A bad result, to the extent that the law isn&#x27;t an ass.<p>On the plus side artistic tools that help materialize internal life become more effective. We can interact with our dreams and fantasies more readily, to potentially therapeutic benefit.<p>It&#x27;s hard to say whether this trend holds more danger or promise.</text></item></parent_chain><comment><author>BoiledCabbage</author><text>So spitballing, my assumption is that it&#x27;d end up looking something like SSL certificate chains today. In the situation you mentioned:<p>1. The attacker wouldn&#x27;t have access to the original cameras private key.
2. The attacker creates a fake video, creates a private key and signs the video with the key
3. The attacker tries to install the key into a camera<p>Step 3 has to be made impossible. Meaning that a camera becomes a trusted entity and only allowed parties (ex the camera manufacturer) has the authority to insert a private key into a camera. This would be that after installing a camera with a new private key, the key would then need to be signed with the manufacturer&#x27;s private key and also stored in the camera. This shows the manufacturer is responsible for the contents of the video. If someone tries to change the camera&#x27;s private key it would no longer match what the manufacturer signed.<p>Which yes then means we need to have authorized camera manufactures and a process for certificate revocation and all of that.<p>The exact opposite of &quot;free and open&quot; for recording devices - and the only way we aren&#x27;t flooded with fake videos flooding and seriously impacting society. We have to want this if we want to still have a concept of video evidence in either the justice or social spheres.</text></comment> | <story><title>Fake celebrity porn is blowing up on Reddit, thanks to artificial intelligence</title><url>https://www.theverge.com/2018/1/24/16929148/fake-celebrity-porn-ai-deepfake-face-swapping-artificial-intelligence-reddit</url></story><parent_chain><item><author>bduerst</author><text>What&#x27;s to keep you from adding the key to a camera after falsely generating the media? Or using the key of a known camera to generate false footage?</text></item><item><author>BoiledCabbage</author><text>&gt; What do you mean &quot;authentication technology&quot;?<p>Cryptographic signatures. Ie every frame in a video is signed with a 512-bit key that states authoritatively what camera was the source of the video and when it was taken. In order to change any pixels in the video you&#x27;d break this key and need to resign it. 
An attacker would be unable to do this unless they had physical access to the original camera.<p>But it&#x27;ll be at least 3 decades before this technology is commercialized, people see the demand for it, and the majority of all cameras in the world are replaced by it.
Even if this new tech is on the market in a decade (simple tech, no demand &#x2F; ecosystem yet), but 90% of existing &#x2F; installed cameras don&#x27;t have the feature then fake videos still get created with them. Only once ~80% of videos are authenticated, and a significant portion of the remaining 20% are fakes will people be able to dismiss non authenticated video. Up until then it&#x27;s fakes non-stop.<p>We&#x27;ve got some ugly decades coming up.</text></item><item><author>unethical_ban</author><text>What do you mean &quot;authentication technology&quot;? Tamper detection? The ability to see that a tool was used? It may slow things down, but this is an arms race.</text></item><item><author>Chaebixi</author><text>I wouldn&#x27;t be that pessimistic just yet. It&#x27;s yet possible that new advances in authentication technology might counteract some of these trends.</text></item><item><author>hirundo</author><text>Technology is degrading the value of photo and video evidence (and probably audio too) asymptotically toward that of famously unreliable testimony from memory. Criminality becomes less risky and&#x2F;or innocence becomes less protective. Law becomes less effective. A bad result, to the extent that the law isn&#x27;t an ass.<p>On the plus side artistic tools that help materialize internal life become more effective. We can interact with our dreams and fantasies more readily, to potentially therapeutic benefit.<p>It&#x27;s hard to say whether this trend holds more danger or promise.</text></item></parent_chain><comment><author>andrewstuart2</author><text>Require that videos submitted as evidence use keypairs which were provably registered or published prior to the event.<p>Don&#x27;t have a registered key? 
Great, but that&#x27;s no longer admissible.<p>Still, problems would exist with letting the camera read the private key for signing, but not allowing an attacker (forger) to access the private key and thus sign their modified footage.<p>Just a thought.</text></comment> |
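For what it&#x27;s worth, the chain-of-trust idea sketched in this thread can be shown in a few lines of Python. This is only a toy: HMAC with shared secrets stands in for real asymmetric signatures (which is exactly the key-extraction problem raised above — a real design would keep an Ed25519-style private key in a secure element and publish only the public half), and every name and value here is made up for illustration.

```python
import hashlib
import hmac
import os

# HMAC-SHA256 stands in for a real signature scheme. Note the toy's
# limitation: the verifier must know the secret, which is why actual
# systems use asymmetric keys instead.
def sign(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(key: bytes, data: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, data), sig)

# 1. Manufacturer provisions the camera and signs its key, so a forger
#    cannot install an arbitrary key into a "trusted" device.
manufacturer_key = os.urandom(32)
camera_key = os.urandom(32)
camera_cert = sign(manufacturer_key, camera_key)

# 2. The camera signs each frame as it records.
frame = b"frame-0001-pixels"
frame_sig = sign(camera_key, frame)

# 3. A verifier checks both links of the chain.
assert verify(manufacturer_key, camera_key, camera_cert)  # device is trusted
assert verify(camera_key, frame, frame_sig)               # frame untampered

# Changing any pixel breaks verification.
assert not verify(camera_key, b"frame-0001-tampered", frame_sig)
```

The same structure extends to certificate revocation and timestamping, but those are the hard operational parts the thread is debating, not something a sketch settles.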
15,107,668 | 15,107,553 | 1 | 3 | 15,107,145 | train
<story><title>Bitcoin Energy Consumption Index</title><url>https://digiconomist.net/bitcoin-energy-consumption</url></story><parent_chain><item><author>amelius</author><text>&gt; Electricity consumed per transaction: 165 kWh<p>That&#x27;s about $20 in the US.<p>If a transaction is <i>that</i> expensive, how can this system even work for transactions with a value of that order? Are large transactions &quot;sponsoring&quot; small transactions?<p>What happens if more people use the system for small transactions? Will BTC become unusable?</text></item></parent_chain><comment><author>rothbardrand</author><text>Yeah, that&#x27;s inaccurate. The major portion of that energy expenditure is keeping the bitcoins that aren&#x27;t moving secure.<p>Put another way, this is saying it is estimated to take about $800M to keep $40B of bitcoin secure.<p>This is probably a bit expensive in terms of comparing to fiat currencies (But then we are probably not covering all the real costs of those currencies)... but bitcoin is still in the technology adoption life-cycle and has not yet become a currency.<p>When the inflation of BTC has declined and it has become fully adopted (assuming this happens) then the numbers will likely start to make a lot more sense.<p>Also, the vast majority of transactions in about 2 years will likely be happening off chain using lightning network and the like.<p>Those transaction fees will happily be covered by merchants who are currently paying 2%-5% per transaction to Visa et al.
and will be more than happy to pay 1% or less to Bitcoin to manage a lightning channel.<p>Really, it will be companies disrupting Visa building payment networks out of lightning channels and they will have to keep fees really low because LN is open source and anyone can compete with them.</text></comment>
<story><title>Bitcoin Energy Consumption Index</title><url>https://digiconomist.net/bitcoin-energy-consumption</url></story><parent_chain><item><author>amelius</author><text>&gt; Electricity consumed per transaction: 165 kWh<p>That&#x27;s about $20 in the US.<p>If a transaction is <i>that</i> expensive, how can this system even work for transactions with a value of that order? Are large transactions &quot;sponsoring&quot; small transactions?<p>What happens if more people use the system for small transactions? Will BTC become unusable?</text></item></parent_chain><comment><author>amelius</author><text>So, if you look just at transaction cost the system clearly cannot sustain itself. But as more people buy into BTC, the system keeps from collapsing. If you ask me, that&#x27;s starting to look like a Ponzi scheme in disguise ...</text></comment>
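The back-of-envelope figures in this thread can be sanity-checked in a few lines. The ~$0.12/kWh US retail electricity price is an assumption — it is roughly what the quoted "$20" figure implies:

```python
# Figures quoted upthread.
ENERGY_PER_TX_KWH = 165        # the index's per-transaction estimate
PRICE_PER_KWH_USD = 0.12       # assumed US retail price implied by "$20"

cost_per_tx = ENERGY_PER_TX_KWH * PRICE_PER_KWH_USD
print(f"cost per transaction: ~${cost_per_tx:.2f}")  # ~$19.80, i.e. "about $20"

# The reply's framing: ~$800M/year of mining spend securing ~$40B of coins.
security_ratio = 800e6 / 40e9
print(f"security spend: {security_ratio:.1%} of secured value per year")  # 2.0%
```

Whether 2% of secured value per year is cheap or expensive is exactly the disagreement between the two comments above.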
19,490,594 | 19,490,372 | 1 | 2 | 19,488,642 | train
<story><title>Handsontable drops open source for a non-commercial license</title><url>https://github.com/handsontable/handsontable/issues/5831</url></story><parent_chain><item><author>jasonkester</author><text>Great to hear.<p>It&#x27;s painful to watch so many good developers giving away their best work for free, because they feel it&#x27;s somehow their duty to do so. It&#x27;s the same feeling I get watching fresh graduates working 100 hour weeks for these startups because there&#x27;s free food and everybody else is doing it. You want to help, but the culture is just so well geared towards attracting new kids to exploit and convincing them that the exploitation is a good thing.<p>It&#x27;s like a cult. Except that nearly every software developer in the world is in on it.<p>I&#x27;m certainly not helping the situation personally, and will happily use whatever Open Source software product helps my business for free. But I do hope that people come to their senses at some point and stop peer-pressuring each other into continuing to spend so much effort polishing good software only to give it away for free.<p>The things you build have value. They&#x27;re making other people billions of dollars. Charge accordingly.</text></item></parent_chain><comment><author>eitland</author><text>Well:<p>- as I mentioned before, the fact that they were actual open source compared to some of their competitors heavily influenced the decision of the team I was on at that time to go with their commercial package.
That advantage is gone now.<p>- I have spent a few hours voluntarily helping in debugging and following up a couple of issues because it was open source and I was helping us and everyone else.<p>- They might have gotten off the hook a bit easier a couple of times (bugs, half a year of delays) than if they&#x27;d been commercial.<p>I don&#x27;t know, I thought many people here would do the same but I don&#x27;t know so I cannot say how much goodwill they are losing.</text></comment>
Whereas a month ago <a href="https:&#x2F;&#x2F;github.com&#x2F;handsontable&#x2F;handsontable&#x2F;" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;handsontable&#x2F;handsontable&#x2F;</a> had open source code, now it is all proprietary. That&#x27;s not something to celebrate.<p>As pointed out in the github discussion, Elastic did the same thing with Elasticsearch, and that contributed to AWS forking the project: <a href="https:&#x2F;&#x2F;aws.amazon.com&#x2F;blogs&#x2F;opensource&#x2F;keeping-open-source-open-open-distro-for-elasticsearch&#x2F;" rel="nofollow">https:&#x2F;&#x2F;aws.amazon.com&#x2F;blogs&#x2F;opensource&#x2F;keeping-open-source-...</a><p>The same exact issue applies here:<p>&gt; The maintainers of open source projects have the responsibility of keeping the source distribution open to everyone and <i>not changing the rules midstream</i><p>&gt; we believe that maintainers of an open source project have a responsibility to ensure that <i>the primary open source distribution remains open and free of proprietary code</i><p>Those comments aren&#x27;t an indictment of the open core business model or of attempts to monetize open source, but rather a criticism of &quot;muddying the waters&quot; and trying to create licensing confusion.</text></comment>
6,645,121 | 6,644,777 | 1 | 3 | 6,642,893 | train | <story><title>Why Meteor will kill Ruby on Rails</title><url>http://differential.io/blog/meteor-killin-rails</url></story><parent_chain><item><author>joshowens</author><text>Sure:<p><a href="http://www.shinglecentral.com/" rel="nofollow">http:&#x2F;&#x2F;www.shinglecentral.com&#x2F;</a>
<a href="http://assistant.io/" rel="nofollow">http:&#x2F;&#x2F;assistant.io&#x2F;</a>
<a href="http://lister.io/" rel="nofollow">http:&#x2F;&#x2F;lister.io&#x2F;</a><p>Are just a few from the last 1.5 months.</text></item><item><author>jarsbe</author><text>I&#x27;d love to see some examples of these apps, are you able to please share?</text></item><item><author>joshowens</author><text>This isn&#x27;t just fashion. We launch client apps in 6 weeks, time isn&#x27;t negligible for us.</text></item><item><author>bowlofpetunias</author><text>Why do people put so much effort in comparing tool A to tool B when either of those tools only cover 5% of all the work that goes into any serious application, and the time saved by any advantage tool A has over tool B is pretty much negligible?<p>I mean cool, so Meteor is maybe better for prototyping. Because that&#x27;s all we&#x27;re talking about here, prototypes and ultra-simple websites.<p>It&#x27;s always the same story, a shiny new tools that make the first weeks a little smoother, and after that it&#x27;s business as usually for entire life cycle of the application.<p>Except of course you now have to deal with a tool that still has years to go before it&#x27;s really mature and stable, and any advantage you gained in the first few weeks is completely lost.<p>This has nothing to do with software development, this is just about fashion.</text></item></parent_chain><comment><author>hippee-lee</author><text>Can you point to some example that are a bit more complex in nature? My issue (that I identified with the backbone tutorials and re affirmed with the Angular tutorials) is that simple apps are easy to do with x, y or z. But if the new one or that one is going to replace the old one there should be 37 signals level complexity type apps out there. 
While I can&#x27;t speak for Meteor, I don&#x27;t see anything running on angular that&#x27;s at that level.<p>Caveats: I glossed over the fact that angular is not a full stack, I have seen a few angular videos where teams have demoed and presented on complex angular apps.</text></comment> | <story><title>Why Meteor will kill Ruby on Rails</title><url>http://differential.io/blog/meteor-killin-rails</url></story><parent_chain><item><author>joshowens</author><text>Sure:<p><a href="http://www.shinglecentral.com/" rel="nofollow">http:&#x2F;&#x2F;www.shinglecentral.com&#x2F;</a>
<a href="http://assistant.io/" rel="nofollow">http:&#x2F;&#x2F;assistant.io&#x2F;</a>
<a href="http://lister.io/" rel="nofollow">http:&#x2F;&#x2F;lister.io&#x2F;</a><p>Are just a few from the last 1.5 months.</text></item><item><author>jarsbe</author><text>I&#x27;d love to see some examples of these apps, are you able to please share?</text></item><item><author>joshowens</author><text>This isn&#x27;t just fashion. We launch client apps in 6 weeks, time isn&#x27;t negligible for us.</text></item><item><author>bowlofpetunias</author><text>Why do people put so much effort in comparing tool A to tool B when either of those tools only cover 5% of all the work that goes into any serious application, and the time saved by any advantage tool A has over tool B is pretty much negligible?<p>I mean cool, so Meteor is maybe better for prototyping. Because that&#x27;s all we&#x27;re talking about here, prototypes and ultra-simple websites.<p>It&#x27;s always the same story, a shiny new tools that make the first weeks a little smoother, and after that it&#x27;s business as usually for entire life cycle of the application.<p>Except of course you now have to deal with a tool that still has years to go before it&#x27;s really mature and stable, and any advantage you gained in the first few weeks is completely lost.<p>This has nothing to do with software development, this is just about fashion.</text></item></parent_chain><comment><author>general_failure</author><text>I fail to see how meteor helps build these sites. These can be built by normal CMS can&#x27;t they?</text></comment> |
40,330,415 | 40,328,713 | 1 | 3 | 40,317,629 | train | <story><title>Why you can hear the temperature of water</title><url>https://www.nytimes.com/2024/05/09/science/hot-water-sound-cold.html</url></story><parent_chain><item><author>axxl</author><text>My thought on the general &quot;loudness&quot; of cold months was due to reduced noise blocking or absorbing greenery like tree leaves, grass, etc. Which is then altered by a significant snowfall leading to sounds being softened again.</text></item><item><author>Tanoc</author><text>This is something I noticed as a kid. We had a creek in our backyard, which,depending on the temperature the water would be louder or quieter. This annoyed our dog, which after a number of times caused me to notice. Running water is louder and much more sharp when it&#x27;s cold out, and quieter and muffled when it&#x27;s hot out. In the same way sounds are louder in a colder environment because there&#x27;s already a low level of ambient energy contained in the air and so the energy disperses much more readily but dissipates much more quickly. Essentially a difference between a quick &quot;crack&quot; and a lingering &quot;whump&quot; in terms of auditory impact. This effect also propagates to solid materials, as cold metals and ceramics transmit sound better than warm ceramics or metals. A church bell quite literally is louder on a cold winter&#x27;s day.</text></item></parent_chain><comment><author>soulofmischief</author><text>No one here has mentioned temperature inversion [0] which is responsible for a lot of the cold-induced amplification perceived in urban areas. 
It&#x27;s quite a fascinating effect.<p>[0] <a href="https:&#x2F;&#x2F;wisconsindot.gov&#x2F;Documents&#x2F;doing-bus&#x2F;eng-consultants&#x2F;cnslt-rsrces&#x2F;environment&#x2F;trafficnoiseweather.pdf" rel="nofollow">https:&#x2F;&#x2F;wisconsindot.gov&#x2F;Documents&#x2F;doing-bus&#x2F;eng-consultants...</a></text></comment> | <story><title>Why you can hear the temperature of water</title><url>https://www.nytimes.com/2024/05/09/science/hot-water-sound-cold.html</url></story><parent_chain><item><author>axxl</author><text>My thought on the general &quot;loudness&quot; of cold months was due to reduced noise blocking or absorbing greenery like tree leaves, grass, etc. Which is then altered by a significant snowfall leading to sounds being softened again.</text></item><item><author>Tanoc</author><text>This is something I noticed as a kid. We had a creek in our backyard, which,depending on the temperature the water would be louder or quieter. This annoyed our dog, which after a number of times caused me to notice. Running water is louder and much more sharp when it&#x27;s cold out, and quieter and muffled when it&#x27;s hot out. In the same way sounds are louder in a colder environment because there&#x27;s already a low level of ambient energy contained in the air and so the energy disperses much more readily but dissipates much more quickly. Essentially a difference between a quick &quot;crack&quot; and a lingering &quot;whump&quot; in terms of auditory impact. This effect also propagates to solid materials, as cold metals and ceramics transmit sound better than warm ceramics or metals. A church bell quite literally is louder on a cold winter&#x27;s day.</text></item></parent_chain><comment><author>xattt</author><text>The volume of water flow should&#x2F;would also vary at different times of the year, based on how much the water table is loaded.<p>Not sure how much of this would be complimentary to the acoustic effect of temperature. 
Either way, it’s not a simple single-solution explanation.</text></comment> |
3,502,685 | 3,502,198 | 1 | 2 | 3,501,980 | train | <story><title>Mozilla releases version 0.1 of the Rust programming language</title><url>http://mail.mozilla.org/pipermail/rust-dev/2012-January/001256.html</url></story><parent_chain></parent_chain><comment><author>haberman</author><text>As a die-hard C guy, Rust is the first "new systems programming language" since Cyclone and D that I didn't immediately dislike. A lot of really interesting ideas in here. I'd love to know what Mozilla uses this for internally.<p>That said, it's hard to imagine anything displacing C for me. Almost any systems code I write these days is something I'll eventually want to be able to expose to a high-level language (Lua, Python, Ruby, etc). To do that you need code that can integrate into another language's runtime, no matter what memory management or concurrency model it uses. When you're trying to do that, having lots of rich concepts and semantics in your systems language (closures, concurrency primitives, garbage collection) just gets in the way, because you have to find out how to map between the two. C's lack of features is actually a strength in this regard.<p>I do really like Rust's expressiveness though. 
The pattern matching is particularly nice, and that is an example of a feature that wouldn&#x27;t &quot;get in the way&quot; if you&#x27;re trying to bind to higher-level languages.</text></comment>