| text (string, lengths 44 to 776k) | meta (dict) |
|---|---|
New mobile version of NYTimes is awesome - uladzislau
http://mobile.nytimes.com
======
adamjernst
Agreed—mostly because it's SIMPLE!
| {
"pile_set_name": "HackerNews"
} |
The bizarre fish that evolved for oceans, but lives on land - narad
http://www.dnaindia.com/scitech/report_the-bizarre-fish-that-evolved-for-oceans-but-lives-on-land_1582177
======
zlapper
More about this fish (inc. picture)
<http://www.fishbase.org/summary/SpeciesSummary.php?id=56805>
| {
"pile_set_name": "HackerNews"
} |
2B phones cannot use Google and Apple contact-tracing tech - caution
https://arstechnica.com/tech-policy/2020/04/2-billion-phones-cannot-use-google-and-apple-contract-tracing-tech/
======
haspoken
And then there are those who do not have a phone....
| {
"pile_set_name": "HackerNews"
} |
Ask HN: Can someone translate Alexa into unique monthly visitors? - earbits
I found a scatter graph that correlated Alexa ranking to unique visitors but it was from 2002. Anyone know how to write and where to post a script where site owners can input this information and build a more current graph to reference?
======
proee
Find a site that shows their stats to the public (something with the same
magnitude that you're interested in) and then use that to get the relative
traffic of other sites you're interested in.
We've used this approach and it works well for getting approximate traffic
numbers for our competition.
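A minimal sketch of that approach, assuming unique visitors fall off with Alexa rank roughly as a power law; every site, rank, and visitor count below is a hypothetical placeholder rather than real data:

    import math

    # (alexa_rank, known_monthly_uniques) for sites that publish their stats
    reference_points = [
        (1_000, 2_500_000),   # hypothetical reference site A
        (10_000, 300_000),    # hypothetical reference site B
    ]

    def fit_power_law(points):
        """Least-squares fit of log(uniques) = log(a) - b*log(rank)."""
        xs = [math.log(r) for r, _ in points]
        ys = [math.log(u) for _, u in points]
        n = len(points)
        mean_x, mean_y = sum(xs) / n, sum(ys) / n
        slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
                sum((x - mean_x) ** 2 for x in xs)
        return math.exp(mean_y - slope * mean_x), -slope  # a, b

    def estimate_uniques(alexa_rank, a, b):
        return a * alexa_rank ** -b

    a, b = fit_power_law(reference_points)
    print(round(estimate_uniques(5_000, a, b)))  # rough estimate for a site ranked 5,000

The more reference sites you collect (which is what the original question is really asking for), the better the fit; with only one reference point you can do no better than scaling by the rank ratio under an assumed exponent.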
| {
"pile_set_name": "HackerNews"
} |
Pure CSS Still Life – Water and Lemons - bennettfeely
https://codepen.io/ivorjetski/pen/xMJoYO
======
system2
Neat, any source how this can be achieved?
~~~
prawn
The source is provided right there in the CSS.
| {
"pile_set_name": "HackerNews"
} |
Ask HN: Do you know any reliable and unbiased news site? - pawanpe
Fed up with biased and fake news from all the big news channels and big online sites, including Google. I want to know if there are any news sites/channels that present the news as is (and I definitely do not want their opinion).
======
TheAsprngHacker
I think that all sources have biases, whether intentional or otherwise. You
mention "big news channel" as an example of "fake news;" alternative news
outlets can also be biased, but IMO the opinions can actually be a benefit so
that you understand viewpoints not expressed in the establishment. On a
related note, you could also read different (mainstream) news outlets from
across the aisle and compare the different presentations of information.
That being said, I think that Reuters is a lot less biased than a lot of other
mainstream media. Although I couldn't find a source when I glanced at the
Reuters news site, Wikipedia claims that Reuters has a "value-neutral"
language policy:
[https://en.wikipedia.org/wiki/Reuters#Policy_of_objective_la...](https://en.wikipedia.org/wiki/Reuters#Policy_of_objective_language)
| {
"pile_set_name": "HackerNews"
} |
Free Book: Introduction to Computer Graphics - a_w
http://math.hws.edu/graphicsbook/index.html
======
sevensor
Why would you start with OpenGL 1.1 and then introduce shader programs in the
context of WebGL? Why would you teach people to use vertex3f at all?
| {
"pile_set_name": "HackerNews"
} |
Top 10 Programming Languages for 2014 - recheej
http://www.slideshare.net/lyndadotcom/top-10-programming-languages-to-know-in-2014?utm_campaign=post-Web&utm_content=2014-01-07-11-09-26&utm_medium=viral&utm_source=facebook
Lynda.com ranks top 10 languages to know for 2014.
======
jhuckabee
I'd venture to guess this presentation wasn't written by a very technical
person. ASP.NET is now considered a "programming language", and JavaScript is
a "close, interpreted relative" of Java. I don't think so.
------
dutchrapley
I think this is a list of languages to know due to popularity.
As far as languages to learn, I think Go, Python, and JavaScript should lead
the pack.
------
serichsen
Meh. This is not "ahead of the curve" but just the mainstream, maybe a little
behind.
| {
"pile_set_name": "HackerNews"
} |
The Cult Books That Lost Their Cool - pseudolus
http://www.bbc.com/culture/story/20190920-the-cult-books-that-lost-their-cool
======
nabla9
This review is written in the style of "Books that are not politically correct
anymore."
~~~
krasicki
Good summary. Nothing factual to see here, just another identity-politics
screed about books this individual either disagrees with or has never read.
I'm not even sure about the premise of the article. Using the editorial "we",
the author projects that no self-respecting reader today could possibly
imagine why books written decades ago might have been popular. Strange.
| {
"pile_set_name": "HackerNews"
} |
California, Oregon and Washington announce western states pact - khartig
https://www.gov.ca.gov/2020/04/13/california-oregon-washington-announce-western-states-pact/
======
jcranmer
Also relevant is that the states of New York, New Jersey, Pennsylvania,
Connecticut, Rhode Island, and Delaware made a similar pact today.
Edit: Massachusetts has also joined this pact (I thought it was
surprising that it hadn't when I first read the news).
~~~
sandworm101
If only there was some sort of multi-state organization through which
individual states could pull together on issues spanning multiple states. Such
a "federal" approach would be very useful. If only someone had thought of this
years ago. Of course it would have to be staffed by people selected and
respected by all, but surely there is some way of doing that too.
~~~
baggy_trough
It would be especially good if that kind of organization had limited and
agreed-to powers that were written down in some sort of document, and then
those constraints were respected and not repeatedly gamed around.
~~~
pas
Surely any document is just as smart and wise as the people interpreting it.
(Are we back to square one or do we have to go through Internet fueled
partisan hyper-polarization and "big data" aided gerrymandering? Oh, yes, we
need to vacate a seat on the Council of the Interpreters and mysteriously
leave it empty and let the winner of the aforementioned social-media campaign
pick the next interpreter!)
------
DoreenMichele
As someone who lives in one of these states, I'm nonplussed at the
announcement. It doesn't really seem to say anything concrete that I can tell.
"We've agreed that we will use metrics...we are still deciding what these
metrics will be."
I thought the Stay Home, Stay Healthy order (in Washington state) wasn't
unreasonable and wasn't just political babble, but I don't really know what to
make of this piece. Anyone want to try to explain it like I'm five?
~~~
JPKab
My brother lives in Oregon.
He was complaining to me the other day that Oregon's requirement that you are
not allowed to pump your own gas is an obvious COVID transmission vector.
To get gas in Oregon, he follows these steps:
1.) pull up to gas station
2.) Roll down window to hand attendant (who isn't wearing a mask, nor is required to) his debit card.
3.) Get card back from attendant.
4.) Sanitize the debit card
5.) Sanitize his hands
6.) Hope that the attendant is washing his hands, and that he didn't breathe COVID onto him through the open window.
Not saying this isn't a great, noble effort.
Just saying that, in some aspects, these states aren't doing a good job on
obvious things within their own borders.
Why the hell is Oregon still doing this, and not requiring masks from gas
attendants anyway? Maybe it is required, but my brother told me he has yet to
have an attendant wearing a mask, so possibly not enforced???
~~~
JoshTriplett
Oregon has already lifted the restriction on pumping your own gas, for exactly
this reason.
~~~
kelnos
Add "not allowed to pump your own gas" to the list of "things that are
completely unnecessary and stupid that have been removed because of COVID-19
and never need to come back".
~~~
xref
It’s a job creation/protection play, like grocery stores where the union won’t
allow self-checkout. No idea if the law is achieving its goals or if it just
exists because of historical inertia now, but that’s why it exists.
~~~
blaser-waffle
Where are there checkout clerk unions blocking self-check out? Not saying it's
impossible, but I've lived in WA, CA, VA, DC, TX, and NY, plus 1.5 years in
Australia, and going on 6 years in Canada and have consistently seen self
check-out everywhere.
~~~
arh68
Oregon, yet again?
[https://katu.com/news/local/initiative-looks-to-limit-number-of-self-checkout-machines-in-oregon](https://katu.com/news/local/initiative-looks-to-limit-number-of-self-checkout-machines-in-oregon)
------
aazaa
Possibly relevant:
> President Trump tweeted Monday that the "decision to open up the states"
> following shutdown measures taken to stop the spread of the coronavirus lies
> with him, not governors.
[https://www.axios.com/trump-coronavirus-reopening-governors-states-3ce510ff-cd94-4b4a-89b8-58d8bca5d69f.html](https://www.axios.com/trump-coronavirus-reopening-governors-states-3ce510ff-cd94-4b4a-89b8-58d8bca5d69f.html)
~~~
r00fus
Is it just me, or does this sound like someone who wants control but not responsibility?
~~~
jlj
It's really someone who wants to own the success of others and blame others
for failures. Zero accountability yet still wanting to call the shots.
------
dahdum
Stay at home is doing enormous economic damage, which indirectly will cause
more deaths through decimated budgets of social and health programs.
Especially in CA where high progressive taxation amplifies the impact of
recessions.
So there's a lot of pressure to reopen but no political cover to do so. I
think the entire point of this pact is to provide that cover.
~~~
triceratops
> which indirectly will cause more deaths through decimated budgets of social
> and health programs.
Do you have any actual data for this?
~~~
dahdum
Unless you're arguing the budgets won't be cut or frozen, the data is the
success metrics of those programs.
~~~
triceratops
You're saying that cutting (meaning reducing, not zeroing) budgets of social
and health programs can kill up to 0.5% of the entire population of a state?
Because that's the CFR of the virus.
You're aware that even programs on reduced budgets can offer a reduced set of
services, and prioritize the ones with most impact? (i.e. most lifesaving
potential). And that their work will be far easier if there are fewer total
sick people in the system? Have you balanced the effects of that versus
letting the virus run amok so that the tax receipts aren't impacted?
That's why I'm asking for numbers. Otherwise "a bad economy will kill more
people" is a fuzzy assertion that sounds "cool" and "contrarian" but has no
actual substance.
Here are my numbers btw:
"Our finding that all-cause mortality decreased during the Great Recession is
consistent with previous studies. Some categories of cause-specific mortality,
notably cardiovascular disease, also follow this pattern, and are more
pronounced for certain gender and age groups. Our study also suggests that the
recent recession contributed to the growth in deaths from overdoses of
prescription drugs in working-age adults in metropolitan areas. Additional
research investigating the mechanisms underlying the health consequences of
macroeconomic conditions is warranted."[1]
In the last big recession, deaths due to certain causes went up but _overall
mortality decreased_. So what you're saying definitively _did not happen_ in
the only other comparable situation.
[https://www.ncbi.nlm.nih.gov/pubmed/28772108](https://www.ncbi.nlm.nih.gov/pubmed/28772108)
~~~
dahdum
> You're saying that cutting (meaning reducing, not zeroing) budgets of social
> and health programs can kill up to 0.5% of the entire population of a state?
> Because that's the CFR of the virus.
I'm not saying that, but I could have been more clear.
The Great Recession is not a great comparison to a continued lockdown
scenario, unemployment peaked at 10% while current reports are putting us
already at ~14% and rising. The recession reduced demand but didn't wipe out
sectors, nor did it cascade nearly as fast. Heart attacks and traffic
fatalities will go down as the studies show.
I don't think I'm being contrarian, these pacts were set up to provide the
political cover necessary for those qualified to make the call. My prediction
is CA/OR/WA reopen on roughly the same schedule as everyone else, with just
slightly stricter restrictions, such as quarantining the sick, isolating the
high-risk, and other measures to bring the CFR down.
------
new_time
I love the United States' flexible decentralized model. Different states have
sufficient autonomy to tinker and try different approaches, allowing the best
ideas to emerge from the collective. It's one reason why the US is such a
dynamic place, similar to Europe but with the additional benefit of unified
language, culture (broadly), economy, and high level regulations.
It's truly a great system.
~~~
wahern
It's not necessarily so flexible as to encompass such sub-federal cooperation:
"No State shall, without the Consent of Congress, lay any Duty of Tonnage,
keep Troops, or Ships of War in time of Peace, enter into any Agreement or
Compact with another State, or with a foreign Power, or engage in War, unless
actually invaded, or in such imminent Danger as will not admit of delay."
[https://en.wikipedia.org/wiki/Article_One_of_the_United_Stat...](https://en.wikipedia.org/wiki/Article_One_of_the_United_States_Constitution#Clause_3:_Compact_Clause)
I'm curious if these recently announced state pacts will be challenged in
court. But legal or illegal, it doesn't excuse the poor job of federal
management.
~~~
JoshTriplett
That would prevent any such pact from being _legally binding_ on the states.
It doesn't necessarily prevent states from coordinating at all.
~~~
new_time
Exactly, this seems like a friendly handshake agreement to cooperate with an
official name and press release.
OTOH, pacts like the popular vote compact signed by some states in the past
few years are almost certainly unconstitutional.
~~~
JoshTriplett
Is it? They're each deciding, independently, to handle their electors in a
certain fashion. They each have the right to independently change that
decision, without any penalty for doing so. It's not a binding agreement with
another state in any fashion.
------
Nuzzerino
From article 1, section 10 of the U.S. Constitution:
"No State shall, without the Consent of Congress, lay any Duty of Tonnage,
keep Troops, or Ships of War in time of Peace, enter into any Agreement or
Compact with another State, or with a foreign Power, or engage in War, unless
actually invaded, or in such imminent Danger as will not admit of delay."
This scares me a bit. This agreement seems innocuous at first, but the
constitution expressly forbids these (and all) kinds of agreements between
states for a reason. A small pact like this today could tomorrow become
leverage for secessionism or otherwise a power struggle with the federal
government.
~~~
kevingadd
"State" in that context clearly refers to foreign nations (states) not states
as in The United States Of America. Foreign power is distinct because it would
refer to things like governments-in-exile or perhaps resistance groups.
~~~
rlt
Why would they use two different meanings of "State" (capitalized, no less) in
the same sentence?
------
sandworm101
In my area (Vancouver Island, just north of this "pact") we are seeing
something very odd. People are going out. Families are having BBQs in local
parks. Restaurants aren't letting people dine in, but the grocery stores are
operating essentially normally. I'm near the water and am seeing kayaks and
boats out like any normal weekend. Vacation properties are in use. RV parks
have customers. I see groups of old people chatting on the street like normal.
I think we are on a tipping point. Government directive or no, people are
coming out of isolation. I don't think the local authorities have much choice
at this point. Either they make some clear changes and/or issue timetables, or
they risk losing any control. These are Canadians, a people with national
healthcare and a general respect for government authority. When this attitude
hits America, a country that prides itself on individualism, things may move
from lockdown to "what lockdown?" in a matter of days.
~~~
alkonaut
Because people realize that kayaking isn’t a risk so why would they avoid it?
This is why no restrictions should be imposed that don’t have a very clear and
obvious effect. If that means they must rely on people’s ability to do the
right thing (e.g not take the bus to go kayaking, not go places where it’s
crowded) then so be it. It’s better to have a sustainable and accepted
lockdown that is 95% water tight, than an unsustainable and less accepted
lockdown. Otherwise, as you say, people just stop accepting the lockdowns anyway.
~~~
sandworm101
We are told not to travel unless necessary. Kayaking is never necessary. The
Canadian military, the people responsible for search and rescue over HUGE
areas, are themselves locked into extraordinary isolation measures. Many are
living far from loved ones specifically to maintain operational readiness for
real national emergencies. They aren't going to be very happy if they have to
break isolation to rescue a kayak party too bored to stay home when asked to do so by their government.
~~~
alkonaut
So you kayak in a lake within swimming distance of shore. This is the thing:
people need to be trusted to do the right thing. What we want to avoid wasn’t
people kayaking but people requiring rescue. That should just _work_ by asking
it of people.
~~~
sandworm101
Talk to any first responder. Playing around near shore results in plenty of
calls. It isn't just the big helicopter that pulls people off icebergs. A
needless ambulance callout, followed by a kid sitting in hospital with a badly
broken bone, is exactly the sort of thing we need to avoid. Doing anything
recreational on the water is to be avoided.
There is a horse farm near my house. They are curtailing riding as much as
possible specifically because they know that riding horses can be dangerous,
especially for kids. They don't want any trips to hospital right now.
------
zw123456
[http://nationofpacifica.com/](http://nationofpacifica.com/)
------
bjt2n3904
> Health outcomes and science – not politics – will guide these decisions.
And I definitely promise you that my news organization will have no bias or
slant, and will only report facts.
------
themodelplumber
Yes. I love this from an executive-action standpoint. They are _doing_
something about something, in the face of a lot of fearful messaging. And that
gives the rest of us an even better "emotional/energy vector" which is based
on a promise of government support and action in support of business.
We are participants in the most creative and powerful world economy of all
time. Regardless of geographical location, cultures are generally more unified
and geared toward working together than ever. Markets have shown an impressive
amount of resilience in the face of unprecedented fear messaging. People are
ready to work hard; in fact they're wearing themselves out by working harder
than they have before in many cases.
This is exactly the kind of action we need right now.
------
jeffdavis
I'm a little disturbed that I feel like the country has been tearing apart for
a while. There is a major political bifurcation, and now pacts between states.
Right now the pacts are for the pandemic, but what else might these coalitions
be used for in the future? I'm not saying that they'll be used for some
nefarious purpose, but the fact that they are like-minded politically makes it
more likely to deepen the divides.
~~~
atomi
Because, Conservatives have completely lost their minds. If they want to
commit to a suicide run to try and sustain the market then competent states
have the imperative to not follow them into the abyss. You have to remember,
each state is tasked with the welfare of their citizenry, not the furtherance
of the political ambitions of a federal administrator.
~~~
jeffdavis
I'm not sure what your point is. Are you saying "good riddance, let's just
split the country in a few pieces and get it over with"? Or do you see some
path toward unifying the two factions again?
I guess you could reasonably say that groups with different political
allegiances should just split up. But what's weird about that to me is it's
basically like the states' rights platform, sans Constitution.
------
maerF0x0
I have yet to see a cogent validation of the total lives saved by lockdown
when considering all other health impacts such as suicide rates from
isolation, heart disease/stroke/diabetes from extra sedentary lifestyle etc.
not to mention the suicide by fentanyl in a subsequent economic
recession/depression.
Anyone aware of such an analysis?
~~~
kelnos
I'm generally worried about this sort of thing as well, and wish people were
looking into this.
However, we have pretty good estimates on how many people _will_ die if we
don't shelter and isolate, and I think that trumps what would inevitably be
some very vague guesses as to how many people _might_ die as side effects of
sheltering and isolation.
And regardless, many of those potential deaths could be preventable with the
right focus on mental health and financial support during and after the
lockdown. (Not saying we do have the right focus; I expect we definitely
don't. But that's not a reason to give up on isolation.)
~~~
maerF0x0
> we have pretty good estimates on how many people will die if we don't
> shelter and isolate
This is a fantastic point/rebuttal and has helped shift my perspective
somewhat. We can do things about tomorrow's suicides, heart attacks and
diabetes later, though not too too much later.
~~~
kelnos
Right, I think it's a "one thing at a time" sort of situation. Like: we have a
clear problem with a potential solution that will mitigate the problem
immediately. That solution may create _other_ problems, but we do have some
time (though, as you suggest, not a _lot_ ) to try to fix that up after
dealing with the first problem.
------
nostromo
> We need to see a decline in the rate of spread of the virus before large-
> scale reopening, and we will be working in coordination to identify the best
> metrics to guide this.
What's the end game here?
If you slow the spread, and then reopen, it'll spread again. Nothing will have
changed, you'll just have delayed it a bit.
~~~
kevingadd
If you read the announcement:
> Ensuring an ability to care for those who may become sick with COVID-19 and
> other conditions. This will require adequate hospital surge capacity and
> supplies of personal protective equipment.
Delaying it is the point, it ensures we have enough hospital capacity and PPE
to handle the inevitable burst of cases when reopening. This is what people
mean when they say "flatten the curve". If we simply open whenever and aren't
prepared we'll have a massive spike in cases that exceeds hospital capacity,
vs the current situation where the case growth is slower and hospitals are
less burdened.
~~~
nostromo
At the current rate of infection that plan will require many years of
quarantine. We need more people exposed, not fewer.
~~~
kelnos
That assumes no investment in upgrading our hospital capacity and testing &
PPE production.
At one end of the spectrum you have (mythically) infinite hospital capacity
and staffing, testing capacity, and contact tracing ability. In that world,
you reopen everything, and let people go about their lives.
At _current_ capacity, we need to slow things down. If we can increase
capacity enough in the coming weeks and months, we'll need to slow things down
much less.
~~~
acid__
Things were quite bad but they've improved rapidly over the past weeks and
months.
Hospitals have actually, counter-intuitively, seen a reduction in usage. The
number of hospitalizations has been far lower than expected (~20% of IHME's
modeled values) and when combined with the fact that non-COVID patients have
been avoiding hospitals (even for serious emergencies like heart attacks and
strokes!), medical professionals on Twitter, Facebook, and r/medicine are reporting entire wards sitting completely unused.
Also, 150,000 test results are coming in every day. That's 5x the per-capita
testing rate of South Korea (~5,000 tests per day) and while we certainly can
and should increase testing dramatically, the US is actually doing pretty well
on testing relative to global numbers.
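As a quick sanity check of the per-capita claim above, using rough 2020 population figures (my own numbers, not from the comment):

    us_tests_per_day, us_population = 150_000, 330_000_000
    kr_tests_per_day, kr_population = 5_000, 51_000_000

    us_rate = us_tests_per_day / us_population   # ~4.5e-4 tests per person per day
    kr_rate = kr_tests_per_day / kr_population   # ~9.8e-5 tests per person per day
    print(f"US per-capita testing rate is about {us_rate / kr_rate:.1f}x South Korea's")
    # prints roughly 4.6x, consistent with the "5x" figure quoted above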
~~~
robocat
South Korea has excess testing capacity:
[https://thediplomat.com/2020/04/south-korea-ramps-up-exports-of-covid-19-testing-kits/](https://thediplomat.com/2020/04/south-korea-ramps-up-exports-of-covid-19-testing-kits/)
So far they have done 500000 tests, but they got started early, and controlled
their outbreak much better. In recent news the US just bought 600000 tests
from South Korea.
“The South Korean companies behind the kits are now churning out enough to
test at least 135,000 people per day. The stabilizing situation at home has
enabled these firms to export more of this extra capacity.”
------
supernova87a
Regardless of how sad a situation this reflects, I regard it at least as a
positive sign of some resilience of our federal structure.
Criticize the state/federal model though you might for how it sometimes
dilutes the ability to get things done in a unified way, at least in times of
national or federal-level dysfunction, the people who can step in locally to
provide some guidance and coordination are able to do so, and not actually be
stymied or prevented from doing so by some authoritarian framework.
Imagine instead if we lived in a country with more organized / powerful
central control that was both authoritarian _and_ incompetent. We would be
really fucked.
------
jMyles
I'm in Oregon.
My first read: CTRL-F for "test" to see if there's any new statement on
serological testing. Nothing.
This is all we want right now. This is the first priority, and whatever is
second is somewhat distant.
Not having widespread serological testing to understand the prevalence in our
communities really gives a feeling of being in a third-world country.
It's awesome that Drs. Bhattacharya and Bendavid have pushed the Stanford
study forward (hopefully we have results this week). But seriously, there's no
excuse for not having already done a round of random testing in every city,
several times.
~~~
alkonaut
Serological testing is really bleeding edge still. It’s only done as part of
research projects as far as I heard, and you can’t really buy the capacity to
do it on an industrial scale yet. Perhaps this varies between countries (where
manufacturers exist and so on) but many countries in Europe still have few or
no serological tests done, and we are weeks ahead of the US in terms of the
epidemic progress.
~~~
jMyles
Yeah, that's the point right there. Why the heck is this so bleeding edge? A
small team at Stanford made it happen rather quickly. With a small budget.
And yet our leaders aren't even _talking_ about it. Where's the will to do
this obvious first priority?
------
ambivalents
Naive question: how is this better than each of these states doing its own
thing (even if those things end up looking similar)? What does California gain
by teaming up with WA and OR, for example?
~~~
alkonaut
Look at the Netherlands and Belgium screaming at each other and threatening to
close borders because one felt the other had too few restrictions.
Let’s just say it’s very much bad for your neighborhood of states to have that
kind of situation. Closing borders or having diplomatic disasters isn’t worth
it. It’s better to reach a common decision.
~~~
rlt
I'd rather everyone close their borders for now (except for cargo and
essential travel, with testing/quarantine) and countries/states/regions
respond however they see fit for their own populations. The regions with the
best responses can be examples for the others to follow.
------
Doctor_Fegg
Floreat Cascadia!
~~~
klodolph
Not to be pedantic or anything but Cascadia definitely doesn’t include
California. Nuh-uh, no way José, no siree.
~~~
domingobingo
It’s the economic core of Cascadia. It’s a necessary thing.
~~~
klodolph
Kind of like saying that the US is the “economic core” of Canada.
~~~
domingobingo
How do you figure?
------
whatsmyusername
They should do a shared healthcare system as well. The future is states
working together on their own thing, while letting chudland sink further into
poverty.
------
jedberg
I’m surprised they didn’t include Nevada given how much of California’s
population travels to Nevada.
------
nerfhammer
Meanwhile, 7 northeastern states are forming a "multi-state council"
------
seemslegit
Some historians go as early as 2020 and see the "Western States Pact" as the
first establishing document for the Pacific Union and the beginning of the
unraveling of the USA.
Not really, though.
------
mullingitover
This is great, and it's a big chunk of population, but I keep feeling like we
could do better. For example, these three States could do a great job, but
there's nothing stopping a bunch of the other States from bungling it (like,
say, ignoring health experts and exempting certain groups from mass gathering
restrictions). Then those States end up exporting infectious individuals into
the States which are doing things right, and creating more clusters.
If only we had some sort of united group of all the States, where the response
could be coordinated and centralized for maximum effectiveness.
~~~
bjt2n3904
> If only we had some sort of united group of all the States, where the
> response could be coordinated and centralized for maximum effectiveness.
We do! It's called the United States government. I'll be letting the president
know you're interested in him taking over and using his own ideas, instead of
this one you seem to be particularly fond of.
Or, if that idea is slightly terrifying for you, perhaps it's a better idea
that the states are handling this themselves. Your state conglomeration thing
can always decide to use the National Guard to prevent outside travelers.
~~~
anoncareer0212
You have no idea how crazy you sound as a European, lol. The most
straightforward way to put it is that you're using an anonymous account on a
website to dunk on someone in the comments section for implying they may care
about avoiding infection during a pandemic.
~~~
nostromo
> You have no idea how crazy you sound as a European
Europe, a group of many independent states, trying different approaches to a
common problem, with an over-arching bureaucracy to coordinate between them...
and the US approach sounds crazy to you?
~~~
samsonradu
You're comparing apples to oranges. Europe is comprised of sovereign states
which can and actually have closed their borders. From my understanding, but
correct me if I'm wrong, this would be a legal and political nightmare in the
US right?
Edit: not endorsing parent comment
------
brewdad
So we're going to need a new name for this future sovereignty. Sierra
Cascadia? New Pacifica?
~~~
ummwhat
California and Cascadia Republic. Put a tree on the bear flag and call it a
day.
~~~
runarberg
Something like this?
[https://imgur.com/a/tEkyAXt](https://imgur.com/a/tEkyAXt)
------
yingw787
Well, that was quick.
[https://news.ycombinator.com/item?id=22828354](https://news.ycombinator.com/item?id=22828354)
Today it is pandemic resolution. I fear that tomorrow, it might be economic
ties, lockstep voting at the federal level, and National Guard drills,
deployments, and contingency planning, where the West Coast slowly splits away
from the rest of the country, with other states teaming up and following suit.
This isn't the end, and it's the only appropriate response to a federal
government that cannot or will not execute on behalf of all states. It is,
however, a reflection of the true state of our Union, and a sign of things to
come if we don't reconcile our national identity.
I'm saddened, but not surprised, that a step like this was necessary given the
circumstances.
~~~
spankalee
> I fear that tomorrow
Why fear? This is my dream!
The Senate is irreparably broken. There is no conceivable way that it can
become more fairly representative of the people without a breakup of the
United States. The smaller less populous states aren't going to just give up
their outsized power voluntarily.
California is suffering the most due to the Senate's structural unfairness,
and it has the most to gain from secession. I hope it happens in my lifetime.
~~~
yingw787
Dude...if we're talking about secession, we're talking about civil war. In a
country with the second most nuclear weapons in the world. With a federal
government already looking to adversaries for cooperation.
You're talking nuclear missiles or chemical weapons fired against San
Francisco, Los Angeles, or Seattle from land-based silos in the Dakotas, or
mutinies aboard our SSBNs, or Russian troops landing en masse in South
Carolina to offer foreign assistance. It will mean the Balkanization of the
country, where California has to station permanent garrisons in the Sierra
Nevada mountains to ward off invasions from Texas or Mexico. You're not
talking about a 40% drop in GDP this quarter, you're talking about famine and
starvation, and complete international irrelevance after the dust settles. If
only tens of millions of Americans die, that probably counts as a win.
"We are not enemies, but friends. We must not be enemies. Though passion may
have strained, it must not break our bonds of affection. The mystic chords of
memory will swell when again touched, as surely they will be, by the better
angels of our nature."
~~~
nostrademons
> You're talking nuclear missiles or chemical weapons fired against San
> Francisco, Los Angeles, or Seattle from land-based silos in the Dakotas.
This is pretty unlikely, if only because the fallout from nuclear weapons
launched at the West Coast will blow back over the rest of the country,
including over the regions that launch the missiles themselves. There's a
reason short-range and battlefield nukes were never used in anger and it has
nothing to do with morals.
The rest of your point stands. From a global perspective, a break-up of the
U.S. is a very bad thing. Aside from the potential civil war here, it also
means open-season on any countries that are basically U.S. protectorates
(Taiwan, Japan, Israel, and much of Europe are toast); it hands global
leadership over to China; it means financial chaos; and it could result in
famines and food instability throughout the world (the U.S. is a major food
exporter).
~~~
yingw787
I'm not quite sure. We only have nukes in hundreds of kilotons, which might
not be enough to crack steel and concrete and burn all combustibles at once.
If there's no firestorm, the nuclear fallout might not make it to the
stratosphere and the jet stream because it gets dammed in by the mountains and
precipitates out.
Not to mention, hatred is irrational anyways. At this stage, we're talking
about an organization that will reduce its own negotiating lever on the world
stage in order to subjugate innocent civilians, which we've already seen take
place at various points in Syria and the Soviet Union. I would expect
political commissars taking over the nuclear command authority to turn the
keys, not USAF or USN personnel.
------
wahern
I wonder if Barr will sue, arguing that the pact violates the Interstate
Compact Clause, especially considering that Trump has recently tweeted that
only _he_ gets to decide when to lift restrictions.
~~~
jameslevy
I hope he does, and in the process both increases his ownership of this fiasco
and dispels any notions that his party still has any real support for "States'
rights".
------
iloveyouocean
Waiting for the announcement of the formation of the 'Confederacy of Southern
States' (I don't believe for a moment that Trump and the Republicans would
hesitate to seize the opportunity and claim that it's simply 'parity'). In
these times of hyper-partisanship, in the midst of a global pandemic, on the
tip of a great recession or depression with untold economic hardship; I
shudder at the potential for what politically aligned states banding together
could lead to.
I hope I am overreacting, but my parameters for what I used to consider normal
and possible have been shifted significantly and there seems to have formed a
great open space of terrible possibility to be filled.
~~~
sketchyj
What about this inspired you to compare it to the civil war? Seems hyperbolic
given states cooperate in similar ways all the time. It makes sense for
bordering states to be on the same page here, _especially_ given the erratic
behavior exhibited by federal officials who will remain nameless.
------
redis_mlc
> Protecting the general public by ensuring any successful lifting of
> interventions includes the development of a system for testing, tracking and
> isolating. The states will work together to share best practices.
If there is a reasonable expectation of that happening in the next couple of
weeks, then ok. If this is just "hope", then we should end the lockdown and
let the flu run its course, like every year.
The overall mortality rate is about 2%, which may be unavoidable in the US
even with social distancing, since I don't see any way for the US to emulate
S. Korea's testing and tracing success.
Does anybody have any logical reason for thinking that the US can copy S.
Korea?
Does the US have any way to do up to 600 million tests per week and see the
result in an hour?
~~~
mulmen
Why would we need to test everyone twice a week?
| {
"pile_set_name": "HackerNews"
} |
Square Pricing Update FAQ - asperous
https://squareup.com/help/us/en/article/6700-square-pricing-update-faq
======
asperous
Their rates changed from 2.75% to (2.6% + 10c). This results in a higher fee
per transaction for any transaction under $66.
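A quick check of that break-even figure (the old fee is a flat percentage; the new fee adds a fixed $0.10):

    # Old fee: 2.75% of the amount. New fee: 2.6% of the amount plus $0.10.
    # They match where 0.0275 * x == 0.026 * x + 0.10.
    break_even = 0.10 / (0.0275 - 0.026)
    print(f"Fees are equal at ${break_even:.2f}")
    # ~ $66.67 -- below that, the new pricing costs more per transaction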
| {
"pile_set_name": "HackerNews"
} |
Ask HN: How to switch off from work? - sdeepak
Like mentally. Not just physically. Taking breaks, strategic time-outs are all ok but many times that isn't much of help.
======
EADGBE
You'll need hobbies, or just "something to do".
But most importantly, _you need a job where you can actually do this_.
I've worked for a few startups (wow, ironic) where it was impossible to switch
off because everything was being thrown at me as often as possible.
Yeah, they had unlimited vacation and perks like "work from home", "summer
Fridays", "beer:30" but these were mostly ruses which helped them be even more
effective; not me.
The easiest way to start to separate your work from your life is to move into
a boring job. Sorry.
Moving fast and breaking things is cool. Changing the world is super-cool. But
it's mostly passion-based work. In the heat of passion, a lot of things get
overlooked.
------
cimmanom
An active commute. Walking, running, cycling. At least 20 minutes. Let your
mind wander. Do not check or answer work communications during your commute.
If you can get away with it, don’t check/answer them once your commute is over
either.
If you work from home, first off, set aside a space that is dedicated to work.
Secondly, create rituals for leaving work that include activities you never do
while working. Maybe that’s reading fiction or meditating or yoga or lighting
candles or singing karaoke or whatever. Your ritual should take at least 20
minutes, and not be allowed to be interrupted by work.
------
x0hm
1) Weed
2) Get in the habit of writing down everything that's in your head. I keep a
simple bullet journal so that everything in my head has a single place to go.
Note: It's better to actually physically WRITE it than to type it.
3) Hobbies. Find something you're passionate about that you look forward to
doing when you're off work. That'll keep your brain off of the other stuff.
4) Exercise. A lot of the time, mental anxiety can be cured with physical effort.
Those are ways I've successfully done it. Any of them work for me, but they're
ordered here by the most effective (for me).
Just find a way to get it out of your head, and it should stay out until you
get bored.
------
wingerlang
For a specific situation where you cannot switch off because you have all your
work on your computer, which you also use as a personal one -- I've been using
separate user accounts for personal and work for almost a year now.
The hassle of maintaining two setups, having to log into the personal/work one
to do something even small has helped me almost completely tune out work when
I don't need it.
------
brokenmachine
Sport is where I go to forget everything except what I'm doing _right now_.
Tennis is my sport, and takes a lot of concentration to play well. Loss of
focus is death in tennis.
Also, I think all the refocusing of my eyes from near to far, over and over
again, evens out the hours spent staring at a screen two feet away.
------
cerberusss
Gaming. Play Fallout for an hour straight and I guarantee you're no longer
thinking about work.
You probably don't want to sit behind a desk with a keyboard and mouse,
though. Get a console.
~~~
ssijak
I gamed a lot when I was younger. Now I just can't get into a game that I know
I would have liked as a kid, because it really feels like wasting time, even
when I just want to chill out. I don't know how to remove that feeling. The
only games that don't feel like that are things like PES played against
friends in real life.
------
Jabberwockie
weed
~~~
Jabberwockie
- friends
- go out
- do sports
- go to social courses
| {
"pile_set_name": "HackerNews"
} |
LifeLock CEO’s Identity Stolen 13 Times - edw519
http://www.wired.com/threatlevel/2010/05/lifelock-identity-theft?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+wired%2Findex+%28Wired%3A+Index+3+%28Top+Stories+2%29%29&utm_content=My+Yahoo
======
bullseye
I wrote an application for a banking client a few years ago that required a
valid SSN. In addition to using the information available on ssa.gov to check
the validity of a number, I built a simple filter to exclude valid, but
otherwise fraudulent numbers. While none of the steps I took are a bulletproof
measure against identity theft, they do lighten the load a bit.
I cross-referenced the SSN death index to ensure dead people had not risen
from the grave to apply for credit. I also excluded the popular "fake" SSNs
used in advertising
([http://en.wikipedia.org/wiki/Social_Security_number#SSNs_inv...](http://en.wikipedia.org/wiki/Social_Security_number#SSNs_invalidated_by_use_in_advertising)),
and _I most definitely added Todd Davis's number to the list_. This last step
seemed like a no-brainer given all of the publicity at the time.
While I can understand a small boutique store not going to those lengths to
prevent a fraudulent account, I am a little surprised that AT&T and Verizon
were among the casualties.
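A minimal sketch of that kind of filter. The structural rules and blocklist entries come from the public sources the comment points at (pre-2011 SSA issuance rules, the Wikipedia advertising-SSN list, and Todd Davis's widely publicized number); the function names and the death-index lookup are illustrative stand-ins, not the commenter's actual code:

    # Known "fake" SSNs: advertising numbers plus the LifeLock CEO's published one
    BLOCKLIST = {
        "078-05-1120",  # the Woolworth wallet-insert number from the Wikipedia list
        "457-55-5462",  # Todd Davis's publicly advertised SSN
    }

    def looks_structurally_valid(ssn):
        """Reject numbers that can never be issued: 000/666/9xx areas, 00 group, 0000 serial."""
        parts = ssn.split("-")
        if len(parts) != 3 or not all(p.isdigit() for p in parts):
            return False
        area, group, serial = parts
        if area in ("000", "666") or area.startswith("9"):
            return False
        return group != "00" and serial != "0000"

    def is_acceptable(ssn, death_index):
        """Filter out structurally invalid, blocklisted, or deceased-person SSNs."""
        return (
            looks_structurally_valid(ssn)
            and ssn not in BLOCKLIST
            and ssn not in death_index
        )

    # A blocklisted number is rejected even though it is structurally valid
    print(is_acceptable("457-55-5462", death_index=set()))  # False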
~~~
mey
According to the SSNVS service information, you are supposed to only use this
information for correctly completing IRS W-2.
Verifying the SSN using that service for banking (as I'm reading it) is a
clear violation of the system.
Reference (<http://www.ssa.gov/employer/ssnvshandbk/ssnvs_bso.htm>)
If you were using a different service I'd be interested, as we are always
looking for new ways to do validation of accounts.
~~~
bullseye
We weren't able to use the online verification service, which is the reason I
wrote the program I described. It didn't validate a "good" SSN, but rather
filtered out bad ones.
<http://www.ssa.gov/> and <http://www.socialsecurity.gov> both have a lot of
useful information about structure and allocation of SSNs if you are looking
to do something similar.
------
chime
The entire identity-theft problem can be solved by a very simple mechanism. If
I apply for a loan for a car, the dealer takes my info and tries to run my
credit online. Immediately I get an automated phone call that says "Dealer ABC
is trying to sign you up for service: AUTOLOAN. To allow this service, enter
your 4 digit pin." If I do not have my cellphone, I can directly call a 1-800
number, enter my SSN + PIN and confirm the sign up. I do NOT have to provide
any vendor with my PIN.
Who manages/offers this service? Experian/TransUnion etc. could do this for a
very small fee. Sure, there would be the issue of lost PINs, unavailability of
Internet access, not having your cell on you etc. but I think it could work
very well. Right now, it is possible for someone to find out my SSN# from a
piece of paper from a trashcan and immediately buy a phone in my name. At
least I can change my pin if someone finds out.
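A minimal sketch of that confirmation flow, assuming the bureau holds the consumer's PIN and contact info and only releases a report after an out-of-band approval; the class, method names, and notification step are illustrative, not any bureau's real API:

    import secrets

    class CreditBureau:
        def __init__(self):
            self.pins = {}      # ssn -> PIN chosen by the consumer
            self.pending = {}   # request_id -> (ssn, requester, purpose)

        def register_pin(self, ssn, pin):
            self.pins[ssn] = pin

        def request_pull(self, ssn, requester, purpose):
            """A lender initiates a pull; the consumer is notified and must confirm."""
            request_id = secrets.token_hex(4)
            self.pending[request_id] = (ssn, requester, purpose)
            print(f"[automated call/SMS] {requester} is trying to sign you up for "
                  f"{purpose}. Enter your PIN to approve request {request_id}.")
            return request_id

        def confirm(self, request_id, pin):
            """The consumer confirms with their PIN; only then is the report released."""
            ssn, requester, purpose = self.pending.pop(request_id)
            if self.pins.get(ssn) != pin:
                return "DENIED"
            return f"Credit report released to {requester} for {purpose}"

    bureau = CreditBureau()
    bureau.register_pin("123-45-6789", "4242")
    rid = bureau.request_pull("123-45-6789", "Dealer ABC", "AUTOLOAN")
    print(bureau.confirm(rid, "4242"))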
~~~
recampbell
Heh.
There was such a product provided by Debix (<http://debix.com>). It relied
upon a law called the Fair Credit Reporting Act which allowed consumers to
place a fraud alert on their credit file, which the creditor was supposed to
call. Debix placed the fraud alert on behalf of consumers, but directed the
creditor to call Debix which delivered the credit request using exactly such
an authentication mechanism that you describe. This was 2003.
Lifelock used the same mechanism (though without the phone authentication,
IIRC). Experian sued Lifelock saying that the FCRA did not allow for
_companies_ to set fraud alerts on behalf of consumers, only consumers were
allowed to set them. In May of last year, a judge agreed with Experian, and
Lifelock later settled and stopped using fraud alerts.
<http://www.finextra.com/news/fullstory.aspx?newsitemid=20078>
Unfortunately, this ruling also meant that Debix could no longer set fraud
alerts, so they had to cancel this product.
The truth is such a product creates friction in the instant credit market,
which is a huge source of income for credit bureaus. So they have very little
incentive to slow that process down and would rather just catch any exceptions
using monitoring.
The credit bureaus are an industry crying out for disruption. These guys are
dinosaurs and are living it large because there is no real alternative.
Unfortunately, they also seem to have plenty of political capital to prevent
any real legislative reform in this area.
Disclosure: I used to work for Debix and have ownership in the company.
~~~
Oxryly
Would you recommend Debix's OnCall service? How does it work?
------
sriramk
the real issue here is everyone assuming social security numbers are meant to
be 'secret'. It is a terrible way to authenticate someone. There have been
recent studies to show how non-random someone's number actually is.
Someone recently suggested the 'nuclear' option of making everyone's social
security number public and forcing all institutions to figure out a better
model. This may be too extreme, but something like that may be necessary.
~~~
orblivion
Hell, they might as well. Our college used them as ID numbers for the first
year I was there. We wrote it on all our tests. Forget about it, that stuff
isn't secure.
~~~
RK
My brother had his identity stolen while he was in school. The perpetrator
turned out to be a person who worked in the registrar's office!
When I started grad school they used SSN as the ID number too. I went to the
registrar and asked them to change mine. They said they couldn't do it
because, as a TA, I was considered an employee and they "had to" use my SSN.
You can imagine I wasn't happy.
------
keltex
Unfortunately these "credit monitoring services" are basically useless. The
only real solution is to "freeze your credit" which makes credit inaccessible
to anyone unless you provide an unlock code (which temporarily "thaws" your
credit). The cost ranges from $3-$10 per person per bureau to freeze a credit
report which is considerably cheaper than the $10/month lifelock service. More
information on how to do this here:
<http://clarkhoward.com/topics/credit_freeze_states.html>
~~~
natrius
Isn't that kind of extortion? "We've collected all of this data about you, and
we'll give it to anyone unless you pay us some protection money."
~~~
kareemm
absolutely. i've had "build a better credit bureau" on my ideas list for a
couple years now. it's a multi-billion dollar industry that could be waaaay
more consumer-friendly.
~~~
run4yourlives
It is consumer friendly.
Credit bureaus are designed to protect lenders, not lendees.
~~~
kareemm
lenders are customers, not consumers. but lendees are customers in the same
way facebook's users are customers.
experian and transunion build a business off the backs of consumer data.
consumers are reliant on their records, but generally have to pay to get
access to them, even to correct a report.
like i said - it's a big, staid, slow-changing opaque industry. looking at it
through the right lens makes for a big opportunity.
~~~
run4yourlives
>lenders are customers, not consumers.
I think you need to re-read that, then explain the difference.
A "consumer" is not a class of person, it is the act of being a customer.
------
prodigal_erik
> It’s not fair to [AT&T] because they’re losing a pretty substantial amount
> of money.
AT&T isn't even bothering to check photo ID. Being defrauded is a risk they
have _eagerly_ assumed. Presumably they make more money this way, despite
fraud.
~~~
matwood
Exactly. I put a majority of the blame on these companies that for some reason
can't be bothered to check an ID.
On the back of my debit and credit card I sign it with "Check ID" since the
cashier is supposed to at minimum verify the signature. I've had cards stolen
multiple times and have had them used before I could cancel them. So much for
verifying the signature.
~~~
prodigal_erik
The way I've heard it, your signature on the card is actually your acceptance
of the contract with the issuing bank, so they are Not Okay with "check id"
there instead. The merchant is just expected to verify that the card has been
signed, not try to match signatures (doing that correctly requires very rare
expertise).
~~~
matwood
Obviously I don't expect them to do a point by point comparison of the
signature, but they are supposed to at least make sure it's there. Writing
'Check ID' should get them to check my ID every time, but so few even look at
the card.
This is a funny prank where a guy went out trying to get people to actually
look at the back of the card :)
<http://www.zug.com/pranks/credit_card/index2.html>
------
ajg1977
Shocking! Next we'll probably find out that the guy from Video Professor
doesn't actually have a doctorate..
~~~
orblivion
Well you gotta admit, you can't accuse this guy of lying, or not putting his
money where his mouth is.
~~~
mahmud
Your American "identity" is a very cheap business expense if you know you have
an "offshore" ID, accounts, and private Island. Cash out, burn house and move
out.
------
ciupicri
There's something that I don't understand about these identity theft cases. If
I didn't really sign any document, why should I be held responsible just
because someone else used something public (non-private) about me?
P.S. To be more clear: the company giving the loan should prove that I signed
the documents, rather than me having to prove that I didn't. The presumption of innocence, if
you will.
~~~
smallblacksun
Legally, you aren't. The issue is the time and expense in proving that you
didn't sign anything.
~~~
matwood
This is why I think companies that incorrectly send you to collections or give
credit to someone using your identity should not only be on the hook for the
money they lost but also liable to the person they are forcing to clear their
name.
Years ago I rented at a crappy apartment complex. When I left, their check-out
process basically meant you always owed them ~$200. I paid and moved out of state. 6
months later I get a collections call saying I didn't pay the bill. I told
them I paid it; she said it wasn't paid and that it was going on my report unless I
paid that day. Luckily I paid by check and my bank (like all banks I guess
now) keeps canceled checks online for pretty much ever. So now I had to go
back 6 months and find this check then call the apartment then the collections
agency, etc... A HUGE hassle and time waster for me all because the apartment
complex employed incompetent people.
The kicker was that the girl trying to collect from me said "people make
mistakes and you can't blame them." Um, when I make a mistake and forget to
pay a bill you guys jump all over me. You make a mistake and it's still my
problem to solve.
------
jsdalton
I knew I'd seen this ad somewhere before:
[http://37signals.com/svn/posts/353-fly-on-the-wall-lifelock-motionbox-print-stylesheets-shoe-repair-posts-dordoni-table-and-daring-fireball-ad](http://37signals.com/svn/posts/353-fly-on-the-wall-lifelock-motionbox-print-stylesheets-shoe-repair-posts-dordoni-table-and-daring-fireball-ad)
~~~
ajg1977
I think you're probably just being snarky, but it is a pretty great marketing
concept.
After all, it's not always the case that product->quality ==
marketing->quality.
~~~
jsdalton
Actually, I remember reading it and having the exact same reaction the 37
signals guys did in their thread -- so I certainly wasn't immune to the
marketing ploy myself.
------
pwhelan
If it has only been 13 times, he's lucky. Don't go challenging criminals to
screw you over and giving them a crucial piece of information.
Got what he deserved, especially considering he was fined for deceptive
advertising because of crappy security.
~~~
jotto
i guess this could serve as a kind of honeypot so the company can observe the
attacks on this guy and then improve the service. but according to the FTC in
this article their service doesn't work so apparently it was nothing more than
a marketing stunt.
------
smiler
Good job he's got a $1 million compensation fund to cover him.
~~~
recampbell
Read the fine print. This covers LifeLock's costs in trying to restore your
credit, not any losses you sustain due to having your identity stolen.
<http://www.lifelock.com/our-guarantee>
Money quote: "Under the Terms and Conditions, NO money passes directly to our
LifeLock members."
[http://www.lifelock.com/about-us/about-lifelock/terms-and-conditions](http://www.lifelock.com/about-us/about-lifelock/terms-and-conditions)
"LifeLock will retain and pay for those third party professional services that
are reasonably necessary in LifeLock's judgment to assist you in restoring
losses or recovering your lost out-of-pocket expenses caused by such fraud. "
Disclosure: I worked for and have ownership in a competitor to Lifelock.
~~~
alexyim
"Policy change!"
------
DanielBMarkham
Davis -- the human identity-theft honeypot.
We need more of him.
~~~
Tichy
Maybe he could have used a fake SSN that says "fraud going on" loud and clear.
Are honeypot SSNs possible?
------
kwyjibo
It's really funny that an SSN is enough in the USA to get somebody else in so
much trouble.
Doesn't work in the other countries, as long as you don't send in copies of
your passport or identity card to claim a fake lottery win ;).
------
pedalpete
What I find so surprising is the low dollar amounts that were racked up.
Sub 10k in fraudulent charges on an SSN that is published? Like this?
According to Wikipedia, Identity theft doesn't result in the high dollar
figures I was expecting
[http://en.wikipedia.org/wiki/Identity_theft#Spread_and_impac...](http://en.wikipedia.org/wiki/Identity_theft#Spread_and_impact)
------
lukeqsee
This goes to show how crafty identity thieves really are -- and how stupid it
is to let them get any bit of private data. If they can steal his identity,
why not yours? He has staked his reputation on LifeLock's services, and lost.
Maybe this will knock some sense into the share-all generation.
~~~
pxlpshr
Exactly why people have a right to be up in arms about Facebook changing
privacy policies without allowing users to opt-in voluntarily.
Identity theft is a serious issue, most young techies haven't been a victim
simply because time and risk haven't converged. It can impact your life for
years, making it extremely difficult to get a mortgage, car loan, or even land
a job in some cases.
~~~
lukeqsee
Yup. I myself am a "young techy." I started out not caring about the little
bits of data I let out. But now I'm really clamping down; identity theft is
too real to take a risk.
------
robinduckett
Similar thing happened to Jeremy Clarkson when he put his full bank account
details in his newspaper column, thinking that no one could actually withdraw
money from his account - he was wrong, someone used his details to sign up for
charity direct debits.
------
rmorrison
While there are obviously some issues with LifeLock, I really appreciate the
confidence he has in his product.
I wish there was a way for all CEOs to do something similar. Too bad it's kind
of difficult for, say, a social web service.
------
maxklein
Why don't you just implement ID cards like in Europe and skip this social
security nonsense? An ID card has your face and height on it, making it more
difficult for someone to pass as you.
------
melling
Seems like <http://www.lifelock.com> is getting a little slow.
Everyone is seeing if his ss# is still there?
------
RabidChihuahua
Oh...so much for that security.
| {
"pile_set_name": "HackerNews"
} |
Ask HN: What are the leading causes of cancer in San Francisco? - sean_patel
I saw this story and was alarmed. Why are there so many incidences of cancer in San Francisco for a relatively small population?
http://www.ucsf.edu/news/2016/11/404931/broad-new-partnership-launches-plan-reduce-cancer-san-francisco
Is it something to do with the old construction, or is it the air, or is it lifestyle? I was under the impression that most San Franciscans (including myself) lead an active healthy life(style).
======
dave_sullivan
A few things:
Cancer is basically the most likely thing to kill you besides heart disease. A
physically active population will have less death by heart disease and thus
more by cancer (because something has to kill you). So perhaps there's just
less heart disease in SF.
They say things like "rates are soaring" without saying how much and if it's
different than national averages.
This article has very little substance and is basically stating the fact that
a lot of people get cancer. IIRC, any given person today has a 25% chance of
getting cancer in their lifetime.
So don't worry: if you exercise and eat right, cancer will eventually get you
wherever you live.
(I should acknowledge the obvious of some environments are literally
carcinogenic, so if you live in a coal mine, 19th century London, or modern
Beijing, there's more cause for concern)
For anyone looking to learn more about cancer and its history, the book The
Emperor of All Maladies is very good.
~~~
taneq
> Cancer is basically the most likely thing to kill you besides heart disease.
> A physically active population will have less death by heart disease and
> thus more by cancer (because something has to kill you). So perhaps there's
> just less heart disease in SF.
True, and I'd take it a step further: Cancer is one of the two fundamental
failure modes of the human body. (The other one being death by senescence,
where cells self-terminate as they reach the Hayflick limit - which they do in
order to reduce the chances of a cancerous mutation.)
~~~
Gibbon1
Counter to that wisdom, though, the incidence of some types of cancer peaks and then falls with age. Testicular cancer is a pronounced example: the vast majority of its victims are young men.
------
icegreentea
Because the story is useless. Story shows that cancer death rate in SF is
~160-200 per 100,000 depending on demographic. Overall US cancer death rate is
about 170 per 100,000 ([https://www.cancer.gov/about-
cancer/understanding/statistics](https://www.cancer.gov/about-
cancer/understanding/statistics)).
The "real" story is SF has a lower rate of death from heart disease (which is
the leading national killer) than usual. And for reference, heart disease and
cancer nationally are almost neck and neck
([http://www.cdc.gov/nchs/fastats/leading-causes-of-
death.htm](http://www.cdc.gov/nchs/fastats/leading-causes-of-death.htm))
So yeah, it's a useless story.
------
lj3
According to SFHip[0], San Francisco's lung cancer cases per 100,000 is 49 and
dropping each year. Nation-wide, it's 62.4.
The linked article is light on stats and heavy on rhetoric. Maybe they're
using it to drum up funding?
[0]:
[http://www.sfhip.org/index.php?module=Indicators&controller=...](http://www.sfhip.org/index.php?module=Indicators&controller=index&action=view&indicatorId=303&localeId=275)
------
Gibbon1
I skimmed the article as closely as I could. It doesn't seem to say that there is anything particularly unusual about San Francisco cancer rates. Since it's UCSF, I assume the article is the hospital doing its primary job of protecting the residents of its host city.
Risk factors: San Francisco is a post-industrial city with a significant number of minorities, many foreign born.
Which brings up liver cancer. My understanding is liver cancer is highly
correlated with hepatitis B and C. A lot of South American and Asian immigrants
have been exposed to hepatitis B in their native countries. And they are thus
at higher risk. Hepatitis C is common among IV drug users. SF has a long
standing problem with IV drug use[1].
There is also a long-standing population of heavy drinkers, which is also associated with cancers of various types.
Possible there is a legacy population of people exposed to industrial
carcinogens. And maybe some legacy carcinogens. Scuttlebutt is people living
close to the old unremediated industrial areas have more health problems.
Course they also tend to be poor which is a risk factor itself.
[1] Once found five 20 something white guys in suits passed out on Natoma
Street at three in the afternoon. Two in front of my door and three farther
down the block.
~~~
sean_patel
> Once found five 20 something white guys in suits passed out on Natoma Street
> at three in the afternoon.
Wow. Castro? That's pretty crazy.
~~~
Gibbon1
South of market. Guys looked totally like any other guys in suits you see in
the Financial district.
------
sjg007
This is about increasing awareness and reducing the mortality rate by
screening at risk groups that have traditionally lacked access to
comprehensive care. It's not about rising cancer rates in general that aren't
otherwise explained.
------
stevebmark
Why would anyone on Hacker News know the answer to this question?
~~~
foota
There's a lot of people on hacker news with a lot of strange knowledge.
~~~
sean_patel
> There's a lot of people on hacker news with a lot of strange knowledge.
Exactly! And the comments speak for themselves.
------
happycodework
[https://np.reddit.com/r/AskScienceDiscussion/comments/59otfy...](https://np.reddit.com/r/AskScienceDiscussion/comments/59otfy/small_correllation_between_vaxx_rate_and_breast/)
I could use another brogrammer to verify some of what I'm seeing. I expected I
could just mention this to a researcher and be done, but I'm being attacked
for mentioning possibilities.
There was a catalyst in 1975 when vaxx rates doubled and kept doubling every 5
years; the graphs reflect that, imho. Other countries' data should confirm or
deny.
~~~
wavefunction
I downvoted you for using 'brogrammer' and for pushing bunk science.
Correlation Z: Rates of cancer have increased at the same time as the number
of articles claiming vaccines cause autism have increased.
| {
"pile_set_name": "HackerNews"
} |
Award-Winning High-Class Ultra Green Home Design in Canada: Midori Uchi - Mz
http://freshome.com/2014/06/04/award-winning-ultra-green-home-design-canada-midori-uchi/
======
Mz
Excerpt:
_Does your home produce more energy than it consumes? This one does... this
impressive Canadian home was named after its most prominent feature- its green
capabilities. “Midori Uchi” is Japanese for “Green Home”..._
And the article is filled with lovely photos.
| {
"pile_set_name": "HackerNews"
} |
The Future of Reading - ‘Reading Workshop’ Approach Lets Students Pick the Books - javanix
http://www.nytimes.com/2009/08/30/books/30reading.html?_r=1&hp
======
digamber_kamat
Very nice read
| {
"pile_set_name": "HackerNews"
} |
The Big Bitcoin Heist - yarapavan
https://www.vanityfair.com/news/2019/11/the-big-bitcoin-heist
======
mr_woozy
I'm in Iceland reading this now and almost every sentence of this article is
filled with embellishments and half truths
~~~
confiscate
care to give some examples? Because I am a curious reader who does not live in
Iceland and doesn't have the context to be able to identify the embellishments
~~~
mr_woozy
sure it's always frustrating because it does nothing to change the narrative
of what's repeated outside of Iceland regardless of the truth.
>"At 32, Stefansson is the most famous thief ever to emerge from this polite
and friendly island, ranked by the Global Peace Index as the world’s most
peaceful nation."
Patently false. So far our most famous criminal would be
(Tomas)[[https://www.nytimes.com/2017/02/07/world/europe/iceland-
murd...](https://www.nytimes.com/2017/02/07/world/europe/iceland-murder-
victim-birna-brjansdottir-autopsy.html)] or runner up for any of the various
banking / cartel that siphoned money out of the country during the crash.
Sindri doesn't even register with most people here except as a petty thug.
Bjarni Ben the old prime minister or Sigmundur David both appeared in the
Panama Papers here as well.
>It was cryptocurrency, ironically, that helped save Iceland after the bankers bankrupted it.
I have no idea where the author got this, but cryptocurrency has not provided any material benefit to Iceland in any form or shape; I say this even though I'm a BTC advocate. First it was our fishing industry which was able to
sell high abroad and return with Euros to exchange for ISK and then it was the
tourism boom (as much as we all hate it here) from 2014-onwards. Crypto
currency mining here doesn't employ anyone, doesn't get aggressively taxed (it
should) and often gets industrial market rates on large enough consumption
(just like the aluminum smelters).
Oh, and to correct another misunderstood thing about Iceland: we get around 70% of our power generation from HYDROpower, not Geothermal like everyone thinks
(that goes towards home heating mostly). Which means .......Dams, lots and
lots of dams flooding areas of Iceland and destroying the nature the tourists
come to see.
> Today, Bitcoin mines consume more energy than all of Iceland’s homes
> combined.
This is just repeated ad nauseam abroad now more than any other statement, and it's entirely attributed to a single electrical engineer at the electric producer company who was commenting back in 2018 on what would happen if the building trend kept up at the same rate. It obviously didn't, but any idiot journalist now stumbles across some _other_ article saying it so they just repeat it now too. Smelting
consumes FAR more electricity here than any other industry.
These are just a handful. Nobody likes these guys here, no one is cheering for them; we're mostly embarrassed that they're gleefully unaware of how badly they make Iceland look internationally, as the posterboys for big, dumb, meat-head rubes that were used by a foreign criminal who never got caught. Classic Icelandic hnakkar.
~~~
lostlogin
Thanks for this.
> Classic Icelandic hnakkar.
Google isn’t helping me much or I’m missing the point.
~~~
adamfeuer
This article explains a bit about it:
[http://icelandreport.blogspot.com/2006/03/necks.html](http://icelandreport.blogspot.com/2006/03/necks.html)
------
zepearl
> _In Iceland, it is not a crime to stage a prison break: The law recognizes
> that inmates, like all human beings, are naturally entitled to freedom, and
> thus cannot be punished for seeking it._
Funny :)
~~~
danielbln
The same is true in Germany. You'll probably break a few laws while breaking
out of a prison, but the act itself is not punishable and will not yield you
more time.
~~~
beerandt
What's the economic result? Any relative increase in attempts since there's no
incentive not to?
~~~
WJW
The sample size is so low you can't get any meaningful results. And use of
violence against a guard or breaking prison materials is still illegal (since
it's illegal to use violence against anyone and/or to destroy property). But
if the door is left open and you walk out without harming anyone, that in
itself is not illegal.
~~~
beerandt
So how does this work in the case of initial arrest? Or a fugitive that's not necessarily an escapee? There's no incentive to surrender?
~~~
zepearl
> _Or a fugitive thats not necessarily am escapee?_
I'd guess that that person, not being at that point of time a "detainee",
would get the outcome of the other laws related to "resisting arrest" or
something similar (like when police wants to do a check on you and/or your
vehicle, etc... and you run off).
------
paulpauper
yeah he stole a bunch of stuff that is probably already obsolete as of
publication of this article. huge cost too in transporting and setting it up.
there were and still are guys who made as much as that or even more just
quietly running crypto giveaway scams on YouTube and twitter without all the
fanfare of an actual heist. of all the thefts and heists involving bitcoin,
this one is downright unimpressive in terms of dollar amount.
>Mr. X told Stefansson that he would give him 15 percent of the profits from
as many Bitcoin computers as he could steal from data centers across Iceland.
The total take, Stefansson calculated, could be as much as $1.2 million a
year—“forever.” Because, with the stolen computers, Stefansson and Mr. X would
establish their own Bitcoin mine.
yes because mining computers never become obsolete
~~~
mr_woozy
>yes because mining computers never become obsolete
exactly!
------
brenden2
Here's an AP version of the same story (works in private mode):
[https://apnews.com/55117fb55a714e909fb9aaf08841a5d6/Bitcoin-...](https://apnews.com/55117fb55a714e909fb9aaf08841a5d6/Bitcoin-
heist:-600-powerful-computers-stolen-in-Iceland)
~~~
brenden2
And also about the escape:
[https://apnews.com/db543dc5d2774098acf398984ac7324a/Suspect-...](https://apnews.com/db543dc5d2774098acf398984ac7324a/Suspect-
in-Iceland's-'Big-Bitcoin-Heist'-escapes-prison)
------
GuiA
I thought bitcoin mining hardware was only useful for a few months at best due
to rising complexity built into the protocol? If so, the computers hidden by the
criminals won’t be very useful when they get out of prison a few years from
now.
~~~
seibelj
There is also a premium on fresh block rewards that have not been spent yet.
These are called “coinbase transactions” which is why the company is called
Coinbase. The reason these fresh coins are valuable is that they are untainted
by prior transactions and have various uses because of that. So mining BTC has
an additional premium above the spot price of BTC.
~~~
blotter_paper
> The reason these fresh coins are valuable is that they are untainted by
> prior transactions and have various uses because of that.
What "various uses" are you talking about? Are you talking about fears of
future blacklisting, or something else? If fears of future blacklisting, the
only use I can see would be speculative holding. I've written raw Bitcoin
transactions, and I don't know of any technical distinctions between coins
that would change how they can be used based on prior transactions.
~~~
seibelj
If you have fresh coins you can say you mined them which is good for financial
reasons. They also have zero prior history so companies like Chainalysis
cannot link them anywhere.
I really don’t have more to say on this subject, just think about it for a
while.
------
ur-whale
[http://archive.is/np6rz](http://archive.is/np6rz)
------
sschueller
March 2018
------
ajross
All I could think when reading that headline was "Which one?". Frankly this
incident (a physical theft of mining hardware) isn't even that interesting in
a "bitcoin" sense. Presumably the hardware was attractive because it's easily
liquidatable (being used by inherently gray market folks already) and not
because of any value from their bitcoin role per se.
------
unnouinceput
550 Bitcoin computers? I assume they are the dedicated type, not general ones like the one I use to type my comment. If so, those go obsolete in like half a year. The heist, with its mined bitcoins, is worth a maximum of several million (USD) at best, even at 20k USD/bitcoin. Become a politician and you'll get that money and much more in those 4 years of a single mandate, all legal because they are called donations.
~~~
kortilla
No: [https://www.fec.gov/help-candidates-and-committees/making-
di...](https://www.fec.gov/help-candidates-and-committees/making-
disbursements/personal-use/)
------
paulpauper
>In his teens, Stefansson graduated to drugs: pot, speed, cocaine, ecstasy,
LSD. By the time he turned 20, he was growing cannabis. His rap sheet soon
included 200 cases of petty crime. He broke into people’s homes to steal TVs
and stereos, and somehow managed to extract $10,000 from some slot machines in
a Reykjavík bar.
yikes. I'm sure he's hardly the only one. It makes you wonder if Iceland's purported low crime and peacefulness is due to under-reporting of crime, crime being ignored, and excessively lenient sentencing guidelines, rather than the absence of actual crime. If all the criminals keep being released over and over, then the prison population will be very low yet there will be crime everywhere anyway. Lenient sentences heavily tip the moral calculus in favor of criminals, both petty and major. Time is money. Time behind bars means less time for stealing.
~~~
mr_woozy
It is a large reason actually, we don't have enough well funded programs for
mental help or drug addiction and roughly 70% of prison inmates are there for
drugs and just cycle in and out running up huge fines which they'll never
actually be able to pay.
Also please keep in mind that because of our sovereignty but low population we win
a lot of categories for "most this of any nation" due to sheer per-capita
mathematics. We're basically the most-per-capita country out there in many
ways.
| {
"pile_set_name": "HackerNews"
} |
Is Algebra Necessary? - ColinWright
http://www.nytimes.com/2012/07/29/opinion/sunday/is-algebra-necessary.html?_r=1&smid=tw-share&pagewanted=all
======
Steuard
It's easy to recognize that the author's arguments could apply just as well to
any academic subject: literature, history, you name it. ("We should just teach
'citizen reading', where students learn to read recipes and furniture assembly
instructions.")
But the real surprise to me is his ignorance of actual college math curricula.
He says, "Why not mathematics in art and music — even poetry — along with its
role in assorted sciences? The aim would be to treat mathematics as a liberal
art". I don't know about his college, but the math requirement where I teach
can be met with courses such as "Math in Art and Nature" or "Liberal Arts
Mathematics". It's not as if his suggestions there are novel! But here's the
kicker: for both of those classes, proficiency in algebra is a prerequisite.
It turns out that you can't really describe those topics that he likes without
actually using some math.
On the other hand, I _have_ come around to agree that statistics is more
broadly useful than calculus. Here's a TED talk by one of my old math
professors making that case:
[http://www.ted.com/talks/arthur_benjamin_s_formula_for_chang...](http://www.ted.com/talks/arthur_benjamin_s_formula_for_changing_math_education.html)
~~~
daleroberts
I don't get that TED talk. For example, how are you meant to properly
understand the ubiquitous Gaussian distribution without learning Calculus
first?
~~~
Steuard
I may not entirely understand your question. My wife has taught college statistics many times (including Gaussian distributions), and her
classes have never had a calculus prerequisite.
Are some topics easier to understand if you already know calculus? Sure. (I
assume she has to do the same sort of brief "area under a curve" explanations
there that I have to do when I teach algebra-based physics.) Can you
understand the topic in greater depth using calculus? Of course. But for a
first exposure to basic statistics I think you can mostly dodge the issue.
(And really, apart from already knowing the concept of an integral, does
knowing calculus really buy you much when studying Gaussian distributions? You
can't even _do_ those integrals! That frustration might be even more annoying
to a calculus student than to others.)
~~~
kaiwetzel
I've taken an introduction to statistics course with social science students
and based on that experience, I can relate to what you are saying, 100%.
However, I think there is a broad group of students[1] for which a
significantly earlier exposure to calculus would be beneficial and make
learning statistics (and physics) a lot easier or at least faster.
When I took introduction to statistics as a math major, I found the subject
extremely confusing because the discrete and continuous cases were taught completely disconnected, and useful anchors for understanding such as basic measure theory and Lebesgue integration were left out. That's certainly a
good way to teach for many but for some it doesn't work.
A similar case was physics for me (classical mechanics in particular). From
grade 5 to 10 (after which I avoided the subject) there was little insight
gained (e.g. heavy things fall down, there may be some friction, memorize all
those seemingly random formulas and if you use a long lever, make sure you
pick a strong material). Then I was exposed to an introduction to physics
course at university (for non-majors) and the revelation that all those random
formulas have a strong grounding in just 3 general principles and can then be
developed with some help from calculus was liberating. Just too late in my
case. Maybe I would have loved physics and actually studied it, had they told me
in 7th grade that there is something tying all of it together, and the
ultimate goal of the class was to reach that summit. Just trying to show the
other side of the coin which should be integrated into the way math and
science is taught in schools in my opinion :-)
[1] Say, the top 5-10% of middle school students.
~~~
Steuard
Oh, don't get me wrong: _I_ was served very well by today's standard math
sequence. I learned fascinating stuff in precalc, and calculus was a
revelation and a profound joy. Prof. Benjamin's argument favoring statistics
instead was a tough sell for me.
But I'm a theoretical physicist. As much as I hate to say it, structuring the
entire standard math curriculum so it works best for kids like me (or even for
the top 10% of students) just isn't reasonable. (Ideally, a solid gifted
program could fill that gap.) I think that we agree on that.
I'd like to think that there are ways of introducing concepts from physics or
statistics that do highlight the underlying structure of the field, even if
the students don't yet know all of the math they'd need to work through the
details themselves. If I find a perfect way to do it, I'll let you know!
~~~
bunderbunder
One approach I've seen and thought was interesting is the course schedule
offered at the Illinois Math and Science Academy.
Their pre-calculus courses have been somewhat radically reorganized into a
curriculum called "mathematical investigations" which orders the topics
according to more of a practical progression. So, for example, bits of linear
algebra are pulled all the way up into precalc because they're useful in
geometry, and will also work better with the science curriculum: perhaps physics teachers will inherit students who already understand vectors, for example.
Then the calculus curriculum is split into two tracks, one more basic, and a
more intensive one for students who anticipate going into fields that require
more calculus.
------
bluekeybox
Recent example. A family member who is an artist asked me about submission
rules to a guild competition (now, of all people, the artists are commonly
believed to require algebra knowledge the least). The flyer indicated, "the
photos/scans you submit must be 300dpi" (dots/pixels per inch). Which is
perfectly confusing because even a 100x100 pixel thumbnail can be considered
300dpi if you set its width to 1/3 inches. When I tried to explain said family
member that the guideline is confusing, that whoever wrote it probably failed
their high-school algebra, he started to mistrust what I'm saying. Words can't
explain how dumb the situation was: one idiot writes a guideline that can't be
followed (or rather one that can be satisfied even by a still from a security
camera), another takes what is written on blind faith and refuses to use logic
to understand why the guideline is insufficient.
Now imagine dealing with people who never completed high-school algebra but
who are trying to calculate dosage of a drug. According to a pharmacist I
knew, a child died on his watch because of incorrect dosage.
~~~
benzofuran
Hold on there hot shot, they're likely saying the required scanning resolution
of submittals. While not incredibly clear, 300dpi is the general high-
resolution setting on most scanners and they're just indicating as such.
------
benzofuran
This is an interesting read, but the author is also a political science
professor. If this article had been by a mathematics educator, it might be a
bit more persuasive.
I'd argue that algebra is fundamental - the excerpts taken involve the
quadratic equation and other supposedly tedious tasks, while ignoring that
these are foundations for higher level problem solving. In almost any field,
somebody will ask at some point "how many X do I need for Y" - and it's
usually not that clear.
I would almost posit that the author is trolling the NY Times, but today
that's probably not the case. While math is hard for some students, it's an
essential part of education for just about anybody that hopes to be a mildly
functional member of society.
------
Thrymr
I was forced to read _The Scarlet Letter_ in high school, and I have never
once had to use that knowledge in my professional life.
I memorized dates for history exams in high school, and that knowledge has had
no practical impact on my life whatsoever.
When introducing myself to (non-math/science) academics and telling them I was
a (now former) physics professor, I can't tell you how often the first thing
out of their mouth was something along the lines of "I was never very good at
math" or "I never liked science." I frequently have had doctors tell me how
much they hated physics as a pre-med student. That kind of pride in ignorance
is quite rare in the opposite direction. The most broadly educated people I
know are scientists.
[edit: typo fix]
~~~
Thrymr
As a followup, a nice blog essay on the theme of intellectual contempt for
math and science:
[http://scienceblogs.com/principles/2008/07/26/the-
innumeracy...](http://scienceblogs.com/principles/2008/07/26/the-innumeracy-
of-intellectual/)
I often wanted to call bullshit on those people too, and reply with something
like "I always hated reading" or "What use is Shakespeare, anyway?" Of course,
I never did, out of both social politeness and an actual respect for the
humanities and what they bring to civilization.
------
Tooluka
Yes it is.
You can easily swap solving math problems for writing essays in that article
and get the same conclusion. Also the whole article completely lacks any
statistic basis. 42% of students didn't pass their bachelor exams and 57%
students of one university (not saying which faculties were included in
statistics) didn't pass algebra course (one math course only is mentioned). So
what? How is that connected? What are the long term trends? What about other
courses and other universities? Same applies for all the article - throw
together random numbers that SEEM to be related (they may be, of course).
Very shallow and probably incorrect article.
~~~
nextstep
I agree with what you are saying, and I think algebra is an incredibly
important skill. However, the article touches on an underlying issue that I
think is true: the US education system does not prepare people for certain
jobs that are in high demand. Many jobs that could be learned at a vocational
or "trade" school don't require much high school math.
~~~
gersh
You need Algebra even for "trade" school. According to
<http://www.njatc.org/training/apprenticeship/index.aspx>, you need "One Year
of High School Algebra" to become an electrician. I think the same is true for
other trades.
------
tsahyt
This is a perfect example of how Betteridge's Law can fail. Yes, algebra is
necessary. Very much so. It's basically the first time students are required
to operate on a formal system, which requires strictness and analytical
thinking. As another poster has pointed out, algebra is a gym for your brain.
Also, algebra has tons and tons of applications. Anybody claiming they "never
needed to solve for x" has completely missed the point of symbols. They have
certainly solved for x but haven't realized it yet.
The article lists arguments which can be applied to about everything else you
learn in secondary education. How come nobody ever complains about learning
literature, arts, music or whatever, but people seem to insist that they'll
never need math in their jobs, when in fact, math has made their lives
possible as they know it.
</rant>
------
photon137
"But there’s no evidence that being able to prove (x² + y²)² = (x² - y²)² +
(2xy)² leads to more credible political opinions or social analysis"
Nothing else leads to that either. Political opinions and social analysis have
zilch value in understanding the Universe we live in. Math does.
------
sreyaNotfilc
Hmm.. Algebra is not hard. It's actually pretty easy if you know what you're
looking for. Heck, the professor even tells you the answers before every test.
Just follow the steps.
That's the problem though. There are many steps to be learned. Things like
"order of operations" and "FOIL" are necessary to get the correct answer. Its
not that learning Algebra is hard, its the discipline that comes from the.
Trouble is, most young people don't see the rewards in learning this
discipline.
For example, here's a few other disciplines that take time but seem more rewarding to a young student (or anyone):
Swimming - can have more fun at the beach
Martial Arts - self defense and confidence, and proof that you're a bad ass
Learning a FPS - bragging rights, more fun online against friends and other gamers
Guitar Lessons - sing your favorite songs, perform live, get lots of friends/admirers
Algebra - learn about long term goals and ... processes?
Kids are not into the "long term". They are into the short term. Thus, they
are kids. Adults need to show them why it is important to learn how to find "X". Unfortunately, most adults would rather stay far away from that stuff because they had a hard time with it as children as well. It's painful to them. It's easier to tell the child "because it is important" than to explain why.
I personally think algebra is very cool. I like how things can work out just
because you know a proven formula for solving that problem. But then again,
that's why I (and other CS) make the big bucks. We do the things that no one
else wants to deal with.
Using your brain is hard. Solving a tough equation is probably equivalent to
running a mile. It can be done. It can be fun. Yet, you don't see many people
with long slender bodies running around everywhere, do you?
Everyone has their niche. It's really up to them to find it.
------
irreverentbits
I find it challenging to believe that individuals incapable of basic, entry-
level algebra are really qualified to attend a university in the first place.
Having many good friends in disciplines which will never require the use of
any sort of mathematics, I can safely say that none of them have had trouble
with high-school level algebra.
Portions of the article strike me as a depressing appeal to entitlement
amongst the lowest common denominator of students to a university education.
------
stcredzero
True story. I was having trouble sleeping and I went to a big box home
improvement store for some custom cut roller blinds. I measured the blinds to
the nearest 1/8 inch so less light would leak around the edges. I grab the
blinds and go to the blind cutting machine, which has an arm marked off like a
ruler to 16ths of an inch, with the cutting blade held by a screw clamp. There
are no detents or grooves to force discrete measurements, just a screw clamp,
so this machine can do any length between its min and max length.
Well, the young woman who comes looks at the measurements I wrote down and
looks at me and says, "The machine doesn't do fractions."
I was floored. She was lying to cover her innumeracy, because _she_ didn't do
fractions. If this is indicative of a trend, it's bad news for the future of
the United States.
~~~
Steuard
More charitably, she wasn't lying, she was just ignorant (and therefore
wrong). An innumerate person might simply not recognize all those extra marks
as having any meaning at all, or at least not as having anything to do with
those dreaded "fractions" from school.
------
jayferd
Making a subject mandatory is a great recipe for sucking all the life out of
it, and mathematics is one of the best examples. Is algebra awesome?
Absolutely. Does every member of our society need to be able to factor
polynomials? ...not really, especially not at the expense of a child's natural
love of learning.
------
shin_lao
See algebra as the gym for your brain. Although you might not need algebra,
you will benefit from the exercises.
~~~
kalid
I think that's a dangerous line of reasoning. Why not learn to memorize
ingredients on a can of soup? That is also exercise for your brain.
Algebra lets us describe relationships between unknown quantities. Nearly
every physical law is expressed as an algebraic equation (or Calculus, which
requires it). It's the lingua franca of describing the world. That's why we
need it :).
~~~
shin_lao
Memorizing trains memory, that's not the same exercise.
------
s_baby
Understanding variables is a hallmark of the "formal operational" stage of
development.[1] It's the same reason why we hear some people "just don't get"
programming. Same reason why languages like LOGO avoid this construct.
If grasping Algebra is actually about attaining this developmental stage, we
need to be approaching the problem on a more fundamental level. Kids will move
through these stages at a different pace and if you're on the tail end of
developmental pace you're going to fall through the cracks.
[1]
[http://en.wikipedia.org/wiki/Piaget%27s_theory_of_cognitive_...](http://en.wikipedia.org/wiki/Piaget%27s_theory_of_cognitive_development#Formal_operational_stage)
~~~
kaiwetzel
I had totally forgotten about this :-(
Searching a little I found this interesting document addressing some of the
problems which I have put on my reading list: "The Science of Thinking, and
Science for Thinking: A Description of Cognitive Acceleration through Science
Education (CASE)" (Philip Adey, 1999).
Assuming the idea of a "formal operational" stage of development applies, the
situation looks abysmal (at least in the US and Germany, can't say much about
other countries):
From the little I gathered so far (on the internet, so it has to be taken with
a grain of salt) it seems that (1) a vast majority (over 60%) of people never
reach formal operational maturity. (2) Ideas of how teaching can actually help
with it are in it's infancy. (3) Application of said ideas is not very far
along. (4) Educational systems keep leaving many (or most) students behind
early, especially in math and science, while other students get bored and
waste their time in class, being taught a mind-choking curriculum.
Ok, I guess I'm ranting now :-) Saddens me greatly, though.
[1]
[http://www.ibe.unesco.org/fileadmin/user_upload/archive/publ...](http://www.ibe.unesco.org/fileadmin/user_upload/archive/publications/innodata/inno02.pdf)
edit: spelling, grammar
------
stcredzero
There was a recent discussion on reddit following some Khan Academy
controversy, where some education people mentioned some research about the
concept of slope.
It turns out that a significant number of people don't understand speed as a
rate. They think of it as an intensity, as in volume of sound or brightness.
(Those are flux, which are related to rate, though that's not how we perceive
them.)
People like that are alien to me. I think it also explains why freeway drivers
in Houston often have about 0.3 seconds of separation between cars.
------
yummyfajitas
Math literacy is valuable even for regular people.
Case in point - an innumerate relative of mine has been duped into selling
"Nu-skin" products for virtually no money. Nu-skin gave her documents
explicitly stating her odds - 99.5% of active nu-skin sellers make <
$15k/year. But they also showed her videos of people who won won carribean
cruises and made thousands/millions.
Guess which one she bought into?
A math literate person would recognize that her odds are better working at
chipotle and reinvesting some of the proceeds in vegas.
~~~
telemachos
I agree with your main claim about the value of mathematical literacy, but
your example is terrible. It's not really about math at all. It seems to be
about decision making and many people's tendency to choose very faint hopes,
no matter how well they understand (on some level) that the odds are massively
against them. I very much doubt that's about math _per se_. That is, your
relative didn't misunderstand how little 99.5 leaves from 100. She simply
isn't a logical machine.
tl;dr What a person recognizes and what a person acts on are _not_ always the
same.
~~~
yummyfajitas
Having actually had conversations with her about the matter, I can tell you
that the fundamental problem is she just doesn't get what 99.5% means.
A numerate person looks at things like this as a math problem. An innumerate
person doesn't.
~~~
UK-AL
It's hard not to understand that 0.5% is not a lot. It seems to me she knew the risks, but ignored them.
~~~
Someone
Not a lot? As odds in a lottery that pays millions to the winner (original
remark said thousands/millions, but that 'thousands' would seem less than the
<15K that the 99.5 would get) and still pays out to everybody, it looks
incredibly enticing.
The lie with these kinds of things is not as much in those percentages as it
is that it is not a lottery. There are people who can sell anything; they are
the winners. Also, typically, there is some kind of pyramid scheme involved.
You get stats on the early birds, but you cannot become an early bird
yourself.
~~~
UK-AL
When you're talking income, you don't want high risk/high payout, because you're going to have serious cash flow problems if you don't win, which is the most likely case.
A small chance of winning, but with otherwise serious cash problems? Or high
probability of a nice stable income?
I know what I would choose.
~~~
Someone
I know what I would choose, too, but I have an income. The target for such
schemes is people who don't have that, or have a really low one. For them, the
options look like "about what I get now, and I can choose my own working
hours, and no longer have to listen to a boss" and "bingo".
And that likely is true. If you choose any 20 hours to work each day, 7 days a
week and have some talent for sales, you likely will make that 15K.
------
evoxed
> Mathematics, both pure and applied, is integral to our civilization, whether
> the realm is aesthetic or electronic. But for most adults, it is more feared
> or revered than understood.
This is very true, but lowering expectations is hardly a solution. Ignorance
(in the form of oh-well-that's-the-class-smart-people-take) is no better than
fear. Before we set the bar too low, let's keep pushing to make mathematics
_less_ intimidating, _less_ foreign and easier to learn for everyone.
------
simonster
I agree that algebra in and of itself is not particularly useful. However, an
in-depth understanding of statistics is vital to basic scientific literacy and
most certainly "leads to more credible political opinions or social analysis.
I've often wondered why high schools don't place a bigger emphasis on
statistics, but I'm not sure if you can teach basic statistics without algebra
(or calculus).
~~~
Steuard
Most basic statistics classes that I've seen don't require calculus. My wife
(who's taught stats in college many times) claims that they don't _really_ use
algebra all that much, either (you aren't generally solving quadratics or
anything), but they certainly require the comfort with complicated symbols and
formulas that algebra classes teach.
Have a look at the TED talk linked in my top-level comment for an argument
along the lines of what you're making here.
------
ggwicz
I think algebra is necessary, and calculus, and geometry, and trigonometry,
and everything else I don't understand in the realm of mathematics.
But they're important in the context of real life.
So, yes, we should keep these classes, keep having kids go through algebra and
calculus and geometry...
...But you shouldn't be bound by arbitrary rules. Algebra is important, but if
you find "x" by doing something other than some arbitrary thing where you
subtract both sides, etc., you shouldn't get an F in the class.
Same with calculus, geometry, everything. The importance is the thinking and
the logic, and what real-life application you can take from your knowledge.
Making hard rules for these math courses, for example, definitely hurts this
and does some of the things this article claims.
But true, honest exploration in math and thinking about it is very important,
and if it's free and done in an honest way there is no question about whether
it's necessary or not.
Richard Feynman covered this better than I ever could:
<http://www.youtube.com/watch?v=5ZED4gITL28>
~~~
simcop2387
> ...But you shouldn't be bound by arbitrary rules. Algebra is important, but
> if you find "x" by doing something other than some arbitrary thing where you
> subtract both sides, etc., you shouldn't get an F in the class.
In middle school at least this kind of thing was allowed as long as we could
show why what we did worked. The idea being that we would have to understand
what we did in order to know either when it would work or when it would fail
so that we could apply it correctly. If we couldn't do that we weren't given full
credit for anything because the homework/tests were meant to check that we
understood how to get the correct answer (other than copying from the nerds
like me).
------
jeffdavis
It's hard for me to comprehend that someone could understand any subtle (or
even not-so-subtle) distinctions or complex arguments without at least a basic
understanding of algebra and statistics.
I could be wrong though. What about famous thinkers like Jefferson or Lincoln
-- did they understand algebra and statistics at all?
~~~
Thrymr
Jefferson was quite well educated in science and mathematics for the day:
<http://www.math.virginia.edu/Jefferson/jefferson.htm>
Geometry was prominent in those days, but as a student he also learned physics
with Newton's _Principia_ and _Opticks_ as textbooks.
Lincoln was mostly self-educated, but I have no idea what his background in
math was.
------
jeffdavis
It seems likely to me that algebra during school is a cause for dropout not
because it's harder, but because it's more objective.
If someone doesn't know anything about history, you can pass them along,
claiming that they wrote a few essays or something. It's fraud, and any expert
can see it, but it's easy enough to ignore it if you try.
But with algebra you can't ignore failure. It's obvious. Even the most basic
tests will reveal ignorance quickly.
I conjecture that those people who drop out "because of algebra" are not
proficient in any subject.
I observe that the alternatives suggested are less objective and more "hands-
on". What does it mean to learn about the consumer price index without
learning basic algebra? The only explanation I can think of is that it offers
more opportunity to ignore educational fraud.
------
abtinf
How is this not an article from The Onion?
------
heycosmo
To the extent that algebra courses consist of solving # + x = # over and over
again, I agree that they are mostly useless. In life, algebraic questions
don't come at you in symbols and numbers, they come in words. And I would
argue that these questions surround us! It's just a matter of people
recognizing that they are there. Therefore, in my opinion, a good algebra
course (which is essential!) focuses on problem solving. Necessarily, the
course would involve some mindless equation solving to learn the framework of
algebra.
------
bickfordb
I'm biased since I'm a software engineer, but I believe algebra is extremely
valuable. I can't imagine not knowing it! If I were to revise math education I
would balance the amount of time spent on geometry (1 year HS), trig (1 year
HS) and calculus (1 year HS, 2.5 yrs undergrad) better with discrete math,
linear algebra, probability and statistics. The current system seems to be
disposed toward creating 50's NASA fodder and economists.
------
artlogic
I rarely comment, but I feel compelled to add to this discussion.
Algebra, as it is taught today in the U.S., is not necessary, and is probably
detrimental in many of the ways stated in the article. I can't speak for non-
traditional schools, but public schools, with their focus on standardized
testing, have effectively destroyed the original spirit behind teaching
mathematics.
Mathematics, much like other academic topics (e.g. literary analysis), was
taught not as a practical skill, but as a way to improve your abstract
thinking and problem solving skills. As another commenter stated, Mathematics
(along with most later academics) should be a gymnasium for your brain. I like
this metaphor, because if you think about solo athletics, all students are not
expected to achieve at a pre-defined level. Rather students are evaluated
based on improvement in performance over time.
Sadly, mathematics in recent years has become less about abstract thinking and
problem solving and more about rote memorization, computation, and
application. I taught basic algebra to college students for a semester. One of
my most interesting experiences was with the dreaded story problem. Most of
the students were simply unable to apply math to solve a problem. They were
all very good at computation, but when faced with a problem in a form they
didn't recognize they instantly began flailing.
Standardized testing has turned math into the process of recognizing a form,
plugging in the numbers, and computing. We're now starting to see the first
generation of math teachers that are a product of standardized testing, and
the results are frankly, frightening. I've spoken with younger math teachers
who couldn't explain the practical importance of their subject. While they
loved math, they couldn't tell me why they were teaching it, other than: "It's
on the [standardized] test."
This is the kind of Algebra we don't need.
* * *
Incidentally, the only way I avoided the shocking deficiencies of standard
high school "math" was by fighting my way into an advanced program at the
local University. It was there I was introduced to Euclidean geometry and
really learned what math was all about. I think everyone should learn Geometry
using the Euclidean method. I learned from a simplified book (Geometry, by
Moise/Downs) which starts out with a few more postulates than Euclid did. I'd
go as far to say that geometry (properly taught) will make you a better
programmer/problem solver/thinker.
------
kghose
Yes it is and we need to train teachers to teach it better.
------
serichsen
Yes.
------
alpine
I stopped reading the moment I encountered 'I say this as a writer and social
scientist...'
~~~
esrtbebtse
Do you believe that writers and social scientists have nothing to add to this
discussion? Not everyone is a scientist, not everyone is a mathematician, not
everyone is an expert in education.
Equally, not everyone's opinion is equally valid, but they too may have
something to add. In particular, find people with full and meaningful lives
who have _not_ done algebra, and that will show that it's not essential.
Showing them that their lives would be better _with_ a working knowledge of
algebra would be a challenge worth considering.
~~~
alpine
_Do you believe that writers and social scientists have nothing to add to this
discussion?_
Mostly, Yes.
------
moron
Telling that students struggling with a subject is considered evidence of a
problem with the subject. Couldn't be the teachers or the curriculum or the
students.
~~~
bluekeybox
In math, a good teacher goes further than in any other subject. For example,
in Eastern Europe, the teaching facilities have been very poor in the past
century, but it is ridiculous how many influential 20th century mathematicians
were Hungarian. I couldn't understand this phenomenon for a long time until I
realized that it nearly entirely has to do with the fact that most of those
mathematicians studied under eminent Austro-Hungarian mathematicians who
studied under eminent German mathematicians who studied under eminent French
or Italian mathematicians all the way back to the Renaissance. Look up Paul
Erdos (<http://genealogy.math.ndsu.nodak.edu/id.php?id=19470>) for example,
and keep clicking on "advisor" until you reach 15th century Italy.
This is why I love online learning and have good hopes for the Khan academy.
Projects like that can set us/our children free from the plague of bad
teaching.
| {
"pile_set_name": "HackerNews"
} |
Measurement of Higgs Boson Mass at √s = 7 and 8 TeV on ATLAS and CMS Experiments - based2
http://journals.aps.org/prl/abstract/10.1103/PhysRevLett.114.191803
======
AHHspiders
[http://arxiv.org/pdf/1503.07589.pdf](http://arxiv.org/pdf/1503.07589.pdf)
------
chmaynard
Note to based2: If you're going to post a link to a scientific article behind
a paywall, at least tell us why we should care about the result.
~~~
Osmium
I don't think it's behind a paywall; it's CC licensed. I can access it and I'm
not connected to an academic network at the moment.
On a broader note, there's a big push for open access at the moment (as there
should be), so there's no longer a need to assume that a paper is behind a
paywall just because it's on a journal's website anymore.
| {
"pile_set_name": "HackerNews"
} |
Show HN: Contentibl – Native iOS apps for WordPress sites - ryanworl
http://contentibl.com
======
crazychrome
trust me, it's not going to work.
tried similar thing on facebook page 2.5 years ago. the reason is that unless
you have a deep pocket, your non-revolutionary product has little hope.
besides, you'll have problems with App Store Review Guidelines 2.13 sooner or
later.
anyway, good luck!
| {
"pile_set_name": "HackerNews"
} |
7,500 Online Shoppers Unknowingly Sold Their Souls - willphipps
http://www.foxnews.com/scitech/2010/04/15/online-shoppers-unknowingly-sold-souls/
======
otakucode
Most people would take this as a simple joke, but I think there's far more to
it. Not the idea of 'soul ownership', but the fact that there exist
essentially NO consumer protections whatsoever when it comes to these "digital
licenses" that the gaming industry is increasingly using. With these licenses,
gamers are stripped of nearly every single right they would retain if they had
bought the game in retail form. It would be illegal in most countries for the
sellers of any retail product to restrict the rights of the consumer in the
way that game companies do to gamers. No seller can forbid you from selling
the product you buy to someone else. With game licenses, they do. No seller
can forbid you from using the product and then giving it to a friend. With
game licenses, they do. No seller can forbid you from allowing a friend to
borrow the item. With game licenses, they do. The list goes on, and is quite
long.
As more and more transactions take place involving 'licenses' instead of
transfer of traditional goods, we are losing a tremendous amount of freedom
with how we interact with and use our purchases. I think it would be a good
idea for people to start standing up and talking to their representatives
about the need for consumer protection laws in the arena of digital licenses.
The way it is going, we'll only end up with such things if the companies end
up crossing the line and doing something entirely legal that would infuriate
the general public to a great degree, such as Apple locking every iTunes
customer out of music they already bought a license for until they paid
another $1 per track to re-purchase access to it. That would be completely
legal and within their rights, and the consumers would have no legal grounds
to comaplin at all. They signed away their rights. In EVERY other area of
commerce, there are laws preventing consumers from even being capable of
signing away such rights, and preventing sellers from exploiting their
customers in such ways. Not digital media, though. There are no protections at
all when it comes to 'licenses.'
~~~
grellas
It is generally unlawful under common law in the U.S. to place absolute
restraints on what is called the "legal power of alienation" when someone
sells you property. In other words, once you acquire ownership of property
from someone, you normally are free to re-sell it or give it away or whatever.
The law will generally strike down attempts of seller to attach strings to the
grant so as to burden what you can do with it. Attempts by sellers to put
_absolute_ restraints of this type on their buyers are generally struck down
as being against public policy. The idea is that society strongly benefits
from having property be freely transferable and no one therefore should have
the power to tie it up by placing contractual restrictions in a grant by which
the property is transferred. A more extreme situation pertained as well under
the old system of inheritance, by which property was bequeathed with "fee
tail" transfers stipulating, e.g., that the property would forever remain in
the male line of the family (the subject of so many Jane Austen novels). That
system was eventually replace by a "fee simple" system of inheritance that
severely limited a decedent's power to tie up property after his death.
With licenses, the law gives the owner vast discretion in how to fashion a
license because it is treated as a limited grant that can be shaped almost
entirely by the terms of a contract. To date, courts and legislators have not
seen the need to place "public policy" restrictions on the way licenses are
fashioned and freedom of contract continues to rule unimpeded in this area. In
other words, if you agree to it, you are bound by it.
What the article here underscores, though, is that the very idea of a contract
is something of a fiction when it comes to downloads. That does not mean it is
not upheld in courts of law. The law very frequently uses fictions that make
it convenient for commercial transactions to occur. Thus, if one must "accept"
contractual terms of use in order to download a product by clicking on
something saying that he accepts them, and in reality almost no one reads the
stupid things, this doesn't mean that there hasn't been a meeting of the minds
such as to form a binding contract. While there has in reality been no
"meeting of the minds" because the one party has not bothered to read the
contract, the law presumes that any responsible person would do so and
therefore says that you are bound by the terms whether you in fact read them
or not. Hence, the fiction that there is a true meeting of the minds in such
cases. If it were not for such fictions, chaos would prevail and this form of
commerce would cease to exist. Since it benefits society to have online forms
of digital media distribution, the law supports the fiction to enable this
form of commerce to exist by protecting the interest of vendors of digital
media through the idea of binding contractual restrictions. That part is not
about to change even though most people do not in fact read the contractual
terms (as this piece very cleverly shows).
It is another matter altogether whether any given term that happens to be
included in such contracts is enforceable or not (even though the contract as
a whole may be). Of course, a clause that makes the contract literally a
Faustian bargain would not stand but most clauses will, including the ones of
which you complain. I don't think courts will normally take the initiative to
strike these down, but legislatures may be persuaded to do something if
significant social policy concerns can be highlighted and made compelling. As
indicated above, this has happened in the past in analogous areas of law.
~~~
lkijuhyghjm
Like when you thought you had bought a movie, but it turns out you don't have
the right to watch it on your computer, or in another country, or in your car,
or skip the trailers.
But strangely when the plastic disk breaks you don't get a new replacement one
- even though all you bought was a licence to watch the contents
------
jjs
They'd have a hard time getting them from U.S.-based customers: souls, being
human remains, cannot be sent by US Mail.
([http://improbable.com/airchives/paperair/volume6/v6i4/postal...](http://improbable.com/airchives/paperair/volume6/v6i4/postal-6-4.html))
~~~
roc
Only if we beg the questions of whether the soul is indelibly human _and_
whether it's physical.
~~~
jjs
To be "remains", it needn't be indelibly human, just originally human. (e.g.
cremated ashes).
I don't think the rule specified that it had to be _physical_ remains,
although that would certainly make it easier to package.
~~~
sliverstorm
Some niggling little thing in the back of my mind is saying human remains are
one of the few things explicitly prohibited by the US Postal Service.
~~~
jjs
_Some niggling little thing in the back of my mind is saying human remains are
one of the few things explicitly prohibited by the US Postal Service._
Like this, perhaps? <http://news.ycombinator.com/item?id=1270582>
~~~
sliverstorm
Huh. I hadn't seen that yet. Sorry for repeating what you said!
------
chaosmachine
<http://news.ycombinator.com/item?id=1270058>
~~~
willphipps
you guys are way ahead of me, man..
~~~
RiderOfGiraffes
Maybe, but your submission, 2 hours later, has got twice the karma but half
the comments. Go figure. It's all random.
~~~
chaosmachine
It's all about the headline. A good headline will generate twice the upvotes
in half the time. A lot of great stories never make it to the front page
because the headlines are terrible.
~~~
RiderOfGiraffes
And yet we're told not to editorialize, so I "wasn't allowed" to change the
headline.
Additionally, the other story was already on the front page when this one was
submitted.
Still, who cares.
------
RandolphCarter
Unfortunately dread Cthulhu got my soul long ago, so I would have to opt out
of this particular licensing agreement.
Ia! Ia! Cthulhu fhtagn!
:-)
------
hkuo
How do we know that these shoppers did not, in fact, knowingly submit their
souls to this company? Like, hey, why not? No biggie.
In all honesty, this just illustrates a very common UI practice that opt-outs
are more effective than opt-ins.
~~~
Estragon
The $5 voucher offered for opting out makes that explanation less plausible.
~~~
hkuo
Point is that users will more often ignore opt-outs than not, regardless of
whether it is beneficial to them or not. As a UI practice, it's up to the
form's creator how to manipulate this behavior.
------
richardburton
That is fantastic link-bait for their site. Well worth giving a few hundred
customers a discount voucher.
------
aw3c2
_The terms of service were updated on April Fool's Day as a gag_
------
JMiao
this should have been in the dante's inferno eula.
------
alexkay
Obligatory xkcd comic: <http://xkcd.com/501/>
Microsoft buys AR headset patents for $100-150M – report - Aoyagi
http://www.totalxbox.com/74099/microsoft-buys-ar-headset-patents-for-100-150-million-report/
======
jessriedel
For those confused by the acronyms: A virtual reality (VR) headset would show
the viewer an image completely produced by the device. An augmented reality
(AR) headset allows you to see through the glasses at the actual world in
front of you, and just projects extra data or graphics on top of this. Think of
a heads-up display for a fighter pilot.
~~~
AndrewKemendo
As someone who is working on an AR project and has to constantly explain to
people WHAT AR is, I find it a tad disheartening that this is the top comment
in HN comments. I would have expected that the readership would have been well
versed enough that it would have been mid-level.
This does serve as a good data point about technology penetration though.
Thanks.
------
nealabq
This is more likely a response to Google Glass, not Oculus.
I assume Google has built a patent portfolio around Glass. This may even be a
defensive move by Microsoft. Still, looks like another tragedy of patent abuse
brewing.
I wonder about the upcoming patent lawsuits against Oculus. Maybe that's their
real reason for embracing FB -- protection.
~~~
sitkack
These patents look silly. I have done work in the AR/VR space and these
patents are chaff. Much of the stuff coming to market right now is not
patentable because this research goes back so far. Which is a _GREAT_ thing.
~~~
Excavator
Just in case: have you heard of Ask Patents' prior art request 'section'¹?
There's a lot of 'unpatentable' stuff that gets through or almost gets through
the process.
¹ [http://patents.stackexchange.com/questions/tagged/prior-
art-...](http://patents.stackexchange.com/questions/tagged/prior-art-request)
~~~
sitkack
I am going to get a migraine and an ulcer reading those. I think filing bogus
patent claims should be an actionable offense.
------
zackmorris
Gah, it didn't occur to me that the reason the Oculus sale was so expensive
might have been patent-related. VR tech is not all that complicated. I mean
really, if protocols like HDMI and Thunderbolt weren't so complex, DRM and
patent laden themselves, how hard is it to put an image on two screens and
sync it to gyroscopes/ultrasonics/infrareds? This is college-senior-project
level complexity.
Software patents are such an obvious encumbrance to programmers, but I wonder
if we should think about the toll that hardware patents are taking on
innovation overall. Maybe it’s time to take a stance against all patents.
I’m hopeful that rapid prototyping might alleviate some of this because
hardware could be open sourced and then people could assemble it themselves.
Unfortunately they probably won’t be able to pay someone else to do it. That
strikes me as fundamentally wrong and when the law doesn’t fit the public
interest, we’ll probably see black market and bootleg manufacturing appear.
We need 21st century solutions for intellectual property, when the vast
majority of manufacturing is automated and even innovation itself has been
augmented by machine learning. Otherwise I see the end of innovation and only
the biggest megacorps being able to afford to play.
~~~
paulbaumgart
Patents are a fine idea in principle. There's little doubt that they promote
innovation in pharmaceuticals, for example.
The problem arises when the patents last much longer than the "innovation
cycle" in an industry. As software takes over the world, we're seeing cycle
times go down in many fields, but patent laws aren't keeping up with
technological reality.
~~~
_delirium
Yes, shorter patents would address a decent number of the issues.
To pick one area I'm somewhat familiar with, a lot of the patents in digital-
audio synthesis were _initially_ reasonable, in my opinion. Furthermore, the
system largely "worked" as it was supposed to to incentivize innovation and
ensure that commercializers of technology paid some royalties to those
developing it. A research group like Stanford's CCRMA would develop a new
synthesis method, patent it, and license the patent to a commercializer such
as Yamaha, who put it in their synthesizers and paid a royalty. Without
patents, there would be a strong incentive to do this monetization via keeping
things secret instead, rather than openly publishing them somewhere that
Yamaha could just read.
The part where it starts seeming unreasonable is when a 15-year-old synthesis
method, which is by that time ancient and textbook-standard material, is still
patented.
~~~
anigbrowl
20 year doesn't seem like such an awful term for a patent. You think 15 years
is too long, no doubt there are other people who think 10 or 5 are too long,
just as there must be some people who think it should be 25 or 30 years. I
feel like 20 years has stuck because it's approximately the length of one
human generation.
~~~
lugg
What on earth gives you the idea that a generation is a good measure of time
for a monopoly? Patents are intended to promote innovation for society not
give protection to a racket. The _only_ reason to give people a monopoly on
production is to allow those who invested in the invention and research to
recoup the costs of research and development to allow for further
investment in innovation.
After the time period the patent / copy rights should be released to the
public domain so that everyone may benefit from cheap economy of scale
production.
The only reason I could think you feel this way is that it is all you've
known. In most cases a couple of years worth of profits from monopoly should
suffice. In reality it depends on how much was spent on the initial
innovation.
~~~
_delirium
_In most cases a couple of years worth of profits from monopoly should
suffice. In reality it depends on how much was spent on the initial
innovation._
One tricky issue is also the lead time to market. In the synthesis example,
Yamaha came out with a synth within ~2 years of licensing the patent. But in
some areas (like medicine) it can be common for it to take 5-10 years for a
product to come out, in which case a 5-year patent would be worthless, because
it would expire before you can sell anything.
~~~
lugg
Sorry I should have probably continued on, there was a bit of an implication
here:
In reality it depends on how much was spent on the initial innovation [ _and
how long it takes to reap the return on that investment_ ].
------
higherpurpose
I think AR is at least 10 years behind VR in terms of mainstream adoption or
"usefulness". VR is almost there (I think with 4k resolution, it will be) to
provide incredible advantages over what we have now. AR on the other hand will
most likely be mainly a gimmick for the next 10 years or so, even if they can
show some pretty cool demos initially. Take Microsoft's Illumiroom for example
which is "sort of AR". The immersion is much greater in VR than what
Illumiroom offers, and I'd much rather be "present" in a VR world, than see
some light show on my walls, at much poorer quality and much lower realism.
~~~
kybernetikos
I've never been in an illumiroom, but I have been in a CAVE like environment
with 3d projected walls. Pressing a button on the motion tracked handset and
seeing (and hearing) a lightsabre blade come out of it was by far the coolest
technology mediated experience I've had. I've got a Rift, but I don't think
anything it offers will be as good as that.
------
nestlequ1k
How stupid. Microsoft pays a bunch of people who invented a bunch of ideas
that are now restricted from being used.
What a joke the patent system is. We're paying people to innovate ideas, and
prevent any product with those ideas from coming to market.
Patent holders should have to pay a substantial tax / fee every year to keep
their patents. If these ideas are so valuable, they should be building
products and benefiting society with them. Not stuffing them under the
mattress and waiting to sue some poor sap who actually wants to ship
something.
------
gfodor
I think the comments in this thread talking about AR and VR as if they are
different are misplaced -- AR and VR will converge:
[http://willsteptoe.com/post/66968953089/ar-rift-
part-1](http://willsteptoe.com/post/66968953089/ar-rift-part-1)
------
izzydata
I can't wait for AR similar to the show Dennou Coil to become a reality. It seems
potentially feasible to be a real thing even if at the time the show was made
it probably seemed impossible.
------
gum_ina_package
Very interesting development. Personally I don't see AR being anywhere close
to VR in terms of usefulness, polish, and mass market appeal. VR obviously is
going to be a major player in the gaming/entertainment industries, while an AR
headset is more of a general use device for everyone. It'll be interesting to
see what MS comes out with.
~~~
kybernetikos
AR is potentially much more useful, but VR is much more likely to lead to
useful products in the near future. VR is hard, but AR has all the same
problems of VR plus a whole bunch more.
~~~
nealabq
AR will no doubt be helped along by VR tech. But I'd guess mass-market AR will
first appear in vehicle and building windows.
~~~
cinquemb
Some AR is already present in some vehicles. My mother has a "HUD" that
reflects off the glass that displays speed, and gas consumption in the direct
LOS while driving, but not intrusive enough to distract, mid 2000's car.
But if I abstract what you are saying a bit, I would have to agree that AR is
going to be an "easier" sell in b2b than b2c, it has been for a while. I think
the cross over will happen when consumers are more exposed to the ways
businesses leverage it.
Here's a scenario I can see before mainstream adoption: Imagine a sales clerk
or rep with it as you walk into the store to pull in data from across the web
about you from facial recog and be able to automatically point you to where
you might be interested, maybe even using
google's/microsoft's/facebook's/yelp's/some startup's "intent" api that pulls
in your latest queries that may be relevant to the store you just were
identified in.
Then some people see such things used, and then want the same capabilities as
they navigate their cities around them seeing that it could be useful in their
life.
~~~
masklinn
> Some AR is already present in some vehicles.
If HUDs are considered AR (a valid viewpoint, as far as I'm concerned) then
they've been in warplanes since WW2.
I don't see AR getting mass customer adoption before it can be directly
grafted.
> Imagine a sales clerk or rep with it as you walk into the store to pull in
> data from across the web about you from facial recog and be able to
> automatically point you to where you might be interested
That sounds horrifying. And unlikely, that customers efficiently go through
their purchases in the least possible time is not really in most stores'
interest.
~~~
cinquemb
> _That sounds horrifying._
Well, the technology is here, the behavioral patterns in the way people use
services are here, and with companies like foursquare[0], who are basically
doing the same thing (minus the facial recog, which the tech is present for,
but the challenge now is connecting disparate data sets available which can be
leveraged at scale for general purposes [which I'm working on], and not being
shackled to things large companies typically are [we don't have user accounts,
so we don't have "users" to appease, and we don't have $X billion in revenue
so we have to take chances/experiment in order to grow]), this will be a
growing market where consumers will be exposed to such technology and its
benefits (and drawbacks because not everything is rose colored through my
lenses).
My startup is working in the periphery of this area now, so I can't say that I
don't have financial/technical interest involved with this. From the issues
that we get notified now about our current product is basically people are
afraid because they don't have "control" over "their" data and don't
understand how others are able to technically leverage it (or anything
technical about how the internet works, besides the profitable skinner boxes
that make up most consumer tech companies). People make baseless legal threats
all the time against us, but it is always interesting how it is always
complaints about them as the individual and not about others; those same
people are happy to observe/give up information about other people.
> _And unlikely, that customers efficiently go through their purchases in the
> least possible time is not really in most store 's interest._
Valid point for some stores, but I was thinking about this from the
perspective of the store rep trying to establish some kind of human touch to
the shopping experience more so than they do now, while leveraging the data
they have so they can more efficiently decide who they should focus their
efforts on rather than the stereotypes made from their experiences. I guess
what I'm saying is that it may not be as clear cut as what some make it out to
be, and in that ambiguity lies the opportunity.
[0] [http://techcrunch.com/2014/03/05/foursquare-
revenues/](http://techcrunch.com/2014/03/05/foursquare-revenues/)
~~~
masklinn
> Valid point for some stores, but I was thinking about this from the
> perspective of the store rep trying to establish some kind of human touch to
> the shopping experience
I don't think this[0] is a "human touch". A creepy stalker touch maybe. And
your scheme would go even further.
[0] [http://www.nytimes.com/2012/02/19/magazine/shopping-
habits.h...](http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html)
~~~
cinquemb
>[...] _creepy stalker_ [...]
Which are attributes usually assigned to human beings, no? Or do you enjoy
being trailed by officers/staff when you shop because of the color of your
skin, or do you not have to face nor think about such things when you shop or
in other interactions in your life? Because I don't particularly enjoy that
either, and I'm doing what I can to address it among other issues assuming
that if others could leverage information that is already out there about me,
maybe, just maybe I won't have to be treated as subhuman by some on initial
interactions, or at least while I patronize a store on occasion. What are you
doing to address the problems you have surrounding the use of technology in
such ways beyond vocalizing your displeasure? Do you still use social
networking, webmail, play apps via smartphone, purchase via credit cards
online/offline, etc…? If so, your behaviors are telling others otherwise.
I'm just stating that's the direction things are going in now, and for anyone
to avoid such realities means that they shall continue to be suspended in a
state of cognitive dissonance. Hardly just my "scheme", I'm just a piece in
the puzzle that was already being built before I was even born.
------
ChuckMcM
That is interesting, I wonder what patents or provisionals CastAR has in the
pipeline.
------
programminggeek
Not surprising. If you had the billions that Microsoft does, you'd invest in
patents too. At this point it's the software equivalent of buying out the
supply chain like Apple famously does with hardware components.
------
baby
I only thought about this for their Xbox division, until I saw the comments in
here.
But the idea of glasses that show you ammo count, health and other info
while you're playing, basically a HUD on your face, is really appealing to me.
------
rasur
Meta are the company to watch in the AR space.
------
shmerl
MS and patents? A very bad mix.
------
rikacomet
Am I the only one who saw the UPGRADE NOW banner, despite Adblock?
------
nsnick
This website breaks mobile safari. The bar at the bottom does not appear when
scrolling up.
------
fabiofzero
It's 2014 and Microsoft keeps on being the "me too" of consumer technology.
------
rch
I love how the real MS shows up from time to time to remind us that all the
'new openness' and fair play talk is exactly that, and probably always will
be.
~~~
gnoway
I tend to agree with the sentiment, but exactly what ill-intent does acquiring
patents actually demonstrate?
~~~
rch
That's a fair question, and maybe I am jumping the gun with this particular
set of patents. But my interpretation is in context with other patent
licensing news (March 26th, 2014):
[http://mobile.reuters.com/article/idUSBREA2P1VC20140326?irpc...](http://mobile.reuters.com/article/idUSBREA2P1VC20140326?irpc=932)
So I was primed to read this as opening a new front in an ongoing,
contemporary effort to use patents aggressively. Maybe the downvotes are a
clear signal that I'm destined to be proven wrong in this regard. I guess
we'll see.
------
tzaman
Microsoft, as usual, following in others' footsteps. Kinda sad since they used to
lead the game.
~~~
Shorel
> Kinda sad since they used to lead the game.
Maybe when they first made their BASIC interpreter. And Word for Windows.
But everything else was an acquisition or a clone. DOS bought, Windows cloned
from Xerox and Mac, Excel cloned from Lotus-123, FoxPro bought, XBOX cloned
from playstation, Internet Explorer bought.
~~~
kenjackson
And iPhone cloned from the Palm. And the iPod cloned from the Archos Rockbox.
And Linux of course cloned from Unix. And the Mac cloned from Xerox. And the
iPad cloned from tablet PCs. And the PlayStation cloned from Nintendo.
Nintendo cloned from Atari 2600. Atari 2600 cloned from board games.
BTW, Word cloned from Word Perfect. BASIC interpreter cloned from all the
previous BASIC interpreters that existed.
Being first or "leading" doesn't matter. It's about execution and a bit of luck.
Look at Android. It sure as heck wasn't first, but they executed. Windows
wasn't first, but they executed.
Quake III Arena on Github - rl1987
https://github.com/id-Software/Quake-III-Arena
======
mappum
Cool, but why is this news? Hasn't it been up for a while now?
How exercise changes our DNA - walterbell
http://mobile.nytimes.com/blogs/well/2014/12/17/how-exercise-changes-our-dna/
======
dang
[https://hn.algolia.com/?q=exercise+dna#!/story/forever/0/exe...](https://hn.algolia.com/?q=exercise+dna#!/story/forever/0/exercise%20dna)
The condom of the future is coming – with support from Charlie Sheen - morehuman
http://www.thememo.com/2016/08/18/lelo-hex-condom-lelo-charlie-sheen-condom-hexagon-condoms-lelo-safe-sex/
======
evans99
Great story....and who doesn't love Chucky!
Simon Willison: Django | Multiple Databases - mnemonik
http://simonwillison.net/2009/Dec/22/django/
======
moe
Reading through that documentation page I must say I'm extremely underwhelmed.
Is that doc-page outdated/incomplete or did it really take them 4 years and a
21kloc patch to tie a model to a database instance? I mean, I don't see
_anything_ about proper partitioning/sharding, not a glimpse of cross-db
queries. Is this a joke?
SqlAlchemy has had proper multi-db support (vertical and horizontal) since
around 2007. Perhaps the time would be better spent porting django to a real
ORM?
[http://www.sqlalchemy.org/docs/06/session.html#partitioning-...](http://www.sqlalchemy.org/docs/06/session.html#partitioning-
strategies)
~~~
simonw
The original plan was to add higher level support for certain types of
sharding (e.g. tying a specific model to an individual database) and earlier
versions of the code actually provided a method for doing so. This was removed
because it wasn't judged to be a good enough solution - Django is very keen on
the concept of reusable applications, and hard coding a decision of which
database should be used for an application's models in to that application
would conflict with that concept.
Obviously the solution is to specify a mapping somewhere, but there's no need
to hold up the feature while waiting for a design for that particular
bikeshed.
Instead, the plan is to release an initial version with low level primitives:
the ability to configure multiple databases and specify which database should
be used for a given query. This gives us a chance to see what kind of patterns
are most widely used and figure out the correct way to support those at a
higher level of the API.
In my opinion, the SQLAlchemy example demonstrates why handling sharding
automatically is a distinctly non-trivial problem:
[http://www.sqlalchemy.org/trac/browser/sqlalchemy/trunk/exam...](http://www.sqlalchemy.org/trac/browser/sqlalchemy/trunk/examples/sharding/attribute_shard.py#L129)
- it's not clear to me that that's a better solution than just manually
picking the database to execute a query on at the point of execution.
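For anyone who hasn't seen the low-level API in question, it boils down to
something like the following (a rough sketch in the spirit of the docs at the
time, not a drop-in example; the Entry model and the 'archive' alias are
made-up names):

    # settings.py -- one alias per connection
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'main',
        },
        'archive': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': '/srv/archive.db',
        },
    }

    # Elsewhere, at the point of execution, you name the connection explicitly.
    Entry.objects.using('archive').filter(pub_date__year=2005)
    entry.save(using='archive')

That's the whole primitive: no routing magic, just an explicit alias per
query, with the higher-level patterns deliberately left for later.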
~~~
moe
Well, SA provides all the primitives required for setting proper sharding up -
admittedly it could use some more documentation and an out-of-the-box
anonymous sharding lib. The example is fully functional, though, and once
you've wrapped your head around it, it all makes sense and does indeed work
(I've used it in production).
Sorry, but "seeing what patterns are most widely used" is a lame excuse. You
could just look at what real world sites do today (and did 4 years ago) - or
simply jump right to implementing the only anonymous sharding model that can
possibly work (see how mongodb does it).
I'm not meaning to bash django as a whole here, but I had a true WTF-moment
looking at that page. It seems the home-grown ORM is becoming a ball on a
chain if something like that takes _this_ long...
Anyways, at least it's progress in the right direction, looking forward to
real partitioning support, hopefully in earlier than another 4 years.
------
simonw
I submitted a direct link to the documentation over here, which is more
interesting than my blog post:
<http://news.ycombinator.com/item?id=1010373>
------
barnaby
21,000 line changeset... does that sound right? I mean, we live in the era of
distributed version control, in times when merging is cheap, shouldn't there
have been regular 2-way synchronizations?
~~~
simonw
There were. The work was carried out in a svn branch (with actual development
in Git, but the branch was updated so svn users could still see what was going
on). The svn branch was frequently updated to trunk using git-svn. Once the
feature was ready, an svn merge was performed resulting in the monster
changeset.
Django uses svn for the core repository, but the majority of the actual work
takes place in git or mercurial. Django has committers using both.
~~~
kingkilr
Actually the merge wasn't done using SVN. Russ didn't want to deal with
svnmerge, so he just took the diff from git and applied and committed it.
~~~
simonw
I stand corrected.
Bruce Schneier: Why the NSA's Defense of Mass Data Collection Makes No Sense - trauco
http://www.theatlantic.com/politics/archive/2013/10/why-the-nsas-defense-of-mass-data-collection-makes-no-sense/280715/
======
diydsp
Some of the clearest articulation on the matter I have ever read. I'm proud to
see Bruce in such a high-profile magazine.
"Third, this assertion leads to absurd conclusions. Mandatory cameras in
bedrooms could become okay, as long as there were rules governing when the
government could look at the recordings."
~~~
devx
I also thought his explanation for why this stuff is unconstitutional, and
more importantly, _immoral_ , was one of the clearest I've read online. I hope
he gets called to Congress hearings so they hear him say the same things
there, too.
On the High Side: iTunes Store Likely to Skew Towards $1.29 - ALee
http://www.digitalmusicnews.com/stories/032709itunes129
======
jleyank
I think this is a mistake, as raising costs will not go down well today. It's
possible that I'll save a few bucks in that my tastes aren't current, but I
don't think this is the correct move during "lean economic times".
People will probably claim this will increase piracy. I suspect anything can
be used as that excuse...
Recursive Make Considered Harmful - JoshTriplett
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.20.2572&rep=rep1&type=pdf
======
hbogert
Love that paper, though the better-approach section was not as pragmatically
explained as the problem's section.
I'll raise a glass to him as soon as I'm having a drink with fellow hackers
somewhere.
------
JoshTriplett
Rediscovered via
[https://www.debian.org/News/2014/20141020](https://www.debian.org/News/2014/20141020)
.
Hacker Hostel – Jamaica - akua_walters
http://hackerhos-tel.webflow.io/about
======
akua_walters
I'm Akua. My team here at Startup Robot and I are hosting our inaugural Hacker
Hostel. We are taking students from top colleges and universities around the
US to Jamaica for two months. We cover their accommodation (in Montego Bay,
Kingston, Portland, Ocho Rios, and Negril), pay them a stipend, and only ask
that they pay their airfare and program fee.
Making Debian Responsible For Its Actions - alexkay
http://sheddingbikes.com/posts/1285659877.html
======
davidw
I didn't see the words "bug report" anywhere there. Debian is a volunteer
organization, open to anyone. If you want to improve it, sign up and help
instead of just ranting about crazy paranoid theories like:
> It's simply a tactic to make sure that you are stuck on Debian.
Of course I'm biased, as I'm a former Debian maintainer, but I think that
Zed's connection with the reality of the situation is tenuous, at best, in
this case.
Making a distribution is difficult, and doing so with 1000's of volunteers who
are not working on it full time (and since you're not paying for it, they
don't really owe you anything) and are distributed throughout the world adds
to the difficulty, so yes, there are bugs and problems and challenges to
overcome. That said, if each author of each package in Debian got their way
about the exact location of files and so forth, the system would be utter
chaos. If you think your package isn't being treated right in Debian, get on
the mailing list, file a bug report, make your case, and get things fixed,
rather than treating Debian as "the enemy"... Sheez.
~~~
JoachimSchipper
There is a reason that Debian is being singled out, which you ignore - Debian
likes to add lots of distribution-specific patches to _everything_ they
package. Other OSes/distributions are not as problematic, because they are
much more likely to ship software as the author wrote it (unless it's
ridiculously broken _and_ unmaintained _and_ important.)
Also: why should software authors have to write bug reports to Debian?
~~~
davidw
With 1000's of packages, and 1000's of maintainers, saying that "Debian" likes
to patch "everything" is a pretty big claim.
A lot of the patches that do go in make whatever package integrate better with
the rest of the system.
> Also: why should software authors have to write bug reports to Debian?
Because there's a bug?
> "Even though the piece of software I asked aptitude to install requires
> these libraries to run, they refuse to install it."
At first glance, that sounds like a packaging bug to me, with Rubygems, not
with one of Zed's packages. The best thing to do in those cases is report the
bug, rather than attempt to "make Debian pay" (for the thousands and thousands
of hours of work to give you a free operating system, presumably?)
~~~
heresy
Why should the author report a bug when it's the incompetence of the Debian
maintainer?
The 'gem' command not working after installing the 'rubygems' package does
seem like a bit of a bug, to put it mildly, you'd think a cursory test by the
maintainer would catch that.
~~~
regularfry
As far as I'm concerned, having rubygems available _at all_ in Debian is a
bug. Gems should be repackaged as .debs.
~~~
prodigal_erik
This. The point of Gems and Maven and the CPAN client and all this other crap
is to smuggle code into a production box while carelessly failing to get it
into the platform's database of what's installed where and what depends on it.
They replace one tool that knows everything that's going on, with many tools
each of which can't.
Somehow we got stuck with a critical mass of small-scale developers who never
faced the nightmare of somehow keeping the right tarballs on hundreds of
machines in different roles without good tools, so now we all get to relive
it.
~~~
cemerick
That's hilarious. Users of Maven and CPAN represent a "critical mass of small-
scale developers"? Holy shit.
A similar idea was voiced a few days ago, specifically regarding Java
libraries ([http://fnords.wordpress.com/2010/09/24/the-real-problem-
with...](http://fnords.wordpress.com/2010/09/24/the-real-problem-with-java-in-
linux-distros/)).
My response remains the same: Debian (or what-have-you) is but one distro
among many, and those of us that treat such things as important but
fundamentally commodity deployment platforms have no wish to spend our lives
babysitting whatever package manager happens to be on the operating system
we're using that month/year/whatever. Further, plenty of people have plenty of
good reasons for deploying on Windows and OS X, so using tools that work well
there makes sense.
The priorities of the various linux distros aren't everyone else's priorities
– which somehow seems to come as a galloping shock to some.
~~~
billswift
That is why I use Slackware. Installing new software by hand from tarballs is
a bit tedious compared to package managers, but it is transparent, and when I
am done I can be pretty sure it is going to work. I have tried other distros,
even Debian once (for some reason Debian wouldn't let me sign in as root,
I never did figure out why), but I keep coming back to Slackware - for its
transparency.
~~~
koenigdavidmj
And Slackware's packaging is light enough that it is fairly simple to write
wrappers around gems/CPAN modules/Python eggs. The latter two have prewritten
SlackBuilds that you can use, assuming that the setup.py is not doing anything
too shady.
------
thristian
> But here's the problem, Debian package maintainers don't want to give up
> control to the responsible parties. I would more than gladly make my own
> .deb packages, but they refuse to let me. In fact, I plan on making packages
> for the major Unices in order to head them off. That's what everyone ends up
> doing.
As a user, and amateur administrator of my home machines, I've learned the
hard way that third-party packages supplied by the original software vendor
are, in general, utter crap. They're built against a distro two or three
versions old; or they're built for Mandrake or Ubuntu instead of Red Hat or
Debian; they "install" a tarball to /tmp which the post-installation script
untars in the root directory; they don't have any dependencies declared at all
but crash at startup if you don't have a particular version of libjpeg... if
you're relying on the packaging system to detect file conflicts or outdated
dependencies, third-party packages can be very, very scary.
The single biggest reason I choose Debian (and sometimes Ubuntu) is the
uniformly high-quality packaging. Zed's found one problem package, and a trawl
through Debian's bug tracker will no doubt find others, but the fact of the
matter is that with a single command I can install almost any Free software in
the world, and have it installed and set up within seconds - and Debian has
years of experience figuring out how to make that work. I don't think that's a
thing to be dismissed lightly.
~~~
bad_user
I used Debian on the desktop for 2 years (2008-2009) ... the whole time the
Iceweasel package (their repackaged Firefox) has had its renderer broken,
making some web pages unusable for me ... this wasn't a problem if I manually
downloaded and installed Firefox, and it finally got fixed just before I
switched to Ubuntu.
It does have the best software repository ever, but boy how much it sucks when
something's broken.
------
blasdel
This is in no way specific to Ruby — if you look at the way any sufficiently
advanced software you're familiar with is packaged in Debian, you'll find it
to be completely fucked. Ancient versions, nonstandard configuration files,
random things disabled at compile-time (often for ideological reasons), files
scattered everywhere with the new locations hardcoded, with basic features
broken into separate packages. My favorite is the random patches, which when
they aren't in the service of the aforementioned ridiculousness, are mostly
cherry-picked from current upstream versions to 'fix bugs' without
accidentally introducing features because they're afraid of new version
numbers. When a patch doesn't fit those categories you really have to worry,
because now they're _helping_ (see OpenSSL)
The result is that any program or library that you use directly must be
sourced from upstream, _especially_ if it's less than 15 years old or written
in a language other than C or C++. Luckily pretty much all of the modern
programming language environments have evolved to cope with this onanistic
clusterfuck.
Haskell has been more fucked by Debian than any other language I know of — when I
last had to deal with it a year ago there were two broken+old builds of GHC
with different package names and mutually-exclusive sets of packages that
depended on them. On top of that the version of cabal (Haskell's packaging
tool) in the repository was so far out of date that you couldn't use it to
build anything remotely recent (including useful versions of itself), nor
could you use it with anything in Hackage (the central repo).
My old roommate had listened to me bitch about this stuff for years, and
always dismissed me as crazy for thinking that the packaging was fucked
(though he did share my hate of _debian-legal_ ). Last week he called me out
of the blue and apologized — he'd installed Wordpress through Debian and
they'd broken it up into a bunch of library packages, but still left a base
skeleton of random php files and symlinks, accomplishing nothing but breakage
and unportability.
------
btilly
Here is my beef with Debian.
A lot of the software they package comes with unit tests. Those unit tests
have a purpose. They are meant to see whether or not the software, as
configured and installed, works.
Debian systematically strips those unit tests out, and _never_ runs them to
see how much stuff they are shipping already broken. Why? Why not package the
unit tests as an optional package, and make sure they have a wide variety of
systems in different configurations running them to notice problems?
I can't count how many times I've tried to install a Perl module from CPAN,
found it was failing its unit tests, installed the Debian package with it, ran
the unit tests, and found that the unit tests said the package, as installed,
was broken. It's not as if the package developer didn't try to tell you you
were missing something. Why throw that information away?
If they did this then they'd automatically catch a lot of problems that
currently get missed. Heck, insert a test for ruby gems that says, "Does this
software start?" They'd have caught this bug automatically.
Until Debian catches up with standard best practices, I completely can't agree
with the meme that they run software better than everyone else. It isn't as if
unit testing is a new idea. It has been pretty mainstream for most of the last
decade. Some were doing it earlier. Perl has been doing it since the 1980s.
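To make the "does this software start?" idea concrete, a post-install smoke
test doesn't need to be anything fancier than this (a Python sketch of the
idea only; the package name is a placeholder and the real thing would be
gem/CPAN specific):

    import subprocess
    import sys

    PACKAGE = "somepackage"  # placeholder for whatever was just installed

    # 1. Can the thing even be loaded, as installed on this machine?
    if subprocess.run([sys.executable, "-c", "import " + PACKAGE]).returncode != 0:
        sys.exit(PACKAGE + " is installed but cannot even be imported")

    # 2. Run the unit tests the upstream author shipped with it.
    sys.exit(subprocess.run([sys.executable, "-m", "pytest", "--pyargs", PACKAGE]).returncode)

Even just step 1, run on the user's actual machine rather than the
maintainer's, would have flagged the missing-dependency case automatically.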
~~~
__david__
I don't think it's so cut and dry. In the ideal case, the packages don't need
tests because they are going into the system already tested. _If_ the package
is correct then running the unit tests is just redundant work because you are
getting an identical copy of a package that already tested good with the
appropriate dependencies. I think that's the theory anyway. On the other hand,
theory does not always match reality.
Debian CPAN packages _do_ run the unit tests--they just run them when you
create the package and not when you install the package. In fact, the package
will not build unless the tests pass.
~~~
btilly
Sorry, but it really is that cut and dry. In the ideal case you don't need
unit tests. In the real world, they are useful.
One of the big things that unit tests are supposed to catch are hidden
dependencies. By nature, hidden dependencies are likely to be installed on the
maintainer's machine, so unit tests only help if they are run somewhere else.
As a prime example, the bug that Zed was complaining about was a dependency
problem of EXACTLY this sort. It worked fine on the maintainer's machine. It
didn't work on Zed's machine because he didn't have a key library installed.
No unit test the maintainer can run would catch this. Even a stupid unit test
Zed can run would have flagged this immediately.
Incidentally about the Debian CPAN packages, my experience with them a few
years back was that they were so consistently broken (usually because of
locale issues) that I have a hard time believing that unit tests had actually
been run by anyone. Or if they were, they were ignored.
~~~
__david__
Well, then maybe what they really need is a single server/farm out there that
just goes through all the perl/ruby/whatever modules and installs them and
runs their unit tests. That way you get the same coverage but you don't waste
my time. I, personally, don't want a bunch of unit tests to run when I install
stuff--it's slow enough as it is and it's (usually) completely redundant.
It seems the same to me as slackware or one of the other "compile everything
yourself" distros--they force you (the installer) to do a bunch of redundant
calculations. If someone can compile on my architecture then why should I
_ever_ have to compile it? The same with unit tests--if someone can run the
unit tests on my exact configuration then why should I _ever_ have to run
them?
~~~
btilly
You still don't get it.
If you're installing item after item, then the odds are that at some point
you're going to install the dependencies, and after that the tests will pass
and mean nothing. So you keep on wiping, installing one thing, and testing.
Fine. Now you have cases where package A conflicts with package B and you
never find it. And now what? The number of combinations to test is incredibly
large, and nobody is likely to have tested your exact system.
As for the unit tests, make them available, make them optional. That's OK by
me. I'd opt to use them. Particularly since I learned the hard way that most
of what I want in CPAN either isn't in Debian, or isn't up to date there. So I
install things directly from CPAN. That means that when I install from Debian,
there is a possibility of library conflict that I want to notice.
~~~
caf
Perhaps a compromise would be to have the package-install-time unit tests run
in "testing" and "unstable", but be turned off for "stable"?
~~~
btilly
That would be worlds better than what we now have. But I'd like the option to
turn it on. (And I'm sure others want the option to turn it off.)
~~~
phaylon
I'm not sure what the -dbg packages in Debian/Ubuntu are strictly for, but
wouldn't this be a case for it? If one could post-run the tests by saying
"apttest Foo" or something along those things, integration into utilities to
automatically install test packages and run them shouldn't be a huge hurdle.
~~~
joeyo
I can't speak as to whether or not it would be a good idea for -dbg packages
to run unit tests, but their nominal purpose is that they contain extra
symbols to allow for debugging.
<http://wiki.debian.org/DebugPackage>
------
ComputerGuru
...and people ask me why I write tons of freeware, but make so little of it
open source.
The reason is simple: control. If I can't control every stage of the
development, deployment, and distribution process, I don't want in (yes, I'm a
control freak. No, I don't think it's necessarily a blemish on my personality, it's
just who I am). If there's something wrong with how my users perceive my
software, it's because of something _I_ did wrong, not because someone took my
hard work and toil and perverted it with their own changes, be it making the
code ugly with nasty function names or dirty hacks (in my opinion, of course)
or if they distribute it in a way that makes users cringe. It's my hard work,
and I deserve to be (a) in control of the user experience, and (b) attributed.
If you're willing to make your awesome utility/code that you've spent 5 years
developing and maintaining fully available to the public, giving up all
control of the end-users' perception of the package, you have a bigger heart
than me. But me, I'm a selfish guy when it comes to my users, and I don't want
anyone to even have the possibility of hurting them. I have _my_ users' best
interest at heart, you probably don't. At least, not to the same extent that I
would.
~~~
VMG
You could still make it open source but craft the license in a way to prevent
redistribution
~~~
davidw
> You could still make it open source but craft the license in a way to
> prevent redistribution
Then it would not be open source.
~~~
narag
There are certain clauses that don't affect the definition of a program as
open source, but would prevent it from being included by, let's say, Debian. I'm
thinking of the old BSD license or the branding problem of Firefox.
~~~
davidw
Debian includes the Firefox code, though. You can certainly say "you can
redistribute my code, but not call it XYZ if you've modified it", that still
lies within the realm of open source.
~~~
narag
That was my point. It's open source and still it's not distributed under the
Firefox brand, so users won't file bugs in Mozilla's bug tracker.
------
mycroftiv
I sympathize with frustration at poor software packaging, but the proposed
solution seems completely disproportionate to the offense. I also don't see
any real evidence presented for the assertion that Debian's policies are based
on a corrupt motive. What OS is actually pure and innocent of technical flaws
and bad policies? If I recall, openSuse is in bed with Microsoft, Ubuntu fails
to contribute code to important projects, Fedora is really just Red Hat's beta
testing, etc. One of the pains of open source software is that things you
write will probably be messed up in lots of ways by other people. If seeing
software packaged or modified in ways you don't like makes you angry and
advocate retaliation, is that really consistent with the philosophy of a free
software license?
~~~
blasdel
The efforts of both Zed and Debian are based on the same corrupt motive:
_self-satisfaction_
As the upstream developer, Zed wants to make sure that his software is
distributed in a manner that is useful to people and actually works and shit.
This gives him jollies.
As the distributors, the Debian contributors want to fullfil ideological
goals, fit everything into their neat hierarchy, assuage neckbeard sysadmins
that the version number is sufficiently ancient (not that the software is,
given the distro patches), pretend to care about people running it on the PA-
RISC port of Debian GNU/kFreeBSD, wank about licensing, and reach consensus.
This gives self-important teenagers and manchildren all kinds of jollies.
~~~
Mod_daniel
I can't believe this getting upvoted. I wonder how much software you use
_every single day_ thats created and maintained by "self-important teenagers
and manchildren".
------
poet
I've always thought that it was a bad idea for Linux distros to package any
language libraries at all. It seems like a lot of repeated work and your users
will probably end up needing to install things manually in the end. Things
will require the lastest version of gems, new and essential features will be
added, etc. Just give your users a manual on how to install gems (or pip, or
whatever) and let it be. The users of language libraries are exclusively
programmers after all; I think it's safe to assume they can handle library
installation.
That said, Zed is blowing this way out of proportion.
~~~
davidw
> Just give your users a manual on how to install gems (or pip, or whatever)
> and let it be.
Uh... that might work for one language, but for 10? 20? And some poor sysadmin
is supposed to use all these disparate tools to install security fixes too?
And you're forgetting that there are "end-user" programs that depend on those
libraries as well, so they really do need to be packaged up.
> That said, Zed is blowing this way out of proportion.
That goes without saying.
~~~
poet
_And you're forgetting that there are "end-user" programs that depend on those
libraries as well, so they really do need to be packaged up._
I'm not forgetting this. I don't think the user's programs should use the same
libraries (or even the same language install) that the distro's programs
depend on. This way, if the user messed something up (like trying to upgrade a
library), the entire distro won't be hosed. It's a really bad idea from a
stability, security, and design standpoint if you don't keep these two things
separate.
------
Corrado
Actually, the Ruby and Rubygems Squeeze packages have been in quite a flux
lately. One of the more recent Ruby1.9.1 packages broke Rubygems completely
due to some changes in the Ruby 1.9.2 source and they way they handle gems.
The latest version of Ruby has really integrated the Gem package management
system into its core. It has therefor been decided to stop building a separate
Rubygems package for the 1.9.x series and let the Ruby1.9.x package deal with
gems.
Have there been clashes between the Ruby and Debian communities? Yes! Are we
all working toward a solution? Yes! Will it happen instantly? Unfortunately
no, but I think the two groups are now talking and are making some good
progress. Keep an eye out for some good stuff in the future.
------
legooolas
[Disclaimer: I'm not a Ruby developer, but have developed in plenty of
languages, as well as being a sysadmin for a fair while]
IME, separate package-management for each and every language is more painful
than the problem he is describing.
Rubygems, CPAN, whatever other languages use.
As soon as you install things outside of the distro package-manager, it has no
idea what is going on with those packages and so continues to think that they
aren't installed. If there was a way to get the package-managers for language
libraries and the distro to play together nicely and work for dependencies etc
then this would be OK, but as it is it's a bit of a sysadmin nightmare.
Edit: I thought that installing distro-supplied packages into /usr/local was
against the FHS?
~~~
davidw
> way to get the package-managers for language libraries and the distro to
> play together nicely and work for dependencies etc
Yes, hacking the language's package manager to talk to the distribution
package manager is probably the only way to really go about solving the
problem in an elegant way, and even that is not without its pain points.
~~~
legooolas
> pain points
Absolutely, but this problem has been around for a long time, and fighting
with distributions on how to get it to work won't help anyone.
Of course, it also means that if you install a buggy version of a library
from, say, CPAN then you could break distro-supplied packages... I guess it
depends on which you value more: distributor-provided testing or being able to
install out-of-band packages onto your distribution more easily and with
proper dependencies.
------
mgunes
It might be worth re-evaluating this rant in the light of Lucas Nussbaum's
(Debian Ruby maintainer) recent "mythbusting and FAQ" post:
<http://www.lucas-nussbaum.net/blog/?p=566>
~~~
auxbuss
This is far from being a rant. It's a concise exposition on the Debian/Ruby
issues.
Thanks for posting the link, but being labelled a rant, I nearly didn't go
there. Just want to encourage others to read it for an excellent explanation
of the issues at hand from the guy who is doing the work.
~~~
telemachos
I'm pretty sure that "this rant" in the parent refers to Zed's post not
Lucas's. The parent's point being "we should look at Zed's rant again after
reading this discussion by a Debian Ruby maintainer."
~~~
mgunes
Exactly.
------
pbiggar
I think there is an important point here. Debian has a lot of control, and
takes very little responsibility. A similar situation:
<http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=380731>
~~~
protomyth
wow, just wow. You would think that if you volunteer to maintain something,
you would actually like what you are maintaining.
------
grandalf
Hypothetical question: Now that rvm has been created, would anyone still use
apt-get to install ruby even if the latest version was supported?
Sure you could argue that yes, ruby should be installed with apt-get and
alternative versions should be handled with Debian's alternatives
infrastructure...
I think this is an interesting case in which version numbers are more than
just version numbers, they are more like sister projects, and they don't fall
neatly into the "conservative and stable" or "bleeding edge and risky" camps
the way maintainers typically view different versions.
From the perspective of a package maintainer, if we had to include an
alternatives list for every dot version of every package, then the distro
would explode in complexity.
Ruby and Python just happened to grow so quickly that their growth didn't
immediately trigger the appropriate response from distro maintainers, and very
quickly the community worked around the problem.
~~~
prodigal_erik
apt-get or yum for ruby? Hell yes. After wasting countless hours on random
versions of crap scattered around some servers' filesystems but not others,
now nothing hits production without being in a package that installs and
uninstalls cleanly and declares everything it relies on being present. That
includes my own code, which we package ourselves.
~~~
__david__
Hear hear. Nothing strikes fear in my heart like a /usr/local hierarchy filled
with a million random versions of random programs. BTW, if you are going to
install something into /usr/local, at _least_ use a cheap and easy packager
like GNU "stow".
~~~
grandalf
I used to think that. But consider:
A user can manage his/her own stuff in ~/bin and add it to path if he/she
wants.
A sysadmin can put things somewhere like /usr/local/experiment33 and make
sweeping changes just by changing the default path, which a user could
override for customization.
------
nakkiel
Obviously, he should spend just a few hours in the shoes of a package manager.
Honestly. What would it be without packages/ports? A mess.
The only people trying to make it bearable here are the package/ports managers
and yet they don't get any kind of reward for their job. They have to come up
with crazy tricks to make things just work because people who write software
are unable to write proper install notes, list dependencies correctly, etc..
This process is heavy, slow and doesn't always produces the expected results
(there's no doubt about it). So people thought it would be great to have just
language-specific package management systems and make it unbearable again.
Alright. I personally never use them and install stuff at hand unobtrusively.
Now, do your job. Go file a bug report. Or better yet: fork. This is not
discussing things, this is not helping. Tears don't help.
------
whatajoke
I have never used gentoo. Does gentoo patch packages as much as Debian? Or is
their build process sophisticated to allow parallel install of multiple
versions of say postgresql?
~~~
blasdel
Gentoo avoids making patches specific to itself wherever possible, when it
does it's generally just to make it compile+work, the changes are
straightforward, and they are _applied live on your machine_ in a self-
documenting way.
The build process is also sophisticated enough to be able to 'slot'
potentially-incompatible versions of the same package so that they can be
installed simultaneously. From my recent snapshot of the package repo I can
install 5 different simultaneous major versions of postgres, with a variety of
independent build options.
------
mariana
You know, maybe somebody would want to know the Debian POV on this issue:
* <http://www.lucas-nussbaum.net/blog/?p=566>
* <http://www.lucas-nussbaum.net/blog/?p=575>
Debian is all about stability and volunteer effort.
Whining is not going to fix things.
------
xinuc
I'm okay with Debian packages, unless someone include locusts as dependency
<http://xkcd.com/797/>
------
wazoox
This is an Ubuntu package problem, not Debian. In case you didn't notice,
Ubuntu is built from a snapshot of Debian /unstable/. They call it unstable
for a reason, I guess.
On Debian stable "aptitude install rubygems" followed by "gem install rake"
just works. This looks like a completely gratuitous rant; I've seen Zed Shaw
better inspired.
The A.C.L.U. Needs to Rethink Free Speech [opinion] - artsandsci
https://www.nytimes.com/2017/08/17/opinion/aclu-first-amendment-trump-charlottesville.html?action=click&contentCollection=U.S.&module=Trending&version=Full®ion=Marginalia&pgtype=article
======
meri_dian
>"For marginalized communities, the power of expression is impoverished for
reasons that have little to do with the First Amendment."
So why are we talking about the First Amendment then? Let's address the real
problems affecting marginalized communities.
The value of mentors in accelerator programs - mtw
http://montrealtechwatch.com/2011/06/28/the-value-of-mentors-in-accelerator-programs/
======
paisible
After SF, Boulder, New York and Vancouver, it's time for the European capital
of North America to get an accelerator program. Pretty much everyone who comes
to Montreal ends up wanting to stay here, so this is a golden ticket for
anyone looking for a bit of variety! Looking forward to seeing what talent we
attract - the advisor list is pretty impressive in any case.
Ask HN: 14 days off work, help me pick a project - Nemant
I've got 14 days off from work and I want to build a SaaS/app to generate some income to substitute my job.

Need help/suggestions on what to build:

A) Idea: iOS/Android app that lets you "review" Facebook friends anonymously (imagine Amazon style reviews). Reviews are private, only the recipient can read them. E.g. you smell good but your time-keeping is horrible.

Business model: Get acquired.

Issue: Difficult marketing. No income for a while until funded.

B) Idea: Small entrepreneurs need finance ($5k-$30k to open a local coffee shop), other people have small amounts saved up ($5k-$50k) and don't know where to invest. Solution: Connect these people through a network (imagine LinkedIn).

Business model: Entrepreneurs get charged to upload a business plan. Investors get charged a monthly fee to view profiles of entrepreneurs and their business plans.

Issues: Chicken and egg problem.

C) Idea: Replicate FollowGen: http://edu.mkrecny.com/thoughts/twitter-should-shut-me-down

This will possibly make me some money until I get shut down, but by then I will have a proper start-up on the side (maybe A or B or other more convoluted ideas).

D) Idea: Web based server monitoring tool (CPU, RAM, etc...) with a pretty UI, e.g.:

http://dribbble.com/shots/1023229-Ultramarine-Admin?list=searches&tag=dashboard&offset=16

http://dribbble.com/shots/1315388-Dashboard-Web-App-UI-Job-Summary/attachments/184703

Analytics could be stored on the cloud to give web access.

Business model: Subscription?

Issues: Super small market? Sys admins know their command line stuff, no need to buy extra crap.

E) Any other suggestions are appreciated.

UX/UI: Pay a designer at the end to build a pretty UI with all the latest.js/one page/flat/parallax and other crap.

Remember I only have 14 full days and then it's evenings/weekends.
======
junto
I'm going to go out on a limb here and suggest you go on holiday instead; jump
on a flight to somewhere fun, put your feet up and enjoy some sun and interact
with some locals and other travellers. You'll come back with a ton of ideas.
~~~
Nemant
Already did that a few times in the past year + I am going on a holiday soon
:)
------
karterk
I think you're approaching this in the wrong way. And, trust me, I have been
there and done that over and over again and failed numerous times. People work
on a side project for multiple reasons, and given that your reason is simply to
augment your primary income, there is only one thing you should be looking
for: people who are willing to put down their money to pay for what you want
to build for them. Is that easy? No, but there are tonnes of crappy software
out there which businesses pay for, and if you replace one of them, you can
become ramen profitable pretty soon. What you really need to do is pick an
industry (the more boring the better) that you know well or have good contacts in, and
spend some time studying how they work and what you can do to make their life
better.
You can of course also hypothetically build a product and hope to succeed, but
that will be leaving too much to chance.
------
ideaoverload
> D) Idea: Web based server monitoring tool (CPU, RAM, etc...) with a pretty
> UI
> Issues: Super small market?
This is definitely not a small market and there is plenty of competition:
[https://news.ycombinator.com/item?id=7340001](https://news.ycombinator.com/item?id=7340001)
[http://blog.dataloop.io/2014/01/30/what-we-learnt-talking-
to...](http://blog.dataloop.io/2014/01/30/what-we-learnt-talking-
to-60-companies-about-monitoring/)
------
allendoerfer
You could combine D) with some service products, e.g. installation of the tool
+ monitoring of the tool. Let the user provide SSH credentials and put together
a build system which automatically logs in and installs some packages with a
common configuration the user has chosen beforehand. Make this service free and
charge for the monitoring. More complex configuration can be sold later on.
Sell it as System Administration as a Service without a lock-in; this way you
can target web designers or business people.
~~~
stevekemp
I've done system administration for many years, as a remote-worker.
Trying to sell expertise in this field is hard, even when you have a lot of
knowledge and contacts. The biggest issue is that people don't want to give
remote logins to systems, to people far away.
I set up a trivial page at [http://remote-sysadmin.com/](http://remote-
sysadmin.com/) but getting attention has been very hard, even though I got a
few contacts along the lines of "Please automate Debian updates", or "Please
install plugin Foo for wordpress, plz".
~~~
allendoerfer
Well, if that is the problem, why not add a photo of you wearing a suit, an
address, a short resume and some customer photos/logos with feedback?
------
jason_slack
I could give you my opinion but I think you should pick what sounds most
exciting for your brain.
Excitement == motivation to work on it!
~~~
Nemant
I'm motivated to work on all of them! But since there is an experienced
audience here I could get some pointers/feedback that I might've overlooked.
------
staunch
Something you charge money for.
~~~
elwell
I agree, but would add: "and that generates money for your customers".
------
amac
Do you like commerce? I need help with Octopus (octopus.org).
| {
"pile_set_name": "HackerNews"
} |
How LinkedIn betrayed a 5-man startup - gsibble
http://thenextweb.com/insider/2012/06/22/how-linkedin-betrayed-5-man-startup-pealk-and-why-developers-should-be-concerned/
======
objclxt
I think one of the messages here is that if you're a start-up and your product
requires a third-party API that you have no control over to function then
you're taking a massive risk.
This isn't specific to Linked.In: API access isn't a right. If your product
requires a third-party data source and you _don't_ have any contractual
agreement with that third-party you have to accept that the rug could be
yanked from under your feet at any moment.
And maybe that risk is acceptable to you and your investors - but to not have
a backup plan is foolish.
~~~
legutierr
But what about all of those software businesses that have built on top of the
iOS API + App Store? Would you say that they are also taking a massive risk?
Apple can cut off any app that it wants at any time, almost for any reason,
and yet thousands of independent developers are building profitable businesses
on top of their platform.
The issue is not a dependency on an API or platform. The issue is the nature
and character of the company upon which you are depending. In spite of some
well-publicized hiccups, Apple has made an effort to nurture its developer
community.
The point of the article is not that dependency is bad (although it may very
well be), but rather that LinkedIn is a bad partner, and it's not worth
spending any effort to build a business on their platform.
What I'm curious about is what companies are similarly risky to build your
business on.
~~~
balloot
It's not that complicated. If you build your company on the back of another
company you must abide by their TOS. If your business plan is to build the
best porn app on the App Store, you would be similarly disappointed.
In general, if your business plan relies on the goodwill of another company
and you provide little value to that company in return, you don't have a very
good business plan.
~~~
legutierr
I don't think it's enough to rely on the TOS. Sure, there are obvious examples
like porn on the iPhone where the TOS will give you a clear answer.
Frequently, however, the TOS are ambiguous and are interpreted inconsistently.
They can also be changed at will.
Sure, you're right that developers should try to give as much benefit back to
their platform provider as they derive from the platform. But the fact is that
different companies perceive benefit in different ways with reasoning that is
opaque. Often a decision can depend on a strategic direction for the firm that
you are not privy to.
All this is why the reputation of the company matters so much. From reading
the article, it looks like LinkedIn not only acted in a very inconsistent
manner--giving praise before shutting them down--but they even began reverse-
engineering the application. That's not the kind of behavior that breeds a
healthy ecosystem regardless of whether they have all the right to act that
way, and regardless of whether Pealk was careless in relying so heavily on
LinkedIn.
If everyone were to take a cautious approach to the APIs and platforms that
big companies offer, as has been suggested in this thread, then the tech
sector would be a much less interesting place. The fact is that a lot of
innovation happens because there actually are API and platform providers out
there that care about their developers and won't pull shit like this.
~~~
balloot
You're complicating something that isn't that complicated. LinkedIn's TOS very
clearly states in the terms that it is illegal to "Use the APIs in an
Application that competes with products or services offered by us"
So when you see:
<http://talent.linkedin.com/Recruiter>
And your site is using the API to create a service for recruiters, it seems
crystal clear to me that you are violating the LinkedIn TOS. This is not some
grey area or some arbitrary changing of the rules - it is a very clear
violation of LinkedIn API guidelines. I just don't see how this proves even a
little bit that LinkedIn doesn't "care about their developers".
And for the bigger picture, it is amazing to me that a significant number of
people expect LinkedIn to sit around while some other company attempts to
undercut LinkedIn's own premium product using LinkedIn's API.
------
gergles
Nobody can possibly think it is a good idea to use someone's API to create a
product that directly competes with the API providers' paid offering.
The sense of entitlement to be allowed to use someone's product against them
frankly baffles me.
~~~
oz
>Nobody can possibly think it is a good idea to use someone's API to create a
product that directly competes with the API providers' paid offering.
I thought as much. If they allow it, then great, but you're asking for
trouble.
The cross-platform messaging app Kik was pulled from Blackberry App World for
similar reasons:
<http://www.kik.com/blog/2010/11/rim-blackberry-kik/>
Edit: Added 'App World' clarification.
------
pygorex
If your product is completely dependent on a third-party API or service in
order to function you really only have two possible exits:
1. Get bought out by the company implementing the API
2. Get shut out by the company implementing the API
This should serve as a cautionary tale: never place the fate of your company
in the hands of someone whose interests may not align with yours.
~~~
tjic
Excellent point.
Corollary: when the other company is figuring out how generous of an offer
they need to make for choice #1, they know that your next best alternative is
choice #2 (for $0).
------
robomartin
Business is war. Anyone who doesn't think so hasn't got enough scars to
understand this yet. As a small entity, any time you tangle with large well-
funded companies you have to be very aware of the fact that they could shaft
you in a dozen different ways whenever the please. That's the hard and cold
reality of it.
Contracts and agreements only serve to possibly give you the option to sue,
and nothing more. They don't create a guarantee of recovery or reparations in
any way at all.
I had one very painful incident with a large Korean multinational corporation
a few years ago. This was over a hardware design. We devoted about eight
months --nearly a million dollars in cash and man-hours-- to complete a set of
designs based on components and assemblies that this OEM was to provide.
They wanted to get our business and pull us away from their main competitor.
In-person meetings where had with the top three VP's in the US. Promises were
made both verbally and on-paper. Short version: When we were finally ready to
go into production we were told that the components in question --components
they had recommended and guaranteed as long-availability components-- had been
discontinued. Their recommendation: "You need to redesign your product line".
Soon afterwards I learned that other companies that had selected the same
components were under the same dire situation. One particular company had
closed a deal with the US government and was about to get sued for tens of
millions of dollars for not delivering on their contract. On our front, this
one event nearly destroyed my company at that time. We survived only to get
taken out during the economic downturn due to having been weakened by this
event.
In looking at the potential to sue this company our attorneys concluded that
they'd need a minimum of $250K just to consider pulling that trigger and
another $250K available past that. They expected this multinational to simply
bury us with paperwork and an army of lawyers, with the only goal being to
cause financial pain and get us to quit. Their recommendation: "Figure out how
to survive and lick your wounds. It is nearly impossible to go after these
guys and actually come out ahead".
This is how large companies, effectively, make their own rules, their own
laws, if you will. While this wasn't the only tangle I've ever had with a
large company it was the worst I've had to endure. I was the first domino that
got tipped over in a chain of events that ultimately killed a business that I
spent the better part of ten years building.
Be careful.
~~~
klbarry
Would it be possible to email me the name of the company (or write it here?)
My email is in my profile.
~~~
robomartin
I don't think there's any point in posting it publicly. Particularly several
years past the event. Tell me why you want the name. If it makes sense I'll
certainly provide it privately along with further details if necessary.
------
donretag
I briefly started working on an application that would have many positive
benefits for the LinkedIn community. After reading the API terms and
conditions, it was very apparent that I would not have been able to achieve my
goals, so I stopped.
The terms and conditions are pretty restrictive, but they are provided. It
should not have come as a surprise that their API access was cut off. You
should not be developing an application that not only is dependent on one
company's API, but also against their terms of service (without an explicit
authorization/partnership).
~~~
spinlock
this is the point I came to make. LinkedIn's TOS specifically disallow apps
that cut into their turf. These guys should be negotiating to get their
codebase bought and their team acquired.
------
lancewiggs
Where is the outrage? I find this an appalling act by Linked In, who seem,
this shortly after WWDC, unaware of the power of a strong developer ecosystem.
How could you continue to work at such a place knowing this?
Why would anyone bother developing for their community via their API?
Linked In's site itself meanwhile is looking increasingly like Facebook to me
and is increasingly irrelevant to professionals. They are under pressure to
increase revenue dramatically to get close to justifying an over inflated
share price, but losing your corporate values by screwing over people in your
ecosystem is very short term thinking, perhaps reflecting a Wall St mindset.
------
balloot
I just don't get all the hand wringing. You are not entitled to unlimited use
of another company's data. They can shut you off at any time for any reason.
And if you build your company around someone else's API, an essential part of
the business plan is preparing for the day when the API-providing company
takes notice of what you're doing and evaluates your use of their service,
with the expectation that they may shut you down if you threaten them.
If your only response to this is "go cry to various small tech publications",
then you're doing it wrong.
~~~
wslh
> If your only response to this is "go cry to various small tech
> publications", then you're doing it wrong.
I don't think so. The main issue is that people think that there are APIs
because companies promote their interoperability to gain developers. Companies
are not being sincere and nobody is taking notice. To not repeat this and
other opinions I post one of my recent posts on the subject: "Reverse
Engineering and The Cloud" [http://blog.nektra.com/main/2012/06/01/reverse-
engineering-a...](http://blog.nektra.com/main/2012/06/01/reverse-engineering-
and-the-cloud/)
------
jval
I completely disagree with other people's attitudes on this point - startups
definitely need to be wary of using third party APIs, but it is incumbent upon
companies to create clear and stable terms for developers to operate within if
they are thinking of opening an API. It's completely unfair to make
representations to people that they rely on to their detriment and then renege
at the last minute.
Let's be clear too - it's not as if these guys were making a porn application
or something that violated LinkedIn's TOS. Nor were they acting without
LinkedIn's consent. Companies reserve rights to revoke access without reason
in order to limit their legal liability but they need to be wary of the non-
legal consequences to their reputation.
This is an unmitigated disaster for LinkedIn. Obviously if it is a
professional network, and they as a company already offer a number of
solutions to recruiters, companies and professionals, then almost every app
made for their platform is going to compete with their own offerings in some
way.
If they aren't willing to set clear boundaries in this regard then they simply
need to shut down the API and stop wasting everyone's time. In any event, even
if they leave it open they can forget about people making a serious investment
in the platform.
------
davesims
I agree, and Pealk definitely admitted as much, that for a startup, relying on
an API is incredibly risky. I'd never be on board with a startup that did, no
matter how cool the idea.
But LinkedIn doesn't come off looking great either, and I think that was the
thrust of the article. It's pretty concerning that they would spend eight
weeks talking nice with Pealk, all the while sussing out Pealk's position, and
then shut off access with no opportunity to negotiate.
What LinkedIn did wasn't illegal, perhaps not even unethical, considering what
passes ever-so-vaguely for 'ethical' in the business world these days. But it
was, in the strictest definition of the word, crappy.
~~~
flatline3
I wouldn't even call it crappy. LinkedIn truly didn't owe them _anything_.
~~~
Jare
In this context, LinkedIn owes them respect and honesty. Bait and switching is
crappy behaviour.
~~~
flatline3
Why? They were abusing LinkedIn's service by competing with them.
LinkedIn took the time to speak with them, evaluate the situation, and then
make an informed decision about whether they wanted to support their
appropriation of LinkedIn's data.
~~~
davesims
It's crappy human behavior, which is not to say 'bad business practice', to
pretend to work with someone and support their efforts for 2 months, and then
summarily shut down their business simply because LinkedIn made an 'informed
decision' that 'we don't owe you anything.'
Let's not confuse the fact that, although businesses certainly have a right to
defend their own interests any way they see fit within the law, that has
absolutely no bearing on whether or not the community they conduct business in
will call them out when they act in a basically unkind or deceptive manner.
The former has to do with business, fine print and the dog-eat-dog, watch-
your-back world of modern commerce. The latter has to do with basic human
decency, trust and the goodwill of the community you do business in.
"It's just business" is all well and good, but don't get butt-hurt when the
community that you work with looks at your behavior and says, "it sucks that
you did it that way." That's just the cost of "doing business," right? Suck it
up and take it.
------
orthecreedence
Same goes for Facebook or any other big company. If your application depends
on them heavily, then they can shut you down with the flip of a switch. If
you're not making them money directly, they won't care either. I make the
distinction between big and small companies here because if you depend on a
small company, there's a lot more chance of opening a dialog and working out
some kind of arrangement.
That said, to not hinge the entire outcome of your company on another company
is, I feel, business 101. Doing so is a very high risk.
My feelings go out to the guys who started the company though...mistakes or
not, it would truly suck to see all your hard work flushed down the toilet.
------
jonrussell
Hi, author here, don't usually [ever] comment on my own work here - but am
interested to hear if other devs have had similar or contrasting experiences
working with LinkedIn.
~~~
gaborcselle
In your post, you say "After news of the API cut off, LinkedIn got in touch to
suggest a possible trip to its San Francisco HQ" - IIRC, LinkedIn's HQ is in
Mountain View, not San Francisco.
~~~
dllthomas
From France, there's not so huge a difference, is there? You're still probably
flying into SFO.
~~~
jayp
A factual error is a factual error. Definitely lowers the credibility of the
article for me.
~~~
eropple
That's a foolish and shortsighted way of looking at it. My company's
headquarters is in Newton, MA, but if we're flying people in to talk, they
universally say "Boston". Because to people not from the area, that's what it
_is_.
Synecdoche is a part of English (San Francisco being the most notable part of
Silicon Valley). It may upset the overly pedantic, but...oh well?
~~~
jayp
I agree with you from the perspective of people of France (or anyone outside
the area). And that point is not lost on me.
I am talking about the credibility of the author. The line referenced in the
article is written by the author. They are not words attributed to others in a
quote. If it's not in a quote, the author takes on the responsibility with
respect to the correctness of the fact. If the author can't verify this simple
fact, I don't know if I can trust the article with all the one-sided claims
presented.
~~~
dllthomas
That they can't be troubled to verify facts that don't have any relevance
whatsoever to the content of the post should only very weakly affect your
expectation of whether they verify facts that are central to the content of
the post. Granted, here on the internet, that's not as high a prior as it
might be.
------
rtcoms
LinkedIn API terms of use says :
Sell, lease, share, transfer, sublicense any Content obtained through the
APIs, directly or indirectly, to any third party, including any data broker,
ad network, ad exchange, or other advertising or monetization-related party..
Charge, directly or indirectly, any incremental fees (including any unique,
specific, or premium charges) for access to LinkedIn's Content or your
integration of the APIs in your Application;
Does it mean that one cannot monetize applications created using the LinkedIn
API?
------
maslam
I developed wiserprofile.com to help people build better LinkedIn profiles,
and it works entirely against LinkedIn's API. I too found that some of the
cool features that I really wanted to build were not allowed by the LinkedIn
developer agreement. It's too bad LinkedIn is not willing to open up their API
set to allow an ecosystem to flourish - let the developers loose guys and
watch your platform grow!
------
CookWithMe
If a company is both a key partner and a competitor at the same time, your
business model is faulty.
That said, the LinkedIn API could open certain features only if you act on
behalf of a paying LinkedIn user. That way, both LinkedIn and the startups
could happily co-exist, even though the startup recreates some of LinkedIns
functionality.
------
AznHisoka
To be honest, this app sounds like a feature LinkedIn can easily implement on
their own. There is no viable business here. The real thick value lies in the
data.. that's 99% of the work. The app just adds 1% value on top of it.
------
T_S_
"Put up again thy API into its site: for all they that take the API shall
perish by its terms of service."
------
tubbo
You may want to sell that LNKD stock you're holding. It's all downhill from
here.
The kids (college kids and those who just graduated) are adding me on
BranchOut, not LinkedIn. I'm the one who usually has to issue the connection
request on LinkedIn, because _nobody is fucking using it_ except people
looking for a job and recruiters. It's mostly recruiters. But recruiters are
gonna go where they perceive the people to be, so once the facade has fallen
LinkedIn is FUCKED. That stock price is gonna start tanking once companies
like BranchOut (built into the existing Facebook platform we all know and
love) start gaining more ground.
Running your company with the intent of destroying all competition is like
running your country with the intent of destroying all people who don't look
like you.
------
leptons
"pealk" is a stupid name. I'm not even interested in learning about your
product if you named it something stupid. What is it with web start-ups and
stupid names?
~~~
geoka9
Do you think "Microsoft" is stupid? I do. Or "Apple", for that matter.
~~~
anthonyb
Not to mention "leptons"
| {
"pile_set_name": "HackerNews"
} |
Mayday Pac now taking Bitcoin - skilesare
https://mayday.us/bitcoin/
======
kolev
I believe that (pseudo)anonymous contributions should not be allowed for
political and public interest purposes. In this case, Mayday PAC collects full
PII, but do they validate?
| {
"pile_set_name": "HackerNews"
} |
WHOIS for Twitter - abava
http://servletsuite.blogspot.com/2011/07/whois-for-twitter.html
======
piers
So this is basically a front end for the Google social graph API?
| {
"pile_set_name": "HackerNews"
} |
Popcorn Time for the web permanently shuts down - cgtyoder
http://www.theverge.com/2015/10/20/9576803/browser-popcorn-shuts-down
======
joopxiv
The qualification 'permanently' might be a bit premature...
_He might also open source the code behind Browser Popcorn, so it 's possible
that another developer will revive it in the future._
| {
"pile_set_name": "HackerNews"
} |
Show HN: Duotones – Instant duotone effect generator made with SVG/Canvas - masonhipp
https://duotones.co
======
matlk
Interesting project for sure.
I really enjoy the results.
| {
"pile_set_name": "HackerNews"
} |
Study shows US has slower LTE wireless than 60 other countries - prando
https://uk.finance.yahoo.com/news/study-shows-us-slower-lte-wireless-60-countries-102305054.html
======
mschuster91
Warning, this article on Android (Chrome) redirects to a fake Amazon page
after ~10-15s read time which advertises "you have won a free iPhone 7" and
hijacks the "back" button.
I'm not surprised Yahoo went downhill when such crap passes through the ad
quality controls.
------
vortico
I'd say that's reasonable considering it's the fourth largest country in land
area and its population is spread much more evenly than Canada or Russia. And
despite this, the article says the availability ranking is #4.
~~~
rando444
I don't know about reasonable. I mean if they have the availability then this
just means their networks are the problem.
The telecoms in the US should have much better networks, especially after
stealing hundreds of billions of dollars from US taxpayers for promises that
were never delivered.[0][1]
[0][https://www.ntia.doc.gov/legacy/broadbandgrants/comments/61B...](https://www.ntia.doc.gov/legacy/broadbandgrants/comments/61BF.pdf)
[1][https://www.amazon.com/Broadbandits-Inside-Billion-
Telecom-H...](https://www.amazon.com/Broadbandits-Inside-Billion-Telecom-
Heist/dp/0471660612)
~~~
StillBored
This reminds me of the fact that it seems in Austin, the TWC/Charter merger
seems to have done little to improve the speeds, despite a small price hike...
Yet, it seems that the city is now covered in shiny brand new ford transit
vans with Spectrum emblazoned over the sides. If they put half the effort into
upgrading the network that they seem to be putting into advertising and buying
new vans, then I wouldn't still be jealous of my coworkers with Google Fiber,
given that DOCSIS 3.1 is 4 years old and my modem is still locked at a fraction
of the now ten-year-old DOCSIS 3.0 speed.
------
berbec
I live in Manhattan, on the Upper East Side, a fairly upscale neighborhood. I
struggle to reach 12mbps with T-Mobile. There are sections of the NYC subway
where I hit 40mbps. The vast difference in LTE quality even in such a dense
market as New York City amazes me. You'd think the number of subscribers per
square foot would mean there would be a tower on every block. Let's not talk
about the lack of Fios...
------
ActsJuvenile
LTE follows a "scout's honor" model of self-certification. You can get your
mostly-3G backhaul to show LTE on customers' screens if you so please.
| {
"pile_set_name": "HackerNews"
} |
GitHub Android app - Stampy
http://codehum.com/stuff/android/
======
Tzr
nice app
| {
"pile_set_name": "HackerNews"
} |
Psychology of Intelligence Analysis (1999) - selmat
https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/psychology-of-intelligence-analysis
======
dang
A thread from 2018:
[https://news.ycombinator.com/item?id=18500075](https://news.ycombinator.com/item?id=18500075)
2017:
[https://news.ycombinator.com/item?id=14852250](https://news.ycombinator.com/item?id=14852250)
I feel like there were others but can't find them.
------
tcgv
Great book! Read after seeing its thread here in 2018, couldn't recommend it
enough.
Started spotting several cognitive biases described in the book in myself and
people around me.
Its explanation of our memory system is simple, yet very elucidating.
| {
"pile_set_name": "HackerNews"
} |
Everyone should program, or programming is hard? Both - nollidge
http://scientopia.org/blogs/goodmath/2012/10/05/everyone-should-program-or-programming-is-hard-both/
======
plinkplonk
This is a great critique of Bret Victor's "Learnable Programming" essay.
Weird to see it not getting any attention here.
| {
"pile_set_name": "HackerNews"
} |
How Baboons Think - bootload
http://www.nytimes.com/2007/10/09/science/09babo.html?em&ex=1192334400&en=4c284334784fef12&ei=5087%0A
======
queensnake
You're diluting news.yc, man. Take it to reddit. What makes yc news /valuable/
is that it's not diluted with off-topic stuff. You weaken that when you make
non-startup, non-code-related posts.
~~~
rms
_The focus of Hacker News is going to be anything that good hackers would find
interesting. That includes a lot more than hacking and startups. If you had to
reduce it to a sentence, the answer might be: anything that gratifies one's
intellectual curiosity._
<http://ycombinator.com/hackernews.html>
~~~
queensnake
When the news.yc becomes one more place to post the stale material going
around, it'll be useless.
| {
"pile_set_name": "HackerNews"
} |
When is one thing equal to some other thing? (2007) [pdf] - espeed
http://www.math.harvard.edu/~mazur/preprints/when_is_one.pdf
======
danharaj
More reading:
[https://golem.ph.utexas.edu/category/2015/02/concepts_of_sam...](https://golem.ph.utexas.edu/category/2015/02/concepts_of_sameness_part_1.html)
| {
"pile_set_name": "HackerNews"
} |
The Witness: the creator of Braid talks about his fiendishly difficult new game - binarycrusader
http://www.polygon.com/features/2015/9/17/9343943/the-witness-hands-on-preview-feature-braid-jonathan-blow-interview-ps4-playstation-4-pc
======
erikpukinskis
I _strongly_ encourage anyone interested in programming languages to watch his
series about Jai, the language he is creating for his next game. Starts here:
[https://www.youtube.com/watch?v=TH9VCN6UkyQ&list=PLmV5I2fxai...](https://www.youtube.com/watch?v=TH9VCN6UkyQ&list=PLmV5I2fxaiCKfxMBrNsU1kgKJXD3PkyxO)
It's not released yet, but watching the videos has massively changed the way I
think about programming. The notion of "data-driven programming," which I
would describe as "only move around the data you have to", has been a real
revelation to me. I sort of came out of the Ruby world (and Java before that)
where the attitude is more "Abstract all the things!"
I have been trying to apply many of Blow's principles in my JavaScript
programming and it has been awesome. I find myself staying closer to the
"metal" (which in JavaScript world is pure Node and low-level browser APIs)
and it has been incredibly rewarding.
I've also really enjoyed his policy of pragmatism over idealism. I think the
React world for example has taken a good idea (immutability can clarify data
flows) and turned it into a religion (Everything Must Be Immutable!) Watching
Blow work has given me the confidence to steer away from that.
Much in the same way that learning Rails made me a better programmer (even
though I don't use it anymore, and I've rejected a lot of its premises) I have
found learning Jai has really leveled me up as a programmer and a computer
science thinker.
~~~
hacker_9
See here for a write up up Jai's language features:
[https://sites.google.com/site/jailanguageprimer/](https://sites.google.com/site/jailanguageprimer/)
_Jon_ is clearly coming in from a C++ mindset and is essentially fixing
problems with the language that really should have been fixed long ago. I
think it looks interesting, but I'm not sure it really solves any big
problems? Just seems to be a bit better syntax.
~~~
greggman
It's not just a better syntax for C++. If nothing else being able to use the
entire language at compile time is a huge step in the right direction IMO.
C++ meta programming IMO was never meant for meta programming. It was intended
do to a few simple template things, then someone discovered if you made an
array of size [-1] you could basically turn it into a turing complete
language. But, the contortions you have to go through are ridiculous. It's
like writing photoshop in bash script. I'm surprised someone hasn't written an
LLVM pick your language to C++ meta programming compiler just to not have to
deal with that crap.
Sure that feature has been around in other languages for decades (lisp) but
it's sorely missing from C++. People usually work around it by writing code
generators, almost always in other languages.
There were other interesting ideas like using the closure specifiers to
enforce function purity including all functions they call. I haven't followed
whether that has worked out or not.
On top of that, syntax matters. As JBlow pointed out, every time he has to
waste time refactoring 50 lines of code for something that should be a
1-character change, or spelling out things the compiler should be able to
figure out, it's a huge drag on productivity and motivation.
That was my #1 take away from the videos I watched. How about designing a
language for programmer joy. Most languages pick something else (multi-
processing, memory safety, ...) but whether or not it will actually be
pleasant and fun to program in is rarely considered.
Sure some people get off on solving the puzzles of contorting their language
of choice to do something difficult. Especially C++ meta programming gurus.
Others of us just want to get shit done. It's not writing the code that's fun
for us. It's the result. The faster we can get there the more we can work on
polishing and iterating.
~~~
hyperion2010
I watched all his videos and my main take away was that there is actually no
high level language besides C (and restricted subsets of C++ that don't use
templates) that can be used to tell the computer what to do. Almost every high
level language we have already abstracts away the idea that you are trying to
control a piece of hardware. Game programmers are in an ideal position to have
learned what is actually needed to make talking to a computer more efficient.
The fact that J also has the vision to build the language to account for real
world development processes is almost like an added bonus.
~~~
bsder
Presumably Rust wasn't stable enough for him to consider because that's the
single language that seems to address his complaints.
Once you bar garbage collection, you're pretty much left with C, C++, and Rust
(FORTRAN weenies can leave the room, thanks).
~~~
GolDDranks
Also, he doesn't like Rust's stance on safety -- he thinks it's going to lead
to too much hand-holding and boilerplate annotations.
~~~
pcwalton
Yes, that's the main design difference between Jai and Rust.
My _personal_ feeling is that memory safety isn't something I want to
compromise on. But, I work on browsers and not games, so I'm perfectly willing
to acknowledge that, for games, developers may be willing to sacrifice safety
for fewer type system restrictions. Browsers, critical OS components,
databases, etc. are security critical, while games aren't. I'm a pluralist
when it comes to programming languages, and I'm perfectly willing to accept
that there are _prima facie_ valid reasons for not wanting memory safety. One
thing I like about Jai is that it's very up-front about not being type-safe or
memory-safe, and specifically argues against the usefulness of memory safety
for its domain.
It's interesting, though, that whenever I've floated the possibility of
relaxing restrictions on an opt-in basis for apps that don't care about memory
safety as much the reaction of the Rust community has been lukewarm at best
and usually pretty negative, even from the game devs. It's interesting: most
people don't want to give up their memory safety, once they see the benefits
that it brings in terms of developer productivity (preventing annoying bugs).
That's the camp I'm in too, at least for the apps I write.
~~~
renox
In one of his latest videos he shows how it is going to work for array access:
by default it's checked, but you have a directive to make one access (or a
block of accesses) unchecked, and then you have a global directive which can
change the default to check all array accesses or NOT check any of them. So
you get the full range from safe array access to C-like performance, with the
possibility of dropping the checks only where they really impact performance.
Memory safety isn't limited to array access, but I really like the idea of
this kind of control (even though for libraries this could create issues; I
think there are plans to allow the use of an 'ANDF'-like format for libraries).
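A rough C++ analogue of that policy (not Jai syntax; the names and the macro
are made up) would be a default accessor that follows a build-wide flag plus an
explicit unchecked accessor for hot spots:

    #include <cassert>
    #include <cstddef>
    #include <vector>

    // Build-wide switch, standing in for the global directive.
    #ifdef UNCHECKED_BY_DEFAULT
    constexpr bool kCheckByDefault = false;
    #else
    constexpr bool kCheckByDefault = true;
    #endif

    template <typename T>
    struct Array {
        std::vector<T> data;

        // Default access follows the global policy.
        T& operator[](std::size_t i) {
            if (kCheckByDefault) assert(i < data.size());
            return data[i];
        }

        // Explicit opt-out for a single access or a hot loop,
        // like marking a block unchecked.
        T& unchecked(std::size_t i) { return data[i]; }
    };

    int main() {
        Array<int> a{{1, 2, 3}};
        int sum = 0;
        for (std::size_t i = 0; i < a.data.size(); ++i)
            sum += a.unchecked(i);  // bounds already guaranteed by the loop
        return sum == 6 ? 0 : 1;
    }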
~~~
pcwalton
Bounds checking isn't as interesting as use-after-free protection (like the
borrow check) for a couple of reasons: (1) you can easily opt-in and opt-out
on a case-by-case basis (even in Rust), whereas UAF protection is nonlocal;
(2) bounds checks rarely if ever show up in profiles with a well-designed
iterator library and an optimizing compiler.
~~~
renox
I mostly agree with you (UAF is more interesting but more difficult than bounds
checking), but I would nitpick on details:
\- yes you can opt-in and opt-out on a case-by-case basis in Rust for bound
checking but can you globally remove checks and globally force checks?
\- as for UAF protection being non-local: true, but a 99% solution such as
overwriting the data with known garbage on free (0xDEADBEEF) and ensuring that
the allocator doesn't reuse virtual addresses may be good enough in practice,
no? But yes, it's probably too expensive to ship a release with such a
configuration.
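A tiny sketch of the poisoning half of that idea (a debug-build mitigation that
only makes the bug loud, not real protection; the allocator-side address-reuse
part isn't shown and the helper name is made up):

    #include <cstdlib>
    #include <cstring>

    // Scribble a recognizable pattern over the block before releasing it, so a
    // use-after-free reads obvious garbage instead of plausible stale data.
    void poison_free(void* p, std::size_t size) {
        if (!p) return;
        std::memset(p, 0xDE, size);  // cheap stand-in for a 0xDEADBEEF fill
        std::free(p);
    }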
~~~
pcwalton
> \- yes you can opt-in and opt-out on a case-by-case basis in Rust for bound
> checking but can you globally remove checks and globally force checks?
No. Nobody has asked for it, because nearly all bounds checks in Rust just
don't matter from a performance point of view, and thus there is no reason to
remove them.
> \- as for UAF protection being non-local: true, but a 99% solution such as
> overwriting the data with known garbage on free (0xDEADBEEF) and ensuring
> that the allocator doesn't reuse virtual addresses may be good enough in
> practice, no?
You can't avoid reuse of virtual addresses. That would have terrible cache
implications.
The only thing that sorta works is something like WebKit's PartitionAlloc, but
that only mitigates some exploitation scenarios. There is no easy solution to
UAF.
~~~
renox
> You can't avoid reuse of virtual addresses. That would have terrible cache
> implications.
\--> terrible TLB implications
> There is no easy solution to UAF.
Agreed.
------
teddyh
Buried lede: “ _While it 's been a long and winding road,_ The Witness _is
nearing its end. Blow has a little over four months until the just-announced
release date of January 26, 2016._ ”
~~~
devindotcom
It's not really the lede, though.
------
maxander
My goodness, but that's pretty for a small-studio game.
Its worth noting that, assuming this actually comes through, Blow will have
overcome one of the most pernicious traps that wait for successful game
designers- that next, wondrous game you make once you have all the resources
you could possibly dream of and no excuse not to follow every possible design
ambition. Compare Duke Nukem 3D, or 0x10c.
~~~
xerophyte12932
Yes, it's refreshing to see them take a complete departure from Braid instead
of just seeing Braid as a "formula for success" and reiterating it. (I am
looking at you, Angry Birds.)
I like how he used his financial freedom to follow through with something he
really wanted to build, in a way that he feels justifies his dream.
------
teddyh
The web site of the game itself ([http://the-witness.net/](http://the-
witness.net/)) seems down at the moment, but their Twitter account has links
to more articles:
[https://twitter.com/witnessgame](https://twitter.com/witnessgame)
~~~
PopeOfNope
Wordpress. Nuff said.
~~~
PopeOfNope
To all you twits downvoting me, proof:
@Jonathan_Blow: "Sadly I did not manage to get us off WP in time, and surprise
surprise, our blog gets crushed immediately every time I reboot the machine."
[https://twitter.com/Jonathan_Blow/status/644564200275054592](https://twitter.com/Jonathan_Blow/status/644564200275054592)
~~~
steveklabnik
You're not being downvoted because you mis-characterized the software stack.
~~~
PopeOfNope
Care to enlighten me, then? It's as on topic and relevant to the posted story
as the rest of the comments here. The best I can figure is people just don't
like my tone, which is childish.
~~~
steveklabnik
> which is childish.
uh huh
~~~
loco5niner
[https://xkcd.com/1124/](https://xkcd.com/1124/)
------
tempodox
I wish the temporal delta between buzz generation and game release were
smaller. By the time the game will be released, I will have forgotten all
about it.
~~~
rtpg
I've been looking forward to this for an extremely long time. But it looks so
promising that I go to the game blog once a month or so to see what's
happening
------
stringham
I have been following the progress of this game on the-witness.net for a few
years. Really exciting that they've got a release date now.
| {
"pile_set_name": "HackerNews"
} |
Lock-Free Work Stealing, Part 1: Basics - tivolo
http://blog.molecular-matters.com/2015/08/24/job-system-2-0-lock-free-work-stealing-part-1-basics/
======
audidude
I don't see any mention of where the author mitigates, or prevents the ABA
problem.
[https://en.wikipedia.org/wiki/ABA_problem](https://en.wikipedia.org/wiki/ABA_problem)
\- Some systems will use a double-CAS where one word is a counter.
\- People lucky enough to use RCU can use grace-periods.
\- You can also mitigate it (to a degree) using the bottom bits of the pointer
and the kernel area of the pointer as a counter.
\- Also, Hazard Pointers.
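A rough illustration of the pointer-tagging flavor (assumes x86-64, where
user-space pointers fit in 48 bits; the names are made up): pack a generation
counter into the spare high bits so the CAS fails if a node was freed and
recycled at the same address.

    #include <atomic>
    #include <cstdint>

    struct TaggedPtr {
        static constexpr std::uint64_t kPtrMask = (1ULL << 48) - 1;

        static std::uint64_t pack(void* p, std::uint16_t tag) {
            return (reinterpret_cast<std::uint64_t>(p) & kPtrMask) |
                   (static_cast<std::uint64_t>(tag) << 48);
        }
        static void* ptr(std::uint64_t v) { return reinterpret_cast<void*>(v & kPtrMask); }
        static std::uint16_t tag(std::uint64_t v) { return static_cast<std::uint16_t>(v >> 48); }
    };

    // Swap the head only if both the pointer and the generation still match;
    // every successful update bumps the generation, defeating plain A-B-A.
    bool replace_head(std::atomic<std::uint64_t>& head, void* expected, void* desired) {
        std::uint64_t cur = head.load(std::memory_order_acquire);
        for (;;) {
            if (TaggedPtr::ptr(cur) != expected) return false;  // head moved on
            std::uint64_t next = TaggedPtr::pack(desired, TaggedPtr::tag(cur) + 1);
            if (head.compare_exchange_weak(cur, next,
                                           std::memory_order_acq_rel,
                                           std::memory_order_acquire))
                return true;  // on failure 'cur' is refreshed and the loop re-checks
        }
    }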
~~~
mattnewport
This post doesn't go into his lock free work stealing queue implementation at
all, he says that will be covered in a later post. I don't see how the ABA
problem relates to any of the implementation details covered in this post?
~~~
uxcn
Implementing a lock-free work stealing queue isn't difficult, especially with
C/C++ atomics. I can't think of an implementation where ABA would be a
problem.
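For anyone curious what the core looks like, here's a condensed Chase-Lev-style
sketch (fixed capacity, sequentially consistent atomics for clarity; a real
version grows the buffer and relaxes the orderings - and this is not the
article's implementation):

    #include <atomic>
    #include <cstddef>
    #include <cstdint>

    // The owning thread pushes and pops at the bottom; thieves CAS the top.
    template <typename Job, std::size_t Capacity>
    class WorkStealingDeque {
        static_assert((Capacity & (Capacity - 1)) == 0, "Capacity must be a power of two");
        Job* jobs[Capacity];
        std::atomic<std::int64_t> top{0};     // thieves steal from here
        std::atomic<std::int64_t> bottom{0};  // owner works from here

    public:
        bool push(Job* job) {                 // owner thread only
            std::int64_t b = bottom.load();
            if (b - top.load() >= static_cast<std::int64_t>(Capacity)) return false;
            jobs[b & (Capacity - 1)] = job;
            bottom.store(b + 1);
            return true;
        }

        Job* pop() {                          // owner thread only
            std::int64_t b = bottom.load() - 1;
            bottom.store(b);
            std::int64_t t = top.load();
            if (t > b) { bottom.store(t); return nullptr; }   // was already empty
            Job* job = jobs[b & (Capacity - 1)];
            if (t != b) return job;                           // more than one item left
            std::int64_t expected = t;                        // last item: race the thieves
            if (!top.compare_exchange_strong(expected, t + 1)) job = nullptr;
            bottom.store(t + 1);
            return job;
        }

        Job* steal() {                        // any other thread
            std::int64_t t = top.load();
            std::int64_t b = bottom.load();
            if (t >= b) return nullptr;                       // nothing to take
            Job* job = jobs[t & (Capacity - 1)];
            if (!top.compare_exchange_strong(t, t + 1)) return nullptr;  // lost the race
            return job;
        }
    };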
------
mintplant
Is there somewhere I can read about the Molecule Engine as a whole? I'm
struggling to dig up information on what it is from this website. A game
engine of some sort?
~~~
SloopJon
The home page describes it as "the foundation for all our tools, with parts of
the engine being available to licensees as stand-alone game engine bits."
This post suggests that it is intended to compete with Unreal and CryEngine:
[http://blog.molecular-matters.com/2015/08/12/what-
happened-t...](http://blog.molecular-matters.com/2015/08/12/what-happened-to-
the-molecule-engine/)
One of the posts about runtime-compiled C++ states that iOS is not a target
platform.
| {
"pile_set_name": "HackerNews"
} |
IPhone Tethering Returns To Apple’s App Store - dawie
http://www.techcrunch.com/2008/08/01/tethering-app-returns-to-apples-app-store/
======
charlesju
I love this app. I wonder if AT&T will strike it down though? I think that
since it's a little bit hard to setup, perhaps it can fly under the radar for
a while.
Do you guys think this app will last till the end of the year?
~~~
andreyf
_Do you guys think this app will last till the end of the year?_
If it doesn't, I wonder if Apple will make it disappear from "our" iPhones?
------
nickb
I grabbed it just in case it disappears again. Extremely useful!
~~~
ashu
Trivial to do tethering (especially for hackers) if your iPhone is jailbroken.
~~~
pjhyett
There's nothing trivial about it, it involves a multi-step process that this
app eliminates with one click.
~~~
ashu
If I understand it correctly [1] even with this app, you do have to do all the
setup: set up the ad-hoc network, get your iphone to connect to the ad-hoc
network, set your browser's SOCKS proxy, etc. The only difference is that this
app creates a socks server on the iphone with one click, whereas with a
jailbroken phone you need to do this on your laptop/desktop with a SSH one-
liner.
I don't see much difference. I did say that this is for jailbroken phones
only, so if you've jailbroken your iPhone, you can easily run a simple ssh
command? Am I missing something?
[1] [http://www.engadget.com/2008/08/01/netshare-iphone-
tethering...](http://www.engadget.com/2008/08/01/netshare-iphone-tethering-
app-reappears-in-the-app-store/)
~~~
dschoon
You're not. I've been tethering my jailbroken phone for a week now, and the
process is exactly as you describe.
The only difference with the app is the ability to trade one line of typing
for $10.
~~~
wastedbrains
And your phone doesn't need to be jailbroken... I downloaded and installed it
on a non-jailbroken 3G and was up and running with no problems.
------
grendel
bought the app because i enjoyed so much of their work on my jailbroken 1.x
iphone.
| {
"pile_set_name": "HackerNews"
} |
Do We Need Specificity in CSS? (2015) - edjroot
https://philipwalton.com/articles/do-we-actually-need-specificity-in-css/
======
yohannparis
I feel like this article is just another: "CSS is hard, and instead of
learning, use this clever method that works for me."
Maybe I'm just old school because I learned CSS2 by reading the docs, and not
as CSS in JS.
~~~
HelloNurse
Not only approaching the "problem" of learning CSS by eliminating things to
learn, but starting from bad assumptions (e.g. that combining stylesheets is a
problem and not a solution) and never questioning them.
------
ryannevius
A rebuttal: [https://codepen.io/davidkpiano/post/the-simplicity-of-
specif...](https://codepen.io/davidkpiano/post/the-simplicity-of-specificity)
~~~
davidkpiano
Oh wow, I barely remember writing this 4+ years ago!
I still hold to it - specificity is a necessary mechanism, and if it's hard,
it's because we're unnecessarily over-complicating it, just like anything else
considered "hard" if we misuse it.
------
tiborsaas
Using SASS or LESS makes specificity a great tool
.something {
.tells {
.me {
content: 'we do';
}
}
}
If you think of specificity as a kind of inheritance or scoping, then it makes
a lot more sense.
Also it's very powerful to extend a generic component:
Framework:
button.red { background: red; }
User code:
button.red.disabled { background: grey; }
~~~
d_watt
For me, it's actually the exact opposite. When I use scoping like that, my
normal expectation is that I'm saying "Only style .me if it's in a .tells
that's in a .something". I'm doing it for selector behavior. I don't want to
have to consider that if I later scope a
    .something_else { .me { content: "we don't"; } }
it won't work because it's less specific, meaning cascading isn't working as
expected. It isn't SOLID
~~~
woodrowbarlow
exactly. more specific selectors are edge cases, which is a usually-acceptable
heuristic for rule priority but doesn't correspond 1:1.
------
mehrdadn
Warnings seem like the way to go for this? i.e. having a static analyzer to
check that the specificity matches the declaration order. (Assuming that's
possible... it's not obvious to me if there are theoretical roadblocks.)
~~~
stefanfisk
Stylelint actually has this already! [https://stylelint.io/user-
guide/rules/no-descending-specific...](https://stylelint.io/user-
guide/rules/no-descending-specificity)
------
calibas
I maintain sites with thousands of CSS rules and over a dozen style sheets,
some of which are hosted on external domains that I have no control over.
Having to manage CSS rules based upon order would be a nightmare.
It does seem that specificity isn't well understood, as I often see people
abusing !important.
~~~
cryptica
Agreed.
The article claims that "Importance also makes a lot of sense" but actually,
the way it was implemented in CSS makes no sense because eventually, given a
sufficiently old stylesheet, all styles end up being marked with !important
and then you're back to relying on source-ordering.
------
bestest
Specificity and cascading become redundant once we talk CSS modules. I work
with CSS modules, so I can't remember the last time I had to _cascade_ style
sheets, or _think_ about specificity.
------
ChrisMarshallNY
In my experience, very, _very_ few Web designers actually use specificity
_(UPDATE: I should have added "properly," as was pointed out)_.
It works extremely well, _if everyone follows the rules_ , which is uncommon.
That goes for most of CSS; not just specificity.
CSS is incredibly powerful, if used properly, and specificity, when actually
used properly, is very cool.
About ten years ago, I wrote this series:
[https://littlegreenviper.com/miscellany/stylist/introduction...](https://littlegreenviper.com/miscellany/stylist/introduction-
to-specificity/)
It’s still absolutely relevant nowadays, and just as few people follow that
workflow now, as they did then.
CSS, in general, is too complicated (IMNSHO), but that complexity is also what
makes it so powerful.
I’ve always enjoyed Stu Nicholls’ CSSPlay site, for examples of extreme CSS:
[http://www.cssplay.co.uk/menu/](http://www.cssplay.co.uk/menu/)
~~~
tiborsaas
I find this really hard to believe. Specificity is baked into CSS at such a
fundamental level that you can't not use it. Maybe you meant like they don't
use it consciously of effectively?
~~~
ZenPsycho
there’s using it deliberately as a tool, and running into it unexpectedly and
doing absurd things to work around it like it’s an obstacle to overcome. i see
much more of the latter than the former; squarely i think because most css is
written by people who don’t know how to use specificity - so they come up with
things like BEM, or css in js frameworks to force specificity away like it’s a
flaw in the system. all this work goes in because module scope is an easier
mental model to grasp than specificity, which i suppose works more like
traits. but, having been in this web making game long enough, it seems like a
return to table based design and font tags. just with different syntax.
~~~
ChrisMarshallNY
People have been freaking out about table-based design for twenty years. It’s
a great foil, and much of the criticism is merited, but it’s not the end of
the world.
You can fairly easily turn tables into blocks:
[https://littlegreenviper.com/miscellany/stylist/another-
reas...](https://littlegreenviper.com/miscellany/stylist/another-reason-to-
use-table-based-layout/#dont-tell-mom)
Also, I use _“display: table”_ all the time in my work. It’s the best mode for
layout that fits content.
~~~
ZenPsycho
you’re kind of missing my point. which is fine because i didn’t spend a whole
lot of words explaining it.
i am not saying table based design is _bad_. but we did originally move away
from it for legitimate reasons that seem to have been largely forgotten, and
as a result we now have developers who build json theme files and react
frameworks that wire “darkmode=true” individually through every component
instead of just using css for what it was designed to do: to exactly be a
theme configuration format.
a lot of where things have gone wrong is CSS started to be used for layout,
which was originally a hack, abusing the float property to do things it was
never designed to do. and css has never gotten good at it, because
fundamentally layout is about relationships between elements, while CSS is
stuck only ever specifying properties ON elements. sure we have display table,
flexbox and css grid, but css is an incredibly awkward and unnatural way to
express those concepts. app frameworks deal with layout separately from colors
and fonts, which is how it should be done. and so this is why it’s all hacks
and workarounds in css land. it doesn’t have to be this way. but it’s how it
is.
~~~
orange8
> a lot of where things have gone wrong is CSS started to be used for layout
The layout is part and parcel of the style. CSS gives you 7 different layout
modes to choose from (normal, table, float, positioned, multi-column, flex and
grid). There is more than enough here to implement any sort of layout problem
you face, from using 15 year old table and flex hacks to cleaner modern
approaches like flex.
> while CSS is stuck only ever specifying properties ON elements
The "position: relative" (CSS1), float and flex are all about positioning
elements relative to one another. Other properties like margin, align and
float are also about positioning items relative to one another.
> but css is an incredibly awkward and unnatural way to express those concepts
I think CSS is pretty easy and straightforward to reason about, once you
decide that it is worth investing some time and effort in learning. It really
always surprises me how some really smart people just do not get CSS, and I
think it is not about ability, but attitude. CSS is dismissed as "that thing
for designers", and calling CSS a programming language is usually met with
smirks. That is the real problem with CSS, its reputation.
~~~
ZenPsycho
“float” and “normal” (what?) aren’t layout modes. the rest didn’t exist in css
until relatively recently. position: relative exists, yes, but it doesn’t do
what you say it does - it sets the top, left, bottom and right properties to
be relative to _its own_ natural position. it doesn’t specify any relationship to
any other element. the other properties you say specify layout relative to
other elements - they don’t do what you think they do either. align and margin
only affect an element’s content, margin only refers to other elements in edge
cases, and float was only ever supposed to let you embed figures that text
would wraparound. it’s use for layout was an abuse and not what it was
designed for. Though i can see how it can seem like those elements seem like
they refer to other elements, but that is just a consequence of the properties
interacting with the document flow algorithm, which is there with or without
css.
while css now does have real layout modules: flexbox and grid, CSS wasn’t
designed for layout and it just isn’t very good at it, especially if you
compare it to say, cocoa autolayout, or the flexbox model in UI frameworks
that were designed to do this stuff at the start instead of having it
awkwardly bolted on. and just what is document flow? is it a layout algorithm?
no, it’s a greedy word wrap algorithm applied to boxes. that’s it.
finally, 15 year old display: table and flex hacks? what are you smoking. ie8
was released in 2009 and didn’t uproot ie6 and ie7 until many years later. i
had to have fallbacks for ie7 up to 2015. flex only became _usable_ after
2015. it existed before that but not in _all the browsers_.
the closest thing in css to what I am talking about is position:absolute,
which implicitly refers an element to its nearest parent with either position
relative or position absolute. i don’t get to identify which element or
elements i would like to refer to directly.
the only other thing is percentage unit which refers to a percentage of the
parent’s width or height depending on which property it appears in, which was
annoying enough they almost fixed it with vw and vh units which refer
specifically to width and height of the _viewport_ , but if i want a
proportion of any other element’s dimensions my option is to use javascript or
eat a bag of donkeys.
~~~
orange8
> “float” and “normal” (what?) aren’t layout modes.
[https://developer.mozilla.org/en-
US/docs/Web/CSS/Layout_mode](https://developer.mozilla.org/en-
US/docs/Web/CSS/Layout_mode)
> the rest didn’t exist in css until relatively recently.
Out of the 7 layout modes, only the last two came into existence recently. The
point is, when CSS was being designed as a declarative, domain specific
programming language, it was designed to handle EVERYTHING to do with how
things appear on a browser. Even when you specify styles using JS or directly
in HTML, those are simple handed over to the browsers CSS engine. The HTML
engine deals with markup, while the JS engine deals with behavior. If you want
to talk directly and powerfully to the CSS engine, use CSS. That’s what it was
designed to handle: everything to do with what you SEE on a browser’s screen.
> position: relative exists, yes, but it doesn’t do what you say it does. - it
> sets the top, left, bottom and right properties to be relative to its own
> natural position.
If I want to position elements absolutely within a container element, one way
is to have the container "position: relative" and the child elements
"position: absolute". The same effect can be achieved via flex or grid
layouts.
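A minimal sketch of that pattern (class names invented for illustration):

```
/* the container establishes the positioning context */
.card {
  position: relative;
}

/* top/right are resolved against .card, not the page */
.card-badge {
  position: absolute;
  top: 8px;
  right: 8px;
}
```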
> it’s use for layout was an abuse and not what it was designed for.
I am honestly flabbergasted by this assertion. The layout of something is part
of its style. See
[https://en.wikipedia.org/wiki/Visual_design_elements_and_pri...](https://en.wikipedia.org/wiki/Visual_design_elements_and_principles)
CSS was designed to handle all that. Layout is all about shape, space and
form.
> CSS wasn’t designed for layout and it just isn’t very good at it, especially
> if you compare it to say, cocoa autolayout, or the flexbox model in UI
> frameworks that were designed to do this stuff at the start instead of
> having it awkwardly bolted on.
Whenever anyone dumps on CSS, HTML and JS, I simply remind them that they run
the web, and the web is the most successful, open, flexible and used platform
in existence. CSS is doing exactly what it was designed to do, and will
outlive and outperform all those UI frameworks as it breaks out of the browser
into desktop and mobile app space.
~~~
ZenPsycho
look, i love CSS. Not dumping on it. but you have a bit of stockholm syndrome
going on here. sometimes loving something means realistically looking at its
flaws and limitations. and you seem to have a distorted view of history, and
what a “stylesheet” is. (they existed for decades before CSS was invented, and
come from the tradition of printing and writing, not design so much. they do
not, generally speaking, specify layout. layout specifies layout.) from the
start, browsers didn’t have a css engine. css was added _later_ , and has
never fully exposed every piece of the browser display engine. some of the
browsers needed to be rewritten from scratch for this to even be possible. as
for layout, from the start and for a very long time, the position of the w3c
and browser authors is that website authors _shouldn’t want_ layout and
refused to implement any layout capabilities. in their view, layout should be
left up to the client, and a website author should focus only on writing plain
semantic documents. they told people to stop using floats for layout because
it got in the way of client decisions about layout. they have only been added
grudgingly, and after the forced removal of the people who were blocking it. i
for one welcome these layout capabilities, and am glad the old farts didn’t
get in the way. that doesn’t mean we’re up to par with the best layout engines
though.
your link makes only one reference to style: “style of shape”, and a passing
reference to
[https://en.m.wikipedia.org/wiki/Style_guide](https://en.m.wikipedia.org/wiki/Style_guide)
with regard to keeping non layout elements consistent, and that page in turn,
which makes no reference to layout.
~~~
orange8
> look, i love CSS. Not dumping on it. but you have a bit of stockholme
> syndrome going on here. sometimes loving something means realistically
> looking at its flaws and limitations.
I find that very hard to believe, sorry. And I could write a whole book about
the flaws and limitations of CSS, and then another about its elegant power and
gradual evolution over the years, but you most likely won’t want to read it.
You are stuck in 2002, completely fixated on the origins of the language
and what it was meant or not meant to do 20 years ago.
> and you seem to have a distorted view of history, and what a “stylesheet”
> is.
Right now, I am honestly not interested in the semantic meaning of the word
"stylesheet" or its etymology, or the history of computing and printing. I am
here to discuss CSS.
> from the start, browsers didn’t have a css engine. css was added later, and
> has never fully exposed every piece of the browser display engine.
Is this not obvious? A browser is made of three main engines JS, CSS and HTML.
Being a GUI, all parts will influence the GUI. That does not change the fact
that the CSS engines domain is what you see. The JS domain is interactivity
(what you do with what you see) while HTML is about data and structure.
> some of the browsers needed to be rewritten from scratch for this to even be
> possible.
Can you name even one thing in computing that's over 20 years and has not
changed or evolved? If your problem with CSS is that browsers had to be
rewritten 20 years ago to evolve with the language, then that problem applies
to every other successful language and GUI framework or system under the sun.
> website author should focus only on writing plain semantic documents. they
> have only been added grudgingly, and after the forced removal of the people
> who were blocking it.
Semantic documents are under the domain of HTML. This has nothing to do with
CSS.
------
amflare
I feel like source order should be the thing that is done away with.
Specificity allows me to style something, and trust that style will style
correctly no matter what my build system does, or what order my stylesheets
load, or what my co-worker adds later for their fancy new CTA button.
------
specialist
Nice.
Facepalm slap. Just like other matching algorithms. How did I not notice that
earlier?
Order independent specificity is like longest match rules. Regex, lexing, URL
routers, etc.
Order dependent matching overrides (correct phrase?) is like Packrat & PEG.
Which is better depends on ambiguity, meaning how well your "matcher"
algorithm can process whatever input you have.
I've done my fair share of scraping. (I usually default to my own globbing
implementation. Generally more simple than xpath or css expressions.) Now I'm
feeling pretty stupid that I'd always hard-coded precedence resolution.
------
Etheryte
While this is an interesting thought experiment, and the outlined problem is a
real issue, I don't think the outlined solution is a good way to resolve it.
The main problem here is predictability: maybe it's simply that I'm not used
to thinking about CSS this way, but the proposed CSS rewriting parser means
it's not immediately obvious how what I write maps to the end result.
It's entirely possible that I'm missing some of the benefit here, but it seems
to me that scoped styles etc largely already solve this problem.
------
dwd
The simple solution is delete before you add more specificity. Cut it back to
the common denominator and cascade from there.
Of course, you can sometimes inherit a system that is one big specificity
mess.
------
CerebralCerb
> Specificity isn’t intuitive, and—especially for new developers—the results can often seem like a gotcha rather than the intended behavior. I’m also not sure there’s an equivalent in other systems or languages.
It is widely used as a conflict resolution strategy in production systems.
I've never encountered a student who found specificity to be a difficult
concept to grasp.
------
atrilumen
I really admire [http://tachyons.io](http://tachyons.io), with its principles
of _shallow cascade_ and _low specificity_.
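The idea, roughly (class names and values invented, not copied from tachyons):

```
/* every rule is a single class with specificity 0,1,0,
   so plain source order settles any conflict */
.pa2 { padding: 0.5rem; }
.bg-near-white { background: #f4f4f4; }
.dark-gray { color: #333; }
```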
( It really sucks to have to jump in and work with some project's massive blob
of vestigial, conflicting styles. )
------
z3t4
I always write the specificity in the same order as the cascade. Could
probably write a tool that compares the specificity vs the cascade to find
errors!
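For example, a sketch where the more specific selectors also come later, so the
cascade and specificity never disagree:

```
.nav a { color: blue; }              /* specificity 0,1,1 */
.nav .item a { color: navy; }        /* specificity 0,2,1 */
.nav .item a.active { color: red; }  /* specificity 0,3,1 */
```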
There is however a small problem if you want to use many CSS files - that you
need to link them in the correct order.
edit: Also I don't use ID's in CSS.
------
timwis
But what if you put your styles in multiple files?
| {
"pile_set_name": "HackerNews"
} |
Tell HN: HackerNews is the greatest site to browse in Cuba - ChicagoBoy11
I am in Cuba for a week for business and pleasure. Internet here is, unsurprisingly, incredibly scarce and insanely expensive. Even for foreign businesses, 1mb connections can cost in the thousands of dollars a month.

In my downtime, I have tried surfing the web, and every experience under the sun has been pretty painful, even for sites technically optimized for slow connections like Gmail's HTML view.

But damn it, I've been browsing HN just as well as if I had been back home.

Thanks, dang
======
ChicagoBoy11
On a side note, you can't help but walk around here and also feel that, should
commercial relations someday be fully normalized, there is going to be an
economic boom here unlike any the world has ever seen
------
ck2
Is Cuban internet censored like China?
Surprised there isn't sat internet more available there, you can get that on
even remote islands in other parts of the world for a couple hundred US$ per
month.
~~~
dalke
[https://en.wikipedia.org/wiki/Internet_censorship_and_survei...](https://en.wikipedia.org/wiki/Internet_censorship_and_surveillance_by_country)
and
[https://en.wikipedia.org/wiki/Internet_censorship_in_Cuba](https://en.wikipedia.org/wiki/Internet_censorship_in_Cuba)
give some specific details about censorship on the Cuban internet.
The only direct comparison to China is in the latter, which says:
> Reporters Without Borders suspects that Cuba obtained some of its internet
> surveillance technology from China ... However, it should be noted that Cuba
> does not enforce the same level of internet keyword censorship as China
but also:
> Rather than having complex filtering systems, the government relies on the
> high cost of getting online and the telecommunications infrastructure that
> is slow to restrict Internet access
You'll need to decide for yourself what "like" means.
If I read [http://laredcubana.blogspot.com/2013/11/ilegal-satellite-
int...](http://laredcubana.blogspot.com/2013/11/ilegal-satellite-internet-
service-in.html) correctly, satellite internet is illegal, and expensive:
> getting the equipment in and installed costs between $3,500-$4.200, paid in
> advance in Miami. The bills are generally paid for by families members who
> live in the US and it seems that the motivation is purely business -- cheap
> phone calls and Internet access -- not political.
------
bbcbasic
It's also the best site on my phone deep in a train tunnel where there is some
connection but very slow!
------
brador
Try [http://skimfeed.com](http://skimfeed.com)
Just browsing the titles is enough to get a feel for the day's events.
~~~
ericzawo
Wow, bookmarked!
| {
"pile_set_name": "HackerNews"
} |
Geeksta' Rappers Rhyme Tech Talk - vlad
http://graphics.stanford.edu/~monzy/eetimes-geekrap.pdf
http://graphics.stanford.edu/~monzy/DramainthePhD.mp3
======
vlad
<http://graphics.stanford.edu/~monzy/DramainthePhD.mp3>
| {
"pile_set_name": "HackerNews"
} |
Blueprint – A React UI toolkit for the web - ika
http://blueprintjs.com/
======
TheAceOfHearts
I see a few comments bringing up mobile. There are already many mobile-friendly
UI frameworks; not everything has to be mobile-first. Yes, in many cases it
makes sense to go for a mobile-first approach, especially with consumer-facing
applications. Mobile is huge and apparently it's still continuing to grow.
At the same time, there's many legitimate cases in which you want to optimize
for desktop. For example: consider an IDE, where you have lots of panels and
toolbars. In some cases it's unclear how you'd be able to support both desktop
and mobile without significantly degrading the experience. Even Google
struggles with this. How usable is Data Studio [0] on mobile? It's pretty
terrible and unusable. But that's perfectly fine, because the desktop
experience is great! I can't speak for others, but I can't imagine myself
wanting to use a mobile device to get any work done.
Kudos to Palantir for open sourcing their UI toolkit.
[0] [https://datastudio.google.com](https://datastudio.google.com)
~~~
minimaxir
For internal projects/webpages (similar to Google Data Studio as mentioned),
where you would expect the user to _only_ be using it on a desktop, having no
mobile support is fine.
But for external projects/webpages in 2016, where at least 1/3 of usage can
come from mobile devices, having a lack of mobile support is a complete, 100%
nonstarter. And there are plenty of competing React UI frameworks with mobile
support already.
~~~
eyko
> And there are plenty of competing React UI frameworks with mobile support
> already.
Citation needed.
In all honesty though, good quality UI frameworks with good mobile support are
on the top of my "#want" list.
~~~
woah
There are a bunch of bootstrap components ported to React in the Reactstrap
project.
------
renke1
This looks really nice.
However, what I really want for React is a style-agnostic component library
that basically extends the regular set of HTML elements, but comes with no
"visual" styling (other than really basic styling like the browser's default
styling for <input>, <select> and the like). Only styling that is necessary
for the component to function should be included.
Of course, optional themes would be fine. Also, non-visual styling should be
completely inlined. Themes could inject "visual" styling via context. User-
defined styling would be passed via class or style props.
~~~
Klathmon
Styling is still "unsolved" with react IMO.
Inline styles feel wrong, CSS alone isn't encapsulated enough to work with
components correctly, CSS modules are TOO encapsulated which makes global
styles and themes a royal pain, and adding another layer ala SASS or LESS
feels like more of a "patch" vs a real solution.
And none of them really solve style inheritance in any way that I'd call
elegant.
I end up using SASS and overriding default styles with weird hacks like using
`.button.button.button` or (even worse) using `!important`, but it still feels
wrong and doesn't scale very well at all.
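For anyone who hasn't seen that trick, a quick sketch (selectors invented):

```
/* the toolkit ships something like this, specificity 0,1,0 */
.button { background: gray; }

/* repeating the same class bumps the selector to 0,3,0,
   so it wins without reaching for !important */
.button.button.button { background: purple; }
```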
~~~
pault
I know everyone has their own favorite method, but one that I'm extremely
satisfied with is using webpack > sass-loader > extract-text[1] to `include
Styles from './MyComponent.scss'` in each component file, and then it all gets
bundled up into a single css file per target (also great for eliminating dead
CSS!). I use "layouts" at the root of my react hierarchy (under stores and
routers and whatnot) and put my global styles there.
I haven't run into a situation using this setup where I felt like I needed a
dirty hack to make something work. It does add a layer of complexity to the
build, but if you can get it working once you can just copy paste it into
every new webpack config, and It feels very natural and tends to organize
itself.
I am really not a fan of this new "css in your js" approach that the cool kids
are using, but I guess I'm just getting old.
[1] output of the following config will be:
dist/
|_app/
|_bundle.js
|_bundle.css
|_admin/
|_bundle.js
|_bundle.css
```
const webpack = require("webpack");
const ExtractTextPlugin = require("extract-text-webpack-plugin");
const ExtractCSS = new ExtractTextPlugin("[name].css");
module.exports = {
entry: {
"./dist/app/bundle": "./app.js",
"./dist/admin/bundle": "./admin.js",
},
output: {
path: __dirname,
filename: "[name].js"
},
module: {
loaders: {
test: /\.scss$/,
loader: ExtractCSS.extract(["css", "sass?sourceMap"])
}, {
test: /\.js$/,
loader: 'babel',
exclude: /node_modules/,
include: __dirname
},
},
sassLoader: {
includePaths: "source/styles",
sourceMap: true
},
plugins: [ ExtractCSS ]
}
```
~~~
Klathmon
That is what we do as well, but if you use a UI toolkit that includes its own
styles, you'll need to override them. That's where the fun hacks like
.button.button.button come in to override the included styles.
~~~
pault
Oh yeah, I see what you're saying. I just don't bother with toolkits since I
find that once you factor in said fun, you aren't really saving much time, and
you might end up with a lot of jank. I just roll my own based on the needs of
the project and keep my cascade very flat and specific. Sass makes this really
easy using BEM style naming, where you can do this:
.my-component {
&_component-item {
&--open {
}
}
}
and you get a nice, flat output:
.my-component {}
.my-component_component-item {}
.my-component_component-item--open {}
I wish framework authors would adopt this approach as it completely eliminates
specificity conflicts.
------
butu5
Thanks very much @Palantir for open sourcing this React UI toolkit. This already
seems to be production ready. Big thanks for the detailed documentation. The
amount of time you have spent in creating examples and providing excellent
starting point is really great.
I really enjoyed it and am excited to see your take on Color Theme. Like bootstrap,
I don't think sites created using Blueprint will look the same. With minimal
changes in variables, layout and using your wide-ranging color theme, wonderful
results can be achieved in no time.
Big Thumbs up!!
I have created a small overview video about the various UI components available.
(No installation or tutorials, it only shows their various artifacts).
[https://www.youtube.com/watch?v=ky7ec5Sh2kM](https://www.youtube.com/watch?v=ky7ec5Sh2kM)
------
jasonkillian
Hi all, I work at Palantir and worked on Blueprint (although I'm currently
working on a different project). Happy to answer any questions you have about
it.
Just a note - we didn't intend to heavily publicize the project quite yet. For
example, the docs site is still a WIP, so apologies if it causes issues for
you on mobile devices.
~~~
literallycancer
Just a nitpick, but when looking at the docs, I can't move with Page Up/Page
Down unless I click in the "content column/area", kind of annoying when one
navigates using the left bar and then can't move around (on no-mouse setups,
e.g. trackpoint + keyboard). So you might want to look at that.
Otherwise no issues (older laptop i5/chrome), so I guess you guys mostly fixed
that by now.
~~~
jasonkillian
Good catch, we'll take a look at this. I've filed an issue to track it [0].
[0]:
[https://github.com/palantir/blueprint/issues/122](https://github.com/palantir/blueprint/issues/122)
------
dictum
Ah, memories. [http://blueprintcss.org/](http://blueprintcss.org/)
~~~
Jgrubb
That was the first CSS grid framework I remember coming across. Last updated
in 2011, it's not responsive at all it seems. Completely staggering how far
things have come in 5 years.
------
rq1
Your homepage pushed my CPU to 75% and resulted in an early click on the red
cross of the tab. (latest chrome stable/osx 10.11) You should do better.
~~~
3D4y0
Same thing happened to me (latest chrome/ubuntu 14.04)
------
mstijak
This is very slick. It's similar to my own project
[http://cx.codaxy.com/docs/widgets/date-
fields](http://cx.codaxy.com/docs/widgets/date-fields), however, it seems that
they went one step further.
~~~
crudbug
Impressive work.
[0] [http://cx.codaxy.com](http://cx.codaxy.com)
~~~
mstijak
Thanks. Here are a few more links for things implemented with Cx:
[http://cx.codaxy.com/fiddle/?f=vwyHzOO1](http://cx.codaxy.com/fiddle/?f=vwyHzOO1)
[http://cx.codaxy.com/starter/dashboards/sales](http://cx.codaxy.com/starter/dashboards/sales)
[https://codaxy.github.io/state-of-
js-2016-explorer/](https://codaxy.github.io/state-of-js-2016-explorer/)
[https://mstijak.github.io/tdo/](https://mstijak.github.io/tdo/)
------
config_yml
This is impressive, I can cover lots of UI scenarios with it. I also like the
first class CSS/HTML API that they have, so I could use them without React.
And they seem to have Accessibility covered (need to dive in a bit more), but
this is super important. And lastly, I dig the visual style, which has good
affordances. A nice contrast to Material Design.
------
gburt
Blueprint is an open source project developed at Palantir.
~~~
revelation
Haha, I'm sure some intern at Palantir is now also writing a Java SWT to
Blueprint translator. The poor soul.
~~~
Ciantic
Notice that this toolkit is written in TypeScript.
------
k__
I wish web based UI-toolkit creators would focus on WebComponents. They can
easily be used as leaf-elements in almost any framework.
~~~
xliiv
+1 for web components and Polymer
[https://beta.webcomponents.org/](https://beta.webcomponents.org/)
~~~
Fifer82
A lot of people go on about Polymer, but no one actually uses it?
~~~
ergo14
Hm, some of the biggest enterprises in the world and industry leaders use it.
I'm not sure why this is repeated on HN all over and over.
[https://youtu.be/VBbejeKHrjg?t=9m32s](https://youtu.be/VBbejeKHrjg?t=9m32s)
~~~
xliiv
BTW, Chrome browser's: PDF reader, download listing, new settings are Polymer
afaik.
Google uses [https://gaming.youtube.com/](https://gaming.youtube.com/)
IBM uses
[https://console.ng.bluemix.net/catalog/](https://console.ng.bluemix.net/catalog/)
~~~
ergo14
Theres tons more. Btw. "main" youtube is also switching to polymer.
------
sehr
If someone wants to be an exemplary citizen, rendering all of the components
on to a single page for us on mobile would be amazing.
~~~
ivan_ah
Yes, I was looking for a "kitchen sink" page and was disappointed not to find
one.
------
simple10
This looks great. Nice to see a complete UI toolkit with context menus,
sortable tables, hotkeys and editable text along with all the normal tooltips,
navs, and UI widgets. Bonus points for both sass and less support. Looking
forward to trying this out.
------
vesak
Is this page written in Blueprint? It makes my Firefox consume 100% CPU.
~~~
rainings
I am using the latest FF too. Very laggy.
------
SwellJoe
Despite the sluggishness, there's a lot of really likeable stuff in here and
it is beautiful (or, a lot more beautiful than I, or most developers, could
ever come up with without a team and some pro designers on staff).
I particularly enjoyed the piano example:
[http://blueprintjs.com/docs/#components.hotkeys](http://blueprintjs.com/docs/#components.hotkeys)
------
codycraven
Documentation is unusable on mobile... I hope that was due to someone making
some CSS mistakes with their page code and not the framework's code.
~~~
hornbaker
The framework, while polished, lacks any responsive grid components. A bit of
a glaring omission imho, in this age of Bootstrap and Semantic-UI giving you a
responsive grid out of the box.
------
karmajunkie
Maybe just me but I'm having a hard time seeing how this is substantially
different from or better than bootstrap or foundation with some react-based
wrappers like reactstrap. Seems like performance is an issue and you're just
as tightly coupled to the framework as with bootstrap/foundation. Am I missing
something?
------
GordonS
I tried to look at the site on my S7, but it ground it to a halt. Tried on my
i7 laptop, same result :/
------
olalonde
Unfortunately, mobile support is not on the roadmap:
[https://github.com/palantir/blueprint/issues/105#issuecommen...](https://github.com/palantir/blueprint/issues/105#issuecomment-260151357)
~~~
DeBraid
> Just as a caution though, the library, in general, is intended for desktop
> web applications. We haven't made mobile-compatibility a priority
> [https://news.ycombinator.com/user?id=jasonkillian](https://news.ycombinator.com/user?id=jasonkillian)
Presumably since React Native is a solid mobile alternative, but shouldn't
2016-vintage web UI frameworks be responsive?
I guess if you're a big company like Palantir, you have the resources to do
native mobile, so I see why they aren't making it a priority.
~~~
olalonde
Yes, I would really like to use this but don't have the time or resources to
build and maintain a separate mobile UI. With bootstrap/react-bootstrap,
things usually turn out OK/good on mobile without me having to really think
about it.
------
Edmond
I am the developer of HiveMind (crudzilla.com), I am currently stuck on what I
think would be the final feature of the platform that would fulfill my vision
of what a modern web application platform should be about.
That feature is a drag-n-drop UI construction mechanism that would basically
allow the user to glue together UI components (I call them UI parts) and write
code only as a fill-in-the-blank task to complete the application.
I think React, Angular and similar paradigms might eventually make this
possible. In the future, building web applications, especially typical business
crud applications with simple workflows wrapped around a UI, should not require
more than familiarity with a development product and basic programming.
------
foo303
Not to sound like a bummer here, but I'm not sure what kind of computer is
needed to open this webpage. Simply scrolling up and down that website causes
my computer to almost freeze. Firefox 49.0 user here. I would profile it, but
I'm afraid of having to restart the computer as a result.
Edit: It looks amazing. (Still the performance issue is reproducible easily)
~~~
adidahiya
Hi folks, I'm one of the developers of this project -- we hear your perf
concerns loud and clear and are tracking the issues :)
As mentioned elsewhere in this thread, this page was released a little too
early while we were still playing around with animations in the header. I've
gone ahead and disabled them for now, so you should see leaner CPU utilization
now. Thanks for your comments.
~~~
SwellJoe
Even now, it's still pretty sluggish, all around. My machine is, I think, as
big as can be expected (current gen i7, 16GB), though I am on a slow 4G
network. But, as others note, it is gorgeous, and the API appears well
thought-out, and very complete (or, at least, complete for the kinds of
systems UIs I build).
~~~
adriand
Weird, because on my iPhone 6, it works fine with no detectable performance
issues at all.
------
CGamesPlay
The documentation page doesn't render legibly on mobile. It looks nice but that
doesn't bode well for the library...
------
simple10
Question for Palantir devs: is this currently being used in production?
~~~
ethanbond
I'm a designer at Palantir (also write some code), and yes, it's used pretty
much everywhere.
~~~
mwww
Could you please show us a few screenshots? Thanks!
~~~
mlitwin
You can see some designs here
[https://dribbble.com/palantir](https://dribbble.com/palantir)
------
crudbug
ReactJS enlightened web developers with a _Component_ based
functional thinking/development model, which brought some order to front-end
UI development. And it has been one of the best things, with the community behind
it.
Having worked with both Desktop UI (Swing/GTK) and Server based UI toolkits
(JSF), the value of ReactJS is a standard Component Life cycle.
I would propose having a ReactUI component specification with a standard
version. The current specification [0] is not decoupled from implementation.
The separate spec and implementation numbering will enable multiple
implementations - ReactJS from Facebook being one.
I am thinking of something similar to the Reactive-Streams [1] specification and Reactive
Extensions [2], which have multiple implementations.
[0] [https://facebook.github.io/react/docs/react-
component.html](https://facebook.github.io/react/docs/react-component.html)
[1] [http://www.reactive-streams.org](http://www.reactive-streams.org)
[2] [https://github.com/ReactiveX](https://github.com/ReactiveX)
~~~
rpeden
React wasn't the first. :)
Some of us were out in the wilderness building GWT components back in
2009/2010\. And although compiling to JS was a pretty weird option at the
time, it mostly worked. Your app looked terrible if you just cobbled together
GWT's default components, but if you built your own components with custom
CSS, you could end up with a really nice looking application. We even used
immutability and one-way data flow where we could, though it wasn't culturally
ingrained the way it is in the React community.
I understand why GWT didn't become mainstream. Java was really disliked in the
JS community, even more than it is now. But having a big, maintainable web app
using DI and all other sorts of fun things was actually kind of fun 6-7 years
ago. I like React and Angular 2 better now, though.
------
rpwverheij
This looks really nice and well set up. I love the fact that you chose
typescript and bundle a good definitions file. I totally understand homepage
performance was not on the agenda yet, and it's really no big deal to me, but I
have to agree with others on the mobile support: this sadly crosses blueprint
off my list for 99% of my work, if not all. Will be coming back to see where the
project is going though!
------
awjr
Coincidentally I was watching "ReactNL 2016 Max Stoiber - Styling React.JS
applications"
[https://www.youtube.com/watch?v=19gqsBc_Cx0](https://www.youtube.com/watch?v=19gqsBc_Cx0)
Styling in React needs some sort of consistent way forward.
------
ENGNR
Maybe I need to take a few layers of tinfoil off the old hat, and I suspect
it's just good old fashioned open source generosity, but..
Given that it's from Palantir, is there any way this could become a security
attack vector at scale?
~~~
grzm
Fortunately, it's open source, fully open to code review!
------
NicoJuicy
A little bit slow on my mobile
~~~
lucideer
A little bit slow on my laptop, nevermind mobile. Not good signs.
~~~
elcct
Wanted to say this as well. Definitely a deal breaker.
~~~
sotojuan
Not too far off the average SPA experience.
~~~
elcct
That makes me sad
------
petemill
It seems you may have to include the css for the whole library on every
page[0]. I would have much preferred it if the css for each component is
included on your page as you require the component.
Though I admit it is confusing since the "Let's Get Started" section of the
homepage[1] does not mention including the global css.
[0] [http://blueprintjs.com/docs/](http://blueprintjs.com/docs/) [1]
[http://blueprintjs.com/](http://blueprintjs.com/)
------
pinouchon
Impressive.
One question: I couldn't find advanced form components, and specifically
dropdown/multiselects. (a bit like this:
[https://selectize.github.io/selectize.js/](https://selectize.github.io/selectize.js/),
or this: [http://wenzhixin.net.cn/p/multiple-select/docs/#the-
basics1](http://wenzhixin.net.cn/p/multiple-select/docs/#the-basics1))
Do you plan to implement this kind of components?
------
hokkos
This looks exceptional! It's full-featured (datepicker, tree) and professional
looking. It also seems to be written in TypeScript. But the slider needs work on
touchscreens.
------
ika
What I like the most is that the UI is just really nice and clean
------
aagat
There is also [http://react.semantic-ui.com/](http://react.semantic-ui.com/)
which I have found to be very useful.
------
adakbar
Those themes are gorgeous, where I can find those theme?
------
jules
The design looks amazing. I love how easy it is for "programmer ui design" to
look quite good these days.
------
IJP
Great work, it looks very nice! One thing I always find missing from most of
these React toolkits is a TreeTable component. This is a component that I
often require in business applications, and I suspect others will too. Is this
something that you would consider adding in the future?
------
revelation
I wonder why all the UI toolkits always come without some form of layout
support.
What is Qt without a GridLayout?
~~~
ggregoire
What are you talking about?
\- bootstrap contains a grid layout
\- semantic-ui contains a grid layout
\- bulma contains a grid layout
\- purecss contains a grid layout
\- tachyon contains a grid layout
\- skeleton contains a grid layout
\- milligram contains a grid layout
\- spectre contains a grid layout
and the list goes on.
~~~
revelation
That's nice but their idea of grid is to hand you a bunch of fixed-sizes CSS
you can slap on elements. That has very little to do with the actual
_layouting_ a UI toolkit like Qt provides for.
~~~
Kiro
I don't know anything about Qt but if it's so good, why hasn't someone made
something equivalent for web that just generates the underlying HTML/CSS?
------
sebringj
This is very well done. If they provided a React Native one as well, that
would be even cooler.
------
leesalminen
Docs are utterly broken on mobile.
------
ckluis
Everything looked good until the grid/table. Those are vastly
underpowered/under-featured versus other solutions (KendoUI, DevExpress, etc.) we are
currently using/have evaluated.
------
okigan
Cool/interactive logo on the web page -- how is that done ?
~~~
simple10
Canvas.
------
enturn
My RT-AC68U router flagged this website as having malware. Seems more related
to email than the website though.
Description: Sites whose addresses have been found in spam messages.
------
nawitus
For some reason this page is really slow on Firefox.
------
Numberwang
This looks amazingly good and well thought through.
------
jbverschoor
Site is super slow and jaggy on a MacBook Pro.
No demos
~~~
alfonsodev
I was about to give up, but I found in the documentation that one of the last
links on the sidebar is Components; there you can see a demo of a good list of
components.
[http://blueprintjs.com/docs/#components](http://blueprintjs.com/docs/#components)
------
WhitneyLand
Since you guys are taking a lot of insults here I want to try and offer you
something constructive.
Having bugs is ok. Failing at mobile and performance is not. It melts away
your credibility because doing these things right is table stakes.
This is all compounded by the fact that it's a toolkit that serves as base for
other developers, rather than just a slow app.
Finally, your flippant response to criticism gives the impression that you
don't understand or care about craftsmanship.
However, thanks for making the contribution. Look forward to trying your next
major release.
~~~
CSDude
Not all things are supposed to be mobile. All the products we write are
supposed to run on a large display for our case.
~~~
robbrown451
I still can't imagine tying yourself to a system that precludes mobile. That
seems like a terribly bad idea at this point.
~~~
ethanbond
For certain applications, you're right that's an awful idea.
For others, it's an equally bad idea to limit your toolbox based on an
irrelevant criterion.
~~~
robbrown451
I question how truly limiting that is. I mean, if you have a framework or
library that has the restriction that it won't work well on mobile -- unless
there is a really good reason for that restriction -- it seems to me that that
framework/library probably isn't ready for prime time and will either gain
mobile support, or will soon disappear.
------
mcs_
why did you publish this on Saturday??? now i have no excuse to try to replace
react-material...
------
moondev
This is really slick and fresh looking. Almost inspires me to build something
with it
~~~
ethanbond
You should go for it! It actually _feels_ good to use, which is one of those
rarer qualities to come across in these sorts of libraries.
Disclaimer: I'm a designer at Palantir and have been using it for quite a
while.
------
milesdyson_phd
very neat, i am definitely going to be trying this out on a new side project
------
cryptozeus
Not loading on safari ios 9
------
yummybear
The text is garbled (on top of each other) on Safari iOS.
------
samfisher83
Trying to load that site is slowing down my browser.
------
bytelayer
Is it possible to use this without React/NPM?
------
jasikpark
Smooth as butter in safari on my iPhone se.
------
andrethegiant
Very impressive! Keep up the good work.
------
alfonsodev
Thanks for this, I see great value in it; you got my GitHub star.
------
harrisrobin
This is amazing!
------
sauronlord
Renders and scrolls slow
------
pokebowl
I saw no favicon and closed this thing immediately.
Seriously guys, it's 2016 and you still don't have a favicon. Not having a
favicon on your docs page is plain bad for a few reasons:
1\. I have zero context about your page. If I don't have that 20x20 square on
my tab anchoring me to reality, my 2 second attention span will have made me
forget where I am even before I load your page on my internet-connected
toaster oven's 7 segment display. Not. Cool.
2\. No support for 7 segment or e-ink displays. WTF?? How am I supposed to use
this for IoT applications, like my toaster oven[1] or my dishwasher.
3\. This is how it shows up on my kindle:
[http://67.media.tumblr.com/tumblr_lhw2rvgsnu1qzhofn.jpg](http://67.media.tumblr.com/tumblr_lhw2rvgsnu1qzhofn.jpg)
ARE YOU INCOMPETENT??
[1] See bullet point 1
[2]
[https://static.googleusercontent.com/media/research.google.c...](https://static.googleusercontent.com/media/research.google.com/en//archive/mapreduce-
osdi04.pdf)
[3] [http://jepsen.io/](http://jepsen.io/)
[4] [https://palantir.com/spying-on-my-shit](https://palantir.com/spying-on-
my-shit)
~~~
dang
> _I saw no favicon and closed this thing immediately. Seriously guys, it 's
> 2016 and you still don't have a favicon._
This is the sort of knee-jerk dismissal that the Show HN guidelines ask you to
avoid when commenting here:
[https://news.ycombinator.com/showhn.html](https://news.ycombinator.com/showhn.html)
Favicons have their importance but it's just rude to swipe someone's entire
work off the table this way.
Edit: Doh. Carry on.
~~~
DanBC
I mean, it's obviously satire, and it's taking a shot at something you've
previously said is damaging to HN.
[https://news.ycombinator.com/item?id=9238739](https://news.ycombinator.com/item?id=9238739)
I'd agree that pokebowl's comment is a suboptimal way of getting that point
across.
~~~
dang
> _obviously_
Oops. I completely missed that.
This is what comes of moderating in haste.
------
AzzieElbab
Safe to say it sux.
~~~
dang
Please don't post uncivil, unsubstantive comments to Hacker News. We ban
accounts that do this repeatedly.
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)
[https://news.ycombinator.com/newswelcome.html](https://news.ycombinator.com/newswelcome.html)
------
macattack728
Cool stuff
------
lai
TBH, I expected more from what's supposed to be a Palantir project. I mean
come on, why is the site not mobile-first?
~~~
bearly
99% of our use cases are desktop-only – we build analytical tools. Our focus
is on making those experiences great.
| {
"pile_set_name": "HackerNews"
} |
GammaThingy – Open Source f.lux for your iPhone - nkron
https://github.com/thomasfinch/GammaThingy
======
walterbell
Screen temperature is a public health feature which should be implemented by
the OS. Until this feature arrives in iOS, the next best option is for apps to
implement it. The ebook reader Marvin and Koala web browser both have controls
for color temperature.
~~~
andrebalza1
Opera Mini browser has a night mode too; it dims the screen and lowers the
temperature. Works excellently. Feel free to add it to the list
------
notable_user
It's NOT open source as per the readme: "I purposely have not added an open
source license to the project and am intentionally retaining copyright for the
code, meaning that it may not be redistributed (in code or compiled form)
without my permission."
~~~
corv
Confusing!
The author states: "You may think it's dumb to have open sourced it then [...]
even if it's under copyright. [..] So basically, it's open source so people
can learn from it.
So this is public source code which is under copyright.
~~~
stevetrewick
AIUI most FLOSS code is under copyright [0] the distinction being that the
author(s) grant others a licence to use, copy and distribute under certain
restrictions.
It seems that the author here wants people to be able to read the code, but
not distribute it, primarily because he does not wish to tread on the toes of
f.lux' developers.
[0] In fact, AFAICT unless a creator specifically and explicitly waives such,
all work is copyright, because Berne treaty.
------
jonah
So, what is it doing that f.lux isn't/can't on non-jailbroken phones? I know
they'd need access to APIs that Apple doesn't make available.
~~~
wingerlang
I haven't looked into it, but I would assume it is hooking into private
frameworks and this is simply not allowed from an app on the App Store.
~~~
stevetrewick
Yes, it uses IOKit which is a private API on iOS.
| {
"pile_set_name": "HackerNews"
} |
Twitter quietly rolls out algo timeline - jlas
https://twitter.com/settings/account#personalize_timeline
======
jlas
Configurable under the Timeline option in account settings, also mentioned in
the docs:
[https://support.twitter.com/articles/164083#settings](https://support.twitter.com/articles/164083#settings)
| {
"pile_set_name": "HackerNews"
} |
Researchers explain why bicycles balance themselves - flyingyeti
http://www.news.cornell.edu/stories/April11/bicycle.html
======
dmlorenzetti
This publicity release mis-advertises what is new here.
From the publicity release: _Now, a new analysis says the commonly accepted
explanations are at least partly wrong. The accepted view: Bicycles are stable
because of the gyroscopic effect of the spinning front wheel or because the
front wheel "trails" behind the steering axis, or both._
From a Wikipedia article
(<http://en.wikipedia.org/wiki/Bicycle_and_motorcycle_dynamics>): _In 1970,
David E. H. Jones published an article in Physics Today showing that
gyroscopic effects are not necessary to balance a bicycle._
In fact, Jones' article addresses the steering axis question, as well.
What appears to be new here is the mathematical analysis. From Jones: _I have
not yet formalized all these contributions into a mathematical theory of the
bicycle..._
------
wiredfool
I had Ruina as a dynamics prof many years back. Good teacher, Good class.
I still remember that he was interested in self walking robots at the time,
powered only by gravity and walking down a slight slope.
~~~
kragen
How do those work? With the Jansen linkage?
~~~
wiredfool
I remember a three legged version, with the inner leg out of phase with the
outer two. This is a more refined two legged version,
<http://www.youtube.com/watch?v=_2pAMe_5VeY> .
Power here is gravity, in the form of a slight slope on the surface it's
walking on. The walking form is all from dynamic motion, no electronic
control.
It looks simpler than a jansen linkage, much more like the physiology of our
legs. (and there's a funny story there, Prof Ruina had a limp when he taught
the class, because he had implanted studs in the bones of one of his legs so
that they could motion capture the bone motion when he walked. Not sure if the
limp affected that.)
------
hoag
This is awesome: I have seriously been searching for an explanation on this
ever since I was a little kid and my dad bought me my first bike at about 3
years of age and immediately removed the training wheels.
------
vlisivka
> They built a bicycle with two small wheels, each matched with a counter-
> rotating disk to eliminate the gyro effects
...to double the gyro effects... or did I miss something?
Two counter-rotating disks will compensate for each other, BUT they must be
perfectly aligned, must they not?
In their bicycle, the upper disk produces about 2x the gyro effect of the
lower disk, because it moves with about 2x larger amplitude than the
lower disk at the same incline; thus we still have about 2x-1x=1x
gyro effect.
They should try their system on skates with two tiny blades instead of wheels.
PS.
Sorry for my English - I am Ukrainian.
PPS. The system needs to spend some energy to restore the vertical alignment of
an inclined body.
Thus they need to add a small incline to the bicycle and then find where motion
energy is spent by the bicycle to restore vertical alignment. If the bicycle is
unable to restore vertical alignment then it is not stable.
| {
"pile_set_name": "HackerNews"
} |
Dream It. Code It. Win It. competition launches - tradingscreen
http://www.tradingscreen.com/index.php/careers/mit-stem-ny-creative-code-competition
======
tradingscreen
The “Dream it. Code it. Win it.” competition has been launched by the MIT Club
of NY, MIT EF NY, and TradingScreen to celebrate and reward the creative
aspects of computer science in New York City.
The selection of semi-finalists will be done over a period of three months
through social media voting and a committee of volunteers. The contest will be
limited to full-time students who are 18 years or older at the time of
submission and enrolled at an accredited institution. At the final, live event
in New York City, the selected semi-finalists will present their creations
before a panel of successful computer scientists, entrepreneurs and investors
who will award the finalists cash and prizes which total over $30,000.
| {
"pile_set_name": "HackerNews"
} |
Show HN: I wrote “Cracking the UX interview” v0.1 - artiparty
https://productdesigninterview.com
======
artiparty
Hello HN!
I’m happy to share my book with you. Its goal is to help UX/product designers
to prepare for job interviews, practice their skills and build a designer
hiring process at their companies.
I believe there are not enough resources for designers to prepare for job
interviews (unlike for engineers/PMs). Often designers don’t know what to
expect from the interview. On the other side, many businesses don’t know how
to evaluate designers efficiently, especially their first design hire.
I decided to share my experience of hiring designers at WeWork and my first
step was sharing publicly the exercises we used during interviews
([https://blog.prototypr.io/product-design-exercises-we-use-
at...](https://blog.prototypr.io/product-design-exercises-we-use-at-wework-
interviews-2ee1f5a57319)). The feedback I received was great. After talking to
both designers and employers I decided to write this book that will help both
sides to improve their skills, create better expectations and eventually build
better products.
You can read more about the story behind the book here:
[https://productdesigninterview.com/story.html](https://productdesigninterview.com/story.html)
Thank you!
------
lozzo
Hi there, I followed the link and saw this on your web site:
How much time will it take me to read this book? The book has 158 pages, so it
should take you about 3-4 hours. I value your time, so I worked hard to keep
the noise-to-signal ratio high to make sure you can finish it over a weekend.
Well, unless I am getting this wrong, I think you mean the opposite. You want
to keep the noise-to-signal ratio low !!!
You see, errors like those discourage me from going any further with your
book.
That said, good luck with it.
| {
"pile_set_name": "HackerNews"
} |
Want to create jobs? Import entrepreneurs. - j0ncc
http://money.cnn.com/2009/12/14/smallbusiness/entrepreneur_visa/index.htm
======
Raphael
No, foster a climate where current citizens have incentive to start up.
~~~
pg
The two aren't mutually exclusive; you can do both.
| {
"pile_set_name": "HackerNews"
} |
Show HN: Feature flag management for feature lifecycle control - eharbaugh
https://launchdarkly.com/
======
eharbaugh
Hi, I’m Edith Harbaugh, CEO & co-founder of LaunchDarkly. Feature
flagging/toggling is a best practice of continuous delivery. Feature flagging
is easy - you can do it with a config file or a database field. However, once
you start feature flagging at scale, your feature flag config files can become
a junk drawer & a form of technical debt.
LaunchDarkly is a feature flag management system that allows you to scale up.
You get an intuitive UI that non-technical users can control feature flags
with - for example, a Product Manager can turn on functionality. We also give
you access level controls and audit logging - allowing you to control who can
change which feature flag and get visibility on these changes. As well, we
offer SDKs for multiple languages so you can use feature flags consistently,
across mobile and web. When you’re ready to use feature flags at scale, we
hope you’ll use LaunchDarkly.
Would love your feedback about how you’re using feature flags already, and
what you’d expect to see from a feature flag management system.
~~~
vinodkd
So how exactly does launchdarkly help with feature flags becoming a junk
drawer/tech debt? I can see how the intuitive ui would help in general, but
what I was really expecting to see was:
* A way to manage dependent feature flags, because god knows once those non-technical (and technical) users have switches they can throw, they're going to ask for/build features that are dependent on each other in subtle ways
* A distinction between a true feature flag and what I call a "release" flag, ie, a feature that's not yet fully built, but that will not be optional once fully built and released, so its flag goes away at some point.
Also, btw, do you have an API for the flag creation itself, not its use, ie a
addFlag()/removeFlag() API?
~~~
jkodumal
We do have some features to help manage the lifecycle of feature flags. For
example, we can determine when a flag has been "flipped on" for everyone, and
notify you that it's time to remove it. We can also determine that a flag has
been removed from your codebase, and prompt you to remove it from LD
([http://blog.launchdarkly.com/launched-flag-status-
indicators...](http://blog.launchdarkly.com/launched-flag-status-
indicators/)).
We have more coming, including an ability to mark "permanent" flags that
should never be removed.
The product is built API-first-- everything in our UI is driven via our own
REST API. Docs are here:
[http://apidocs.launchdarkly.com/](http://apidocs.launchdarkly.com/)
~~~
vinodkd
Thanks for the response. This is the confusing part for me: I've been talking
in terms of Feature toggles as described by Martin Fowler:
[http://martinfowler.com/articles/feature-
toggles.html](http://martinfowler.com/articles/feature-toggles.html), but it
looks like LD is focusing on the AB testing piece primarily, with some parts
of the base toggle functionality included by default.
Can there be a feature switch that is configured not based on users, but on
owner/admin's choice alone?
More importantly, how do the LD SDKs help with the issues mentioned in the
"Implementation Techniques" section of Fowler's essay - things like avoiding
conditionals, DI, etc?
~~~
jkodumal
Our goal is to build a developer-focused platform that gives teams the ability
to adopt feature flags as a core part of their dev cycle-- including pretty
much all of the use cases described in Fowler's article (ops, permissioning,
etc.). Many of our customers are not using us for A/B testing at all.
re: conditionals, DI: our SDKs focus on the base technique (flags based on
conditionals), but it's possible to wrap that core with higher-level
approaches like DI, etc. We're considering going down that path, but where it
makes sense to do so, and not have to re-implement higher-level wrappers for
every framework out there. So, e.g.-- in Ruby, I can see us providing higher-
level APIs specific to Rails.
Dependent feature flags are also an interesting problem-- we don't have a
solution to that yet, but we do spend a lot of time thinking about feature
flags, and I hope we'll have something on that front soon :)
------
tldnr
Seems like a great product! Our inhouse flagging system is painful to use when
applying features en-masse, and makes "fast" rollbacks virtually impossible.
That, and other reasons, make this seem like a good alternative.
What's the flow like for a developer creating their own environment? I'm a
little concerned that asking developers to visit a third party site/be
connected to the internet to use feature flags locally would be pretty inconvenient,
especially if the developer is working offline. Would you consider adding a
simple server implementation that could run locally and do feature flagging on
a per user basis (rather than rolling out to percentages, to prevent misuse)?
~~~
pkaeding
For local development, you can create an 'environment' for each dev, which
will maintain a separate set of targeting/rollout rules for each feature. This
will make development/testing fast when you _are_ connected to the Internet.
You can flip the toggle value on the LaunchDarkly dashboard, and it will be
instantly updated the next time your code gets the feature flag.
You can also provide a default value that will be used if the network is not
available. You can read from a config file to drive this if you need to, to
allow devs to specify the flag value when offline. This is what we do for our
dogfood environment (prod, staging, and dev environments all talk to dogfood,
but dogfood doesn't have another LaunchDarkly to talk to).
Hope that helps!
------
swsieber
Meta question: Does launch darkly use launch darkly for new features? Or is it
mostly stable or branching out into new areas (like the Android support)?
~~~
jkodumal
Of course-- we have a dogfood instance of LaunchDarkly. We use LD as our plan
permissioning system (the plans you see on our pricing page are managed by
LD). We also use it for ops-- we migrated a key piece of our analytics
infrastructure over to DynamoDB, for example, and controlled the rollout of
that via a feature flag.
And of course, we'll launch new features behind a flag. I'll admit that we've
had two or three occasions where we had to hit the kill switch after a deploy,
so I'm rather glad we had a LaunchDarkly for our LaunchDarkly.
[edit] here's a shot of our dogfood instance:
[https://twitter.com/jkodumal/status/717786744750477312](https://twitter.com/jkodumal/status/717786744750477312)
------
misterkwon
can't you just use an open source library for this? how many feature flags
would you even need for this to be worth it?
~~~
eharbaugh
Good question - teams should do what’s best for them based on their resources
and scope of feature flagging needs. Though open source libraries exist,
they're per language (Ruby, Python). We offer support for all major languages
+ iOS. As well, we have environment support and access control levels to use
feature flags effectively throughout development. You can read more here:
[http://blog.launchdarkly.com/enterprise-requirements-for-
man...](http://blog.launchdarkly.com/enterprise-requirements-for-managing-
feature-flags/)
~~~
omegaworks
Any hope for Android support?
~~~
eharbaugh
Yes, we are planning on adding Android this summer. Are you interested in
being a beta user? Write us at support at launchdarkly.com
------
amituuush
do you support single page apps?
~~~
apucacao
Yes we do. Checkout the documentation for front-end flags at
[http://docs.launchdarkly.com/docs/front-end-
flags](http://docs.launchdarkly.com/docs/front-end-flags).
Let us know if you have any questions.
Disclaimer: I'm a front-end developer at LaunchDarkly.
------
vishalsankhla
Looks awesome! How long does it take to integrate?
~~~
pkaeding
It is really quick to get started. You just drop our SDK into your project,
and add a few lines to your code around where you want to toggle. Our docs [1]
give examples in each language that we support.
[1]: [http://docs.launchdarkly.com/](http://docs.launchdarkly.com/)
nb: I'm an engineer at LaunchDarkly
------
kzhahou
In the spirit of the HN community feel, "Show HN" has always been about
personal, side, or small projects. Never a link to a company's page.
~~~
dang
Oh no, that's mistaken. Show HN is for something you've made that people can
try out. The project can be, and often has been, a company with customers.
~~~
kzhahou
Weird, I don't think I'd seen a company's landing page as a Show HN before.
Though the thing I'm picking up might be more the fact that this company
didn't just launch today. I.e., they're not showing us something new today.
Their product has been out for, what, at least a year? So if "Show HN" is ok
here, then really any company's social media person can just repeatedly post
links to themselves...
~~~
dang
No, because (a) it has to be posted by the creator and (b) they can't do it
repeatedly.
| {
"pile_set_name": "HackerNews"
} |
Is there a good English wordlist with common words, for free download? - pramodbiligiri
We are doing entity extraction for documents specific to a domain. Unfortunately our domain specific index contains many common English words, and we would like to take them out or weigh them much lower.

I'm trying to choose between WordNet, Google Ngrams (too big!), and Moby Wordlist from Sheffield University. Any suggestions?
======
bediger4000
Look at the file named "eign" in the GNU troff distribution. I use it as a
"stop word" list and it seems to work pretty well.
~~~
pramodbiligiri
Oh, had never heard of this. This looks good for extremely common short words.
The version I have has only 133 words though. I'm looking for something in the
range of a few thousand words at least.
------
mindcrime
How about:
[http://jmlr.csail.mit.edu/papers/volume5/lewis04a/a11-smart-...](http://jmlr.csail.mit.edu/papers/volume5/lewis04a/a11-smart-
stop-list/english.stop)
Also, see previous discussion at:
[http://stackoverflow.com/questions/1218335/stop-words-
list-f...](http://stackoverflow.com/questions/1218335/stop-words-list-for-
english)
| {
"pile_set_name": "HackerNews"
} |
Mark Andreesen on How to Kill the Stock Market - AndrewKemendo
http://www.businessinsider.com/andreesen-on-how-to-kill-stock-market-2014-3#!CcuNq
======
lexcorvus
Both first and last names are misspelled in the title:
wrong: Mark Andreesen
right: Marc Andreessen
| {
"pile_set_name": "HackerNews"
} |
Convergence: complete HTML5 game - DanielRibeiro
http://www.currantcat.com/convergence/
======
gammabeam
Hello there, I'm the one who made The Convergence! Thanks for viewing my game,
I hope everyone enjoyed it!
I didn't have a lot of time to finish it, but I wanted to add the sounds a
friend of mine did, as well as a logo for the awesome CONSTRUCT the devs
mentioned here.
Me & my girl are amazed by the amount of people that played and shared the
game - it's nice to see such positive feelings spreading all around!
I want to polish the game a bit more, I feel like the mechanics deserve it! :D
Thanks for making our day even more special!
~~~
tagawa
You deserve the attention! It's a great game and refreshingly original. I've
got several colleagues playing it now. Looking forward to the planned
improvements...
~~~
gammabeam
Thanks!
------
TomGullen
Hi guys! I'm Tom from Scirra and we made Construct 2
(<http://www.scirra.com>).
This is a great game made with our engine, the author has also made it
available on the Chrome Web Store:
[https://chrome.google.com/webstore/detail/lkiiendkaiacnmggpp...](https://chrome.google.com/webstore/detail/lkiiendkaiacnmggppdckogcgmjaoapf)
This game of course deserves its own post as it's greatly executed, and
without meaning to steal its thunder we would love to show you a couple more
excellent games made with Construct 2!
Can't Turn it Off <http://www.scirra.com/construct2/demos/cant-turn-it-off>
Trashoid Attack <http://www.scirra.com/construct2/demos/trashoid-attack>
We also ran a small competition on our website and there's a bunch more games
to look at here: [http://www.scirra.com/forum/users-choice-final-
poll_topic458...](http://www.scirra.com/forum/users-choice-final-
poll_topic45844.html)
Anyway it's really promising what is coming out of Construct 2 now, me and my
brother (just two of us running Scirra) love playing games like these that get
made! We hope to see lots more :D
All games made in Construct 2 are all pure HTML5, not a whiff of Flash in
sight!
~~~
johnyzee
> _not a whiff of Flash in sight!_
How do you handle sound? HTML5 sound is still very insufficient for games at
the moment in my experience.
~~~
TomGullen
This game doesn't have sound, but Construct 2 does allow for it. We've
actually blogged about HTML5 sound quite a lot:
[http://www.scirra.com/blog/46/more-on-html5-audio-codecs-
and...](http://www.scirra.com/blog/46/more-on-html5-audio-codecs-and-politics)
It's a bit of a minefield and is difficult to manage. MP3 has some strict
licensing rules in regards to distributing games so we made a point of
avoiding it. What we do is dual encode every sound to .ogg and .m4a, this will
cover all browsers/devices. It's not the best solution space wise but it's
safe and works pretty well.
We hope browsers make significant upgrades to HTML5 audio at some point! The
easy option is to use Flash to handle sounds but we think this is cheating,
we're kinda HTML5 purists :)
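As a rough illustration of the client side of that dual-encoding approach, assuming both encodings sit next to each other on the server (file names are made up):

    // Pick whichever encoding the current browser can play, preferring Ogg Vorbis.
    function pickAudioSource(baseName: string): string {
      const probe = document.createElement("audio");
      // canPlayType returns "", "maybe" or "probably".
      if (probe.canPlayType('audio/ogg; codecs="vorbis"') !== "") {
        return `${baseName}.ogg`;
      }
      return `${baseName}.m4a`;
    }

    const jumpSound = new Audio(pickAudioSource("sfx/jump"));
    jumpSound.play();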
------
phoboslab
Shameless plug: my HTML5 Game Engine <http://impactjs.com/> is a perfect fit
for games like these.
This particular game seems to be made with Construct2 (
<http://www.scirra.com/construct2> ) though - which is a bit more "point-and-
clicky" than Impact.
~~~
TomGullen
Construct 2 is a very powerful engine, and a lot of great games are being made
in it! Describing it as 'point-and-clicky' really doesn't justify it at all.
~~~
phoboslab
I didn't mean "point-and-clicky" in a negative way; sorry if it sounded like
it.
Construct 2 describes itself as "A visual HTML5 game development tool", Impact
however is a programming framework. Someone without programming experience
wouldn't be able to do much with Impact, but may still be able to make great
games with Construct 2.
We're focusing on different target groups and in my book that's great. It's
what HTML5 needs to succeed :)
~~~
AshleysBrain
Yeah, hopefully between us we can silence the Flash critics who claim there
aren't any good tools available :)
------
joeyh
Cute.
Having the arrow keys move the woman in reverse puts an odd male perspective
on it though. This could be fixed by making the arrow keys move in the
direction of whichever character is currently right-side-up.
Edit: Having the viewport follow the guy also contributes to this on scrolling
levels, and I don't know how to fix it, aside from just having a character
select at start.
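A tiny sketch of the suggested control scheme, assuming a simple orientation flag (names are illustrative, not from the game's actual code):

    // One of the two mirrored characters is "right-side-up" at any moment.
    // Interpret the arrow keys from that character's perspective, so pressing
    // right always moves the upright character to their right.
    function worldDirection(arrowRight: boolean, uprightIsMirrored: boolean): -1 | 0 | 1 {
      if (!arrowRight) return 0;
      // If the upright character is the mirrored one, flip the input so the
      // on-screen motion matches what that character perceives as "right".
      return uprightIsMirrored ? -1 : 1;
    }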
~~~
fiblye
This is the biggest non-issue in any game ever. I can't even imagine how
anybody's first thought would be "how DARE the man be directly controlled by
the arrow keys instead of the woman." I've made games for my girlfriend
several times, with some games having her control me and others controlling
herself, and neither time did we think what character was being controlled was
an issue.
Hell, I actually read this post to her, and she was just completely confused
and annoyed that someone found this a problem.
There's no sexism to be found here. The male just happens to appear on-screen
first and that's who you control. I'd struggle to find a single woman that
would've even noticed that.
~~~
true_religion
> I'd struggle to find a single woman that would've even noticed that.
This is kind of ridiculous. Clearly, people are noticing and some of the
people who notice might be women.
I noticed because I assumed which ever character was right-side up would be
controlled by the arrow keys. I didn't think it was sexist. I thought it was a
flaw in the game.
~~~
fiblye
People tend to focus on characters instead of directions. Since the male was
the first person I controlled, it was natural that I kept on using the arrow
keys to control him and I didn't even take a second to think about it. Nobody
would expect to be moving right, hit the flip button, and then start moving
left just to "prove the developer treats genders equally." The control scheme
would be disorienting and far more people would complain about that. And if
you suggest the player select their gender at the start, then puzzles would
have to be adjusted to suit this. It's needless effort to appease people that
the game wasn't even made for.
And judging by the name, I'm assuming joeyh is a male, and the only other
person to be "bothered" by it is also a male (dgreensp), so my claim isn't
exactly baseless.
Besides, this was a game made by a couple. There's outrage where there
shouldn't be any. This guy's partner doesn't need to be "liberated from male-
dominated control schemes."
~~~
blasdel
You don't understand at all — the whole point is that you're concurrently
controlling both characters, but then for some reason it's always from the
player's perspective of the male. It's a bug.
The game starts with the male on the bottom; I press right, and both
characters move to their right. When you flip the male to the top, both
characters move to their left when you press right. I expect my keyboard
movements to directly correspond to both characters from their own
perspective.
------
fiblye
Here's a preview of the HTML5 game I've been slowly working on for a year:
<http://ektomarch.com/games/>
I already have 50 levels made, most of which are the size of large Super
Metroid rooms, and ~60 enemy types. I'm just overwhelmed with work lately and
haven't been able to make much progress.
~~~
city41
I remember seeing this a while back and looking forward to it. Is it playable
anywhere?
------
taylorfausak
This is cool! Reminds me of VVVVVV (<http://thelettervsixtim.es/>), which had
a similar section where you controlled two characters at once. It's a little
strange that the secondary character moves even if the primary one is running
into a wall, but it looks like the puzzles require that.
~~~
arckiearc
It's a lot like Binary Land, a game from the mid 80s:
<http://youtu.be/NLI415emLzQ?t=42s>
------
micheljansen
It's actually quite fun to play. I think this is the first time I have said
this about a HTML5 game/tech demo :)
~~~
phoboslab
Good gameplay is not a property of the platform (e.g. HTML5 or Flash), but of
the game itself. Since HTML5 is still very young, there are mostly web devs
building games with it - which often results in games that "feel" wrong.
But I agree, this game is a very nice example of how to do it right.
Also try <http://www.phoboslab.org/ztype/> or <http://playbiolab.com/> :)
------
ahrjay
Looks good, I did notice that the appcache has the wrong mime type it's
text/plain and needs to be text/cache-manifest.
~~~
AshleysBrain
Scirra dev here. This is indeed a problem. The thing is Chrome is ultra-strict
about the MIME type and rejects it if it's wrong. Firefox seems happy with it.
Given HTML5 is new and nobody has their servers set up with this MIME type,
and lots of people are already making HTML5 games with our tool, I'm tempted
to say it's Chrome's fault for being too strict... getting everyone to
reconfigure their servers is a real PITA just because Chrome is fussy.
~~~
ahrjay
It's not being "ultra-strict"; it's a requirement of the spec[1] that the file
be served with the correct mimetype.
[Edit] It doesn't work in firefox, the attribute triggers the permission bar
but because it's the wrong mimetype it doesn't actually store anything.
Check it in Tools > Options > Advanced > Network
You'll notice that it has the domain referenced but has 0 bytes stored.
[1] [http://www.whatwg.org/specs/web-apps/current-
work/multipage/...](http://www.whatwg.org/specs/web-apps/current-
work/multipage/offline.html)
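For anyone hitting this, a minimal sketch of serving the manifest with the required type from a small Node server (the manifest file name is just an example; in practice most people would instead add the type to their web server's MIME configuration):

    import { createServer } from "http";
    import { readFile } from "fs";

    // Serve the offline manifest with the MIME type the HTML5 spec requires;
    // everything else falls through with a generic type for brevity.
    createServer((req, res) => {
      if (req.url === "/offline.appcache") {
        readFile("offline.appcache", (err, data) => {
          if (err) { res.writeHead(404); res.end(); return; }
          res.writeHead(200, { "Content-Type": "text/cache-manifest" });
          res.end(data);
        });
      } else {
        res.writeHead(200, { "Content-Type": "text/html" });
        res.end("<html manifest='/offline.appcache'></html>");
      }
    }).listen(8080);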
~~~
AshleysBrain
Ah - well I guess the spec is written with the future in mind when everyone's
got their servers configured correctly. Until then nobody has their servers
set up right so I think it would be practical for browsers to be a bit more
relaxed. Right now we're seeing a lot of games on servers without this MIME
type, and the browsers respond by requesting all files from the server all
over again every single page reload, probably wasting a lot of bandwidth and
making HTML5 games look bad. I think there's a good case for a bit of
relaxation on the browser side - standards compliance is good, but not if the
web isn't ready yet.
~~~
Pewpewarrows
No... just no. If browsers are relaxed now, then they are forced to provide
backwards-compatibility for all the sites that then rely on those relaxed
standards. Why do you think Internet Explorer is a complete mess and have dug
themselves into a whole of having to ship half a dozen "compatibility modes"
with every new version of their browser? Because they didn't stick to
standards.
------
atacrawl
This is really well done, and would translate _very_ well to a touch
interface.
------
headbiznatch
Fun little game where when the twist became opaque, I immediately was excited
to play more levels. That's a game design win. Thanks for sharing.
------
websymphony
Cool concept and fun to play. Good job.
------
nate
That is awesome. Nice work.
------
RyanMcGreal
Charming concept, executed with great skill.
Note: flickers madly on Firefox 7.0.1/Windows XP, but works smoothly on Chrome
15.0.874.106 m/Windows XP.
------
ByteMuse
This is great. The level design is fantastic - a fair difficulty and fun
progression.
------
terrapinbear
How is Convergence a "complete" HTML 5 game without sound?
------
tomrod
Does anyone else think the guy looks a little like Hitler? Perhaps a different
nose...
Just saying. It's hard to tell what exactly I see with 8bit sprites! :-)
~~~
vshade
He looks like a lumberjack
| {
"pile_set_name": "HackerNews"
} |
Software should be designed to last - kiyanwang
https://adlrocha.substack.com/p/adlrocha-software-should-be-designed
======
d_burfoot
I don't think anyone who is serious about software craftsmanship would dispute
the basic idea here ("you should be tasteful and minimalist when choosing
dependencies").
The problem is that the economics of software development don't really ever
require craftsmanship. There are two basic modes of development. 1 - You are a
startup, struggling to survive; you have to ship features as fast as possible,
with many fewer engineers than you really need to do things right. 2 - You are
a huge billion dollar corporation, probably a monopoly, with an enormous moat
protecting you from serious competition.
Neither situation prizes craftsmanship. The startup just needs to glue
together a hack to raise the next founding round or solve the immediate user
problem. The gigacorp has more money than it knows what to do with, so they
"solve" every software problem by hiring more engineers. There's actually not
many companies who exist in the middle ground, where it might be important to
produce high quality software. I believe this is also the reason we don't see
more uptake of high concept languages like Haskell and Lisp.
~~~
forgotmypw17
I'm writing something designed to last, and here is how I am doing it:
1) I am doing it by myself, without any corporate sponsors or financial pay.
2) I am writing it in such a way that it could have already been working for
25 years.
~~~
thundergolfer
I like that 2nd criterion. Seems like a smart way to avoid components that are
much more likely to suffer software decay.
Have you been programming for 25 years?
~~~
tluyben2
> Have you been programming for 25 years?
I have been for longer than that, and it is really scary to see that
software I wrote in the early 90s is still used. I thought, at the time, it would be
rewritten in something cooler every few years. But nope. It does make me pick
tech that can or will last; not many dependencies, open source, jvm, .net core
or C/C++, so it is possible to revisit it in 20 years and not be completely
lost (like I imagine it would be with js frameworks/libs/deps at the moment).
------
stupidcar
What this article really boils down to is the usual coder arrogance: Thinking
that most problems are inherently simple[1], thinking they are a better coder
than average, and thinking that the answer is always to do more yourself, so
that more of the codebase will benefit from your superior skills.
The truth is, there are no easy problems in programming. There are always a
thousand corner cases and unexpected complexities. And all those buggy,
unmaintained libraries you find in your language's package repository are
still probably better than anything you're going to roll yourself. Look into
the source code of such libraries and alongside the hacks and outdated code,
you'll find hundreds of subtle problems avoided, because the author spent a
long time and effort on the problem you've spent ten minutes thinking about.
The answer to quality problems in software engineering is not less libraries,
it's more and better libraries. In the same way that the high-quality civil
engineering the author admires is not achieved by having one person do
everything themselves, but by the successful collaboration of a large number
of skilled and specialised subcontractors.
Right now, we're still in the infancy of our industry. This, combined with
the continued evolution of hardware capabilities, has meant that software
engineering has never had the long-term, stable base on which to build
repeatable, reliable routes to success. That's fine. If we're in the same
position in a couple of centuries, it's a problem, but right now we should be
experimenting and failing.
Even today, the situation is not as bleak as if often portrayed in articles
like this. There are a huge number of robust and high-quality libraries and
frameworks providing extremely complex capabilities in an easily reusable way.
If a programmer from the mid-nineties were to be transported to the present, I
suspect they'd be amazed by what off-the-shelf libraries enable even a single
programmer to do. The undeniable quality problems that do affect much modern
software often have more to do with the hugely increased scope of ambition in
what we expect software to do for us.
[1] Outside a handful, like encryption, that are acknowledged as hard for the
purpose of being the exception that proves the rule.
~~~
tomlagier
> The answer to quality problems in software engineering is not less
> libraries, it's more and better libraries. In the same way that the high-
> quality civil engineering the author admires is not achieved by having one
> person do everything themselves, but by the successful collaboration of a
> large number of skilled and specialised subcontractors.
I think one of the fundamental problems with our industry is that there is no
funding mechanism for those subcontractors. The "base" that we build on is
either unpaid volunteers, or tailored to a specific companies needs and
goodwill.
If free software was not the norm, I think we'd see a lot of investment and
enterprise around producing high quality libraries.
~~~
anchpop
I recently wrote about this problem here [1]. I think "free" software should
still be the norm, but companies who use it should have to pay into a fund
that distributes the money to where it can be used most productively to make
new and more useful libraries. Otherwise there's no sustainable way for most
people to contribute to open-source and be paid for it.
[1]: [https://blog.andrepopovitch.com/complement-collution-
paradox...](https://blog.andrepopovitch.com/complement-collution-paradox/)
------
GnarfGnarf
I would like to answer the question "Why is software so frequently
disappointing and flawed? "
As a developer, there is nothing I would like better than to turn out the
highest quality software, as long as I could be compensated for it.
Does anyone really think that the millions of lines of code that went into
Windows is worth the piddling $100+ the consumer pays for it?
Let's start with a basic premise: quality is worth money. I'm sure we agree
that a Hyundai (or whatever passes for a cheap car in your neighbourhood)
costs less than a Porsche or BMW because the Porsche is better designed and
better built. Better design means more experienced and brilliant engineers,
more talented designers; better built means more skilled and dedicated
assembly-line workers. All these people demand more money. Hence the Porsche
company commands a higher price for their Carreras, 911s, whatever.
The same should be true of software. If a company invests great care in
designing a better operating system, or a better word processor, that never
crashes, and always has helpful help and meaningful error messages, how much do
you think that would be worth? I'll give you a hint: the military do in fact
get top-quality software for their jets and rockets. They get software that
almost never fails, and does exactly what it is designed to do.
Do you know how much this software costs? $50 per line of code.
Translating into everyday terms, a bullet-proof operating system would cost
you, at a rough guess, $5,000 per copy.
Now I have no doubt some people would be happy to pay $5,000 for a stable OS.
However, there are many people who couldn't afford this amount.
So what would happen? In any other field (automobiles, stereos, TVs,
restaurant meals, housing) people who can't afford quality just put up with
less and shut up.
But in the software field... well, they just make a copy of someone else's
software, and enjoy the full benefit of top-of-the-line quality, without
paying for it. I wager even you couldn't resist obtaining a $5,000 OS for
free.
How long do you think a software company would last if their product cost
millions to make, and they only sold a few copies at $5,000? Why they would go
broke, of course.
This is the crux of the software dilemma: except in a few specialized cases
(commercial or embedded software), the maximum price for software is the
monetary equivalent of the nuisance value of duplicating it.
In consumer software, this is in the range of $19-$29.
The digital world turns the economics of quality upside-down: in traditional
models where quality is supported by price, the market pays the price if it
wants the quality.
In the digital model, a perfect copy of the merchandise costs virtually
nothing, and undercuts the legitimate market, putting a cap on the maximum
that can be charged for a product.
There is a built-in limit to how much time, effort and expense a company can
invest into a mass-produced product. This cap is equivalent to the "nuisance
value" defined above. It is not reasonable for the consumer to expect
warranties and liabilities that go way beyond what the manufacturer receives
from sales of the product.
The music and movie industry are wrestling with the consequences of easy
digital duplication. They have taken a different route to protecting their
intellectual property.
I challenge anyone to come up with a business model where the software
developer that invests great expense in building a quality product, can obtain
full compensation from the market segment that values his quality.
Whose fault is it anyway? Simple: it's the consumer who copies and pirates
software that forces the price down and therefore the quality to remain low.
Any analysis that does not take this into account is simplistic.
It is naïve to think that most developers are not struggling to make ends meet
and stay in business (Microsoft notwithstanding).
~~~
justinclift
What are your thoughts on where OSS fits into this?
~~~
GnarfGnarf
[https://progenygenealogy.blogspot.com/2018/01/the-open-
sourc...](https://progenygenealogy.blogspot.com/2018/01/the-open-source-
mirage.html)
------
SigmundA
I want to build software that lasts, but I don't think you really can at this
point.
We haven't even figured out fully how we should represent a string or date or
number in software let alone have an enduring language or ABI.
I feel like I am building everything on a foundation of sand and it will need
to be rebuilt every 5 years, 10 if you're lucky.
I do think it will change, but it will be awhile and by then it will probably
just be a few big players making software anyway, kinda like car companies.
~~~
cmroanirgo
I've had the same exe running in client installs since the late 90s. It's had
a tweak or two here and there, but largely unchanged. I last recompiled it
about 2yrs ago and had a fresh install of it about 6 months ago.
It's not the prettiest thing and could have certain upgrades in light of new
security practices (eg use new encryption algos & go thru an in-depth security
audit). But to the clients, it's unbeatable and there's nothing remotely close
to migrate to.
Unbelievably, it's a vb6 app that I never bothered porting to dotnet. Even
more unbelievably, it's a port of its predecessor, written in turbo pascal. As
long as I can continue to find the dev environment installer, it's good to go.
So. I think a large part of the problem is that people don't value their dev
environment enough to keep an install disk (if one was available in the first
place).
A second large problem is that modern development relies heavily on 3rd party
library use, which means your software is reliant on more than one company for
your binary.
So. Find an environment that you can archive/keep, and ignore the not invented
here rule to a large extent
~~~
user5994461
>>> I've got the same exe running in client installs since the late 90's.
That's on Windows, which has a stable API and ABI. The 90's is probably the limit
because older software is 16-bit and doesn't work on current 64-bit Windows.
On Linux all software breaks with each distro release because "core" libraries
are unstable. Got executable and libraries compiled on RHEL 6 and most of them
fail to load on RHEL 7 because something.so not found.
~~~
laumars
You can bundle the .so files in Linux just like you would bundle .dll files in
Windows. Or statically compile if you don’t want dynamic libraries. Linux and
Windows aren't really any different there, aside from Windows having more in the
way of novice friendly tools to create redistributables.
It’s also worth noting that you’re comparing Apples to Oranges in that Visual
Basic 6 is a very different language to C++. VB6 has its own warts when it
comes to archiving, such as its dependence on OCXs and how they require
registering for use (they can't just exist in the file system like DLL and SO
libraries; OCXs require their UUIDs to be loaded into the Windows Registry first).
To further my previous point, if you wanted to use another language on Linux,
maybe one that targets the OS ABIs directly (eg Go), then you might find it
would live longer without needing recompiling. Contrary to your statement
about user space libraries, Linux ABIs don’t break often. Or you could use a
JIT language like Perl or Python. Granted you are then introducing a
dependency (their runtime environment being available on the target machine)
but modern Perl 5 is still backwards compatible with earlier versions of Perl
5 released in the 90s (same timescale as the VB6 example you’d given except
Perl is still maintained whereas VB6 is not).
~~~
PaulDavisThe1st
_Linux_ ABIs might not break often.
But GTK, Qt, libwhatever ? More often than one would like.
~~~
laumars
I’d discussed that problem in the first paragraph of my post.
~~~
user5994461
For reference, the C++ ABI is changing with every minor version of gcc, making
compiled libraries like glibc or Qt incompatible. Major distro releases
upgrade gcc so all packages are incompatible.
I agree on compiling statically to avoid DLL hell. However it is fairly
difficult in practice because projects rarely document how to statically build
them, and they very often take a dependency on some libraries on the system. All
it takes is one dynamic dependency to break (libstdc++ is not stable for
example).
~~~
laumars
> _For reference, the C++ ABI is changing with every minor version of gcc,
> making compiled libraries like glibc or Qt incompatible. Major distro
> releases upgrade gcc so all packages are incompatible._
It’s actually not as dramatic as that and you can still ship libc as a
dependency of your project like I described if you really had to. It’s vaguely
equivalent in that regard to a Docker container or chroot except you’re not
sandboxing the application's running directory.
This is something I’ve personally done many times on both Linux and a some
UNIXes too (because I’ve had a binary but for various different reasons didn’t
have access to the source or build tools).
I’ve even run Linux ELFs on FreeBSD using a series of hacks, one of them being
the above.
Back before Docker and treating servers like cattle were a thing, us sysadmins
would often have some highly creative solutions running on our pet servers.
> _I agree on compiling statically to avoid DLL hell. However it is fairly
> difficult in practice because software rarely document how to statically
> build them and they very often take dependency to some libraries on the
> system. All it takes is one dynamic dependency to break (libstdc++ is not
> stable for example)._
There are a couple of commonly used flags but usually reading through the
Makefile or configure.sh would give the game away. It has been a while since
my build pipelines required me to build the world from source but I don’t
recall running into any issues I couldn’t resolve back when I did need to
commonly compile stuff from source.
------
hyko
The idea that removing external dependencies necessarily simplifies your code
and leads to a maintainable system is a fallacy. Use external dependencies
appropriately, follow SOLID principles and a clean architecture and you can
leverage the benefits of other people’s code without losing the
intelligibility of your system.
In the example the author gives, the real mistake is allowing the web
framework to define a system that you then try to customise to produce your
application. The battle is already lost because your business logic is now a
dependency of someone else’s (rotting) generic web app template. This may be
appropriate for you if this is a proof of concept or spike but for something
with a longer lifetime, any time you save will be paid back with interest when
the host framework diverges from your needs.
A JSON parser is not an equivalent class of dependency _if_ you keep it at the
periphery of your system. There’s not much point in writing a JSON parser
unless you have a particular requirement that cannot be satisfied by a third
party library. You should be able to swap it out for a different JSON parser
in a few hours.
80:20 seems like a ratio that was just pulled out of the air; I have no data
to hand either but for most user applications I would suspect the real ratio
is more like 99:1.
Edited to add: I’m still in agreement with a lot of what the author says here,
such as: evaluate your dependencies seriously, understand them, and be mindful
of their impact on things like binary size.
------
zokier
For truly long lasting software, I feel all code that make up a system should
be treated equally; i.e. not divided into "own code" and "third party
dependencies". Basically extreme form of dependency vendoring.
But it is very expensive way of doing things, and would not work well with
modern constantly updating libraries. But I do personally consider software
maintenance to be an antipattern. I'd rather have my software be correct and
eventually replaceable than constantly changing and "maintainable"
------
msla
In order to have software that lasts, you need environments that last.
The classic 1950s Big Iron software, now run swaddled in layer upon layer of
emulation on current mainframes, is software which lasts because people will
recreate its environment again and again, and ignore the people who are ill-
served by it because they don't fit into that environment neatly. Oh, your
name doesn't fit into an all-caps fixed-width EBCDIC form? Go fold, spindle,
and/or mutilate yourself, NEXT! (This happens to me. Over and over again.)
On the opposite extreme, unmaintained Internet software rots:
[https://utcc.utoronto.ca/~cks/space/blog/tech/InternetSoftwa...](https://utcc.utoronto.ca/~cks/space/blog/tech/InternetSoftwareDecay)
Or, to be more precise, software is built with assumptions underpinning it,
and those assumptions change out from under it more quickly on the Internet
than in other contexts. Software can go from being secure and completely above
reproach to being a major factor in DDoS amplification or spam runs because
the world changed around it. Software that lasts is like ships that last:
Replace the hull, the mast, the sails, the cabins... same ship, neh?
------
_def
It's very sad for me to see that software in consumer products isn't that
reliable or performant anymore, just because vendors can "push updates", which
seems to result in crappier software overall.
~~~
crazygringo
Why do you think consumer products at a similar price point used to be more
reliable or performant in the past? I don't think that's true at all. In the
90's, software was riddled with bugs, you saved files every five minutes
because your word processor regularly crashed, you had to restart your
computer multiple times a day when the OS hanged, and starting Photoshop
easily took a full minute.
Also why do you think the ability to push updates results in crappier
software? In the 90's, if your copy of CorelDraw or Windows had a bug, you
lived with that bug for years. Today, if it's a common showstopper, it gets
fixed quickly.
To my eyes, everything's gotten _far_ better.
~~~
Nextgrid
I don't think he's talking about the 90s. But the late 2000s were definitely
better than now. Computers were powerful enough to do most things we do today,
and yet both OSes and applications seemed to be much better. Compare the
quality and stability of Windows 7 (or even Vista) and their macOS
counterparts to their nowadays' versions.
~~~
crazygringo
I guess I'm not seeing that either.
Compared to the late 2000's, today's computers have SSD's and retina displays
and play 4K movies. They're _vastly_ faster, with typography you can't see
pixels in.
And OS's and applications are basically the same. My macOS is no less stable
than it was a decade ago. The main difference is that my Mac is far more
_secure_ , so I trust third-party software much more.
I'm just not seeing how the quality of OS's _or_ applications has gone down. I
think what would be more accurate to say is that both OS's and applications
have added more _features_ , and that otherwise quality has remained basically
the same.
Sure, I still have finicky Bluetooth issues today. But I had finicky Bluetooth
issues 10 years ago too. It's certainly no _worse_. But now Bluetooth gives me
AirDrop too.
~~~
Nextgrid
> today's computers have SSD's and retina displays and play 4K movies
And yet, something as basic as a Slack client now requires gigabytes of RAM,
microblogging such as Twitter loads a monstrosity of a webapp that immediately
makes the fans spin up to display 240 characters in the rare case it actually
loads without errors which require to refresh the page. Modern entry-level
laptops have the processing power of a decent machine from a decade ago, and
yet they appear just as slow to do the same computing tasks we did 10 years
ago.
My 2017 Macbook lags and stutters when loading a YouTube video page. YouTube
used to load fine and not stutter in 2009 on a laptop with a third of the RAM
and CPU that I currently have, and yet the task at hand didn't change at all,
it still just needs to display a video player and some text.
Windows 10 broke _start menu search_. Come on, this problem was solved a
decade ago.
Every large website's login flow now involves dozens of redirects through
various domains which can break in all kinds of interesting ways leaving the
user stranded on a blank page in the middle of the flow. I know the reason
behind them (oAuth, OpenID Connect, etc), but as a user I don't care; this is
a major UX downgrade and the industry should've done better.
We've replaced offline-first applications with cloud-first. Nowadays even
something that should work fine offline will shit itself in all kinds of
unexpected ways if the network connection drops or a request fails.
~~~
blackrock
It’s not the software. It’s all the ads that load up in the background, that
tracks you, even when you’re browsing incognito.
~~~
ric2b
Slack doesn't have ads.
------
zmmmmm
The kicker these days is that "standing still" is not enough. Software that
"lasts" has to constantly change, because externalities force it to -
primarily, security.
Whatever library, language or foundation you built on, you can guarantee
whatever version you used is going to stop getting security updates in a
couple of years. And then there will be a whole lot of baked in dependencies
you didn't know about - URLs (like XML schema URIs that were never meant to be
resolved but lots of libraries do), certificates, underlying system libraries
etc.
So designing software "to last" now is designing software to be constantly
updated in a reliable and systematic way. And the interesting thing about that
is from what I can observe, the only way to achieve that currently IS through
simplicity - as few dependencies as possible, those that you do have very well
understood and robust, etc etc. So it sort of comes back to the author's
argument in the end, though maybe through a different route.
~~~
fulafel
It's possible to engineer for this. The mental model: imagine you are
equipping a time travel mission decades into the future, to integrate with the
future internet, and to return home with acquired results.
A TLS client is not possible to future proof like this of course, but you
could have a roster of approaches. For example, try getting your hands on the
latest curl / wget builds (try each with a few different approaches), or the
latest JVM/Python/Powershell & use those TLS APIs.
A fun game would be to try this with tools available 20 years ago, and then
redo it to make a sw time capsule from present day to 20 years into the
future...
~~~
mjevans
A better approach is to have the "TLS" library accept either an already
connected socket (and host-name to validate) OR return something that behaves
like one after dialing and validating an address. That way when the underlying
library is re-linked to something modern it'll still have the same API / ABI
but the underlying security implementation will belong with the updated
library.
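A rough sketch of that interface shape, with illustrative types rather than any real library:

    // A minimal interface in the spirit described: the secure layer either wraps
    // an already-connected socket or dials for you, so the TLS implementation
    // behind it can be swapped or upgraded without changing callers.
    interface ByteStream {
      read(): Promise<Uint8Array>;
      write(data: Uint8Array): Promise<void>;
      close(): Promise<void>;
    }

    interface SecureChannelFactory {
      // Wrap an existing connection and validate the peer against hostname.
      wrap(conn: ByteStream, hostname: string): Promise<ByteStream>;
      // Or dial and validate in one step, returning something socket-like.
      dial(address: string, hostname: string): Promise<ByteStream>;
    }

Because callers only see ByteStream, relinking against a newer security implementation does not ripple through the rest of the code.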
~~~
fulafel
That could indeed be an idea, if you can assume the SW can be rebuilt/linked
against API compatible libs, and you rely on the API existing far into the
future. The automatic CLI tool scouring option would allow you to hedge your
bets over several implementations because of looser coupling.
------
marcinzm
As a counter point to the author:
* Half baked in-house implementations of things are often filled with bugs and broken edge cases. Especially when it comes to security and concurrency.
* In a larger code base, you won't understand everything whether it's in-house code or third party code. The in-house code was written by someone who quit 5 years ago and is now a monk in Tibet.
------
humanfromearth
Software should be built to accomplish its purpose. It's not a special
snowflake, it's a tool to do some specific job. To think otherwise is to set up
yourself and your company/team for disaster.
Kernels, compilers, language runtimes, databases? Sure, build them to last.
Web pages that will not be used in 2 weeks? Don't waste your time with "build
to last".
The problem is that some of us think that our software should be build to last
- but in reality it's just some mediocre thing that needs to get in front of
the clients as fast as possible and some bugs are "OK" to live with.
~~~
leonidasv
That would be fair if all those "pages that will not be used in 2 weeks"
actually ended up not being used after just 2 weeks. Instead, they will
probably continue to be used for 2 months or 2 years.
------
ocdtrekkie
I try to use no dependencies beyond my database connection. The one area I
feel unprepared to handle myself. Anything else I try to code a minimally-
viable functionality that meets my needs. As a bonus, I have learned what all
these things do and how they work while fiddling with implementations, and if
I want to customize their functionality in any way, it's generally pretty
easy.
I'd almost always take a pasted tutorial over a robust dependency every time,
the former is more reliable. Any dependency I do take on is something I have
to watch the changes of and project status of like a hawk.
~~~
PetahNZ
That doesn't work in most cases though. You are not going to reimplement JPEG
encoding, MPEG, etc.
~~~
ocdtrekkie
If the language/framework you are using handles it already, you're probably
golden: Your language/framework is functionally the one dependency you always
have, and features implemented in the platform tend to be fairly stable. Pick
a platform that supports the most difficult things core to your application.
I'm most comfortable in .NET or PHP, both of which have extremely large long-
lived built-in function sets that generally are non-breaking over a span of
many years. Meanwhile, I generally don't add anything to them via
NuGet/Composer/etc. They both operate almost opposite to the JavaScript world,
where everything is built by random third parties. If it's popular enough a
need, it tends to be added to the base package.
------
rado
Pure HTML with progressive enhancement has worked fine for 30 years and will
continue so for many more. But a good, lasting product is incompatible with
the current "take the money and run" business strategy and analytics for
management. If it works well for so many people, what can we do?
~~~
aboringusername
That's all well and good but go and visit some random websites and see how
many of them depend on sites like unpkg or jsdelivr or something.
Every time you try to load a resource from a domain you're introducing a
_huge_ risk by letting them be the judge, jury and executioner.
Domain gets hacked and injects malicious scripts or collects user data? Domain
fails entirely (no-renew) or in 10 years just stops responding?
A lot of web pages today have built in self-destruct features purely based on
the JS/resources they're loading.
For 0 benefit, as well, since the amount of bandwidth saved is so marginal.
~~~
rado
Yes, security is the most sensible reason we use mega frameworks, because FB
and Google will patch any issues instantly, for free and forever.
------
cable2600
Software is not always designed to last. Take video games that use servers,
the video game company can take down the servers and the video game no longer
works. You'd have to set up your own server to emulate what the video game
company's server did.
All those mobile games that require a server, once the server goes down the
game no longer works.
In the old days you set up Doom as a server on your PC networked to other PCs
for multiplayer. Those can last and be remade for new platforms.
~~~
abj
LAN-hosted games are the golden era. Online game servers are a really
interesting example of brittle software.
When the server is shut down, the game breaks and the software _is_ unusable.
This often happens because of the misaligned incentives of the business to not
release the server code. This disappoints both the original developers and the
consumers.
I'd argue that this is not only a software problem, but mostly a problem of
copyright distorting the incentives of publishers.
One could imagine that a client-hosted game, or an alternative architecture that
removes the central dependency on the company, could be the way to build
software that lasts longer.
In the online game I'm developing - I make sure that every client can also run
as a game server. This way if my servers are shut down players can play
without depending on me.
My decision to remove myself as a dependency directly goes against the
incentives I have as a publisher. I slightly narrow the possible monetization
options for the game.
Mostly I'm trying to explain why developers/publishers decide to make brittle
software, even if I don't agree personally.
------
AYBABTME
More often than not, the code I have to write needs to be written fast, and I
(mostly) won't have time to look at it again once deployed. I've mostly worked
at fast growing companies, so that's been the case everywhere.
What I've learned is that making stuff as simple and stable/low maintenance as
possible is the only way to be sane. I can't afford to deploy code that then
requires me to hold its hand with a team of 3-4 engineers for the upcoming
years. It needs to run itself and be forgettable. And it needs to be written
fast.
This has become much harder in recent years with the switch to things like
Kubernetes. Kubernetes moves so fast and needs constant nurturing, even for
casual users. Running old versions of it is painful and dangerous, so you're
forced to update. And the ecosystem is still so early in its life that all the
paradigms change and flip on their head every year. Odds are that something
you deploy in it today, will need to be looked at in 18 months. And the whole
thing will need a team dedicated to keeping it healthy.
Anyhow, that's my angle on the author's rant. Things need to last, IMO, mostly
because I don't have time to go back to them later.
~~~
vbezhenar
I'm using the completely opposite approach. I'm trying to define strict
requirements for a given task and then I code assuming those requirements. If
any small detail changes, my code will break. So I'm trying hard to break it
as soon as possible (using asserts and similar patterns) with enough
information to understand the reason of this breakage.
So my code definitely requires maintenance in a changing world. But more than
once those breaks actually identified a bug somewhere else and fixing that bug
was essential. If I would try to self-fix wrong data, those kinds of bugs
could be missed. Sometimes I need to fix a code to adapt to a changed format
or something like this. But that's not a big deal, because change is simple
and something like exception stacktrace allows to instantly find a code which
is to be fixed.
------
boznz
If you want your software to last, remember that you also have to have a
working and up-to-date development environment; i.e., your code may still work
fine, but if you cannot compile it for the correct target it is useless.
I found this out the hard way when I had to recompile some FPGA firmware for a
$100K piece of equipment which used a programmer not supported by Windows, a
chip and code not supported by the current IDE, and a binary blob which we did
not have the source code for and which would not link. Luckily we winged it by
replacing the FPGA with a hardware-compatible replacement... We now preserve
the whole development environment on a VM and save it with the source code.
------
magwas
If you write your code properly, you will have tests which break on every
library incompatibility, and a tiny wrapper around your libraries. (If you do
TDD properly, you will have it.) And that's all you need to keep your code up
to date. When upgrading a library, you will of course build and test your code
with it, and that will show you any problems, which you can fix in the library
interface layer before they have any chance to reach production.
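A small sketch of that wrapper-plus-test idea, using a trivial clock wrapper as a stand-in for a real third-party library:

    // Thin interface layer: application code depends on this, never on the library directly.
    interface Clock {
      nowIso(): string;
    }

    // Adapter over whatever date library (or built-in) is currently in use; this is
    // the only file that changes when the dependency is swapped or upgraded.
    const systemClock: Clock = {
      nowIso: () => new Date().toISOString(),
    };

    // A test that pins the contract; an incompatible upgrade breaks here, not in production.
    function testClockProducesIsoTimestamps(clock: Clock): void {
      const value = clock.nowIso();
      if (!/^\d{4}-\d{2}-\d{2}T/.test(value)) {
        throw new Error(`expected ISO-8601 timestamp, got: ${value}`);
      }
    }

    testClockProducesIsoTimestamps(systemClock);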
~~~
Silhouette
_If you write your code properly, you will have tests which break on every
library incompatibility, and a tiny wrapper around your libraries. (If you do
TDD properly, you will have it.)_
In general, I'm not sure it's possible for all of these things to be true.
Many libraries are useful because of their side effects. Depending on what
those side effects are, it may simply not be possible to integration test your
entire system in a way that would satisfy all of the above claims.
The alternatives tend to involve replacing the parts of the code that cause
real side effects with some sort of simulation. However, you're then no longer
fully testing the real code that will be doing the job in production. If
anything about your simulation is inaccurate, your tests may just give you a
false sense of confidence.
All of this assumes you can sensibly write unit tests for whatever the library
does anyway. This also is far from guaranteed. For example, what if the
purpose of the library is to perform some complicated calculation, and you do
not know in advance how to construct a reasonably representative set of inputs
to use in a test suite or what the corresponding correct outputs would be?
------
drol3
I think we can all agree that software should be designed to last :)
My experience tells me that when this fails to happen its usually not
explicitly due to too many dependencies. Rather it is because the developers
have a poor/partial understanding of the problem domain.
External libraries help this problem a bit because it allows developers to
offload tasks onto other more experienced developers. They can (and
occasionally will) misuse those solutions, but that is not the dependencies'
fault. On the contrary, skilled developers will produce quality solutions with
or without dependencies.
The signal seen (poor software has many dependencies) does not mean that
software with dependencies is poor. It means instead that developers who have
a poor domain understanding but good business skills are disproportionately
successful compared to their peers that have good domain understanding but
poor business skills.
Make of that what you will :)
------
AshamedCaptain
Isn't the inevitable conclusion "you should design your interfaces so that
they last" (and/or choose existing interfaces which look likely to last)
rather than "you should avoid dependencies"?
------
kristopolous
It's all an illusion. Software kinda isn't lasting, some things merely have
competence behind them
Look at the GNU coreutils, things you probably think shouldn't have changed in
a while:
[https://github.com/coreutils/coreutils/tree/master/src](https://github.com/coreutils/coreutils/tree/master/src)
There's actually frequent edits. Legendary disk destroyer was just updated
last month. The difference is the competence
------
TravelPiglet
Building things to last has a really high cost and is usually not worth it.
The vast majority of the buildings and infrastructure humans have constructed
have been torn down, completely rebuilt or abandoned. What remains are usually
extremely expensive monuments to religion, culture or governmental
constructions.
Most software is not worth to preserve the same way most buildings are not
worth to preserve (except for perhaps the aesthetically pleasing facade).
------
leothekim
"...bridges and civil engineering projects are designed to last, why can we
also design software for it to last?"
Some software has lasted for many decades. This isn't necessarily a good
thing. [1]
[1] [https://arstechnica.com/tech-policy/2020/04/ibm-scrambles-
to...](https://arstechnica.com/tech-policy/2020/04/ibm-scrambles-to-find-or-
train-more-cobol-programmers-to-help-states/)
~~~
ryukafalz
On the other hand, there’s lots of software that’s been around for decades
that still works quite well today. Most GNU software for example, and
particularly Emacs.
------
jakuboboza
90-day certs that need renewing via an external service API that can change
say no.
Software in hardware that is designed to do a certain job should indeed last.
Like telecom towers should work fine "forever".
But software, as in APIs and services, constantly has to evolve and be
expanded. Yeah, that is a harder sell.
We live in a crypto world where each advance in computing, or a new way to
attack a problem, can render algorithms "bad", which means we have to update and adapt.
------
thdrdt
Software that lasts is software that is a joy to use.
Imho the tech and how good you are at programming are not very relevant when
the user doesn't like to use your software.
I have written a small CMS for a company which was replaced twice but every
time they ditched it to return to my old simple version.
This is just one example but I think that should be in people's minds when
they create software that lasts: people must enjoy using it.
This can even apply to a command line program.
------
gentleman11
If somebody is a solo developer working on a huge project, it would be lovely
if the software lasted... but the big fear is that it won't get users. Can't
really spare the time to worry about 10 years from now when you are thinking
about how you have to get a regular job if you can't get X users in the next 6
months
------
geofft
Civil engineers and spacecraft designers build their systems to last because
they don't have a choice. You can't deploy fixes to a production bridge ten
times a day.
There might be good reasons to build software to last, but "These other
engineering fields build their things to last" isn't one of them. I have no
doubt that if you _could_ reliably send fixes to bridges and spaceships,
people would do it - and that many lives would have been saved.
My personal opinion is you should design the larger ecosystem around your
software to last and be robust to the replacement of any individual part.
Building on Go's standard library HTTP server would have seemed needlessly
cutting-edge about five years ago. Building on Apache and mod_cgi would be
considered dated today (even though it would work). The stack you should be
using _will_ change. Build your system so that, when change comes, you're not
afraid to write new software.
In practice, this means making good designs to avoid hairballs of complexity,
having deployment pipelines that let you roll out canary versions (or better
yet, send a copy of production traffic against a sandbox environment), keeping
track of who's using APIs internally so you can get them to change, having
pre-push integration testing per the Not Rocket Science Rule so you can be
confident about changes, etc. You should think about your choice of web
framework or JSON library up front, sure, but more importantly, if a
sufficiently better web framework or JSON library comes along tomorrow, you
should be able to switch. As a principle - any time you see something that you
think you'd be afraid to change in the future, don't ship it if you haven't
shipped it yet and figure out how to build the right abstraction around it if
you can.
------
Ericson2314
Reusable abstractions is the foundation of what makes programming productive.
This person does give respect for libp2p, which is good, but clearly they've
never seen a small abstraction that is still immensely valuable.
------
mehh
The software (as in the actual code for an implementation), should be
ephemeral in my opinion.
The data model, protocols and knowledge base (docs, requirements etc) should
be designed to last and be extensible.
------
jonwalch
Constructive criticism if the author is in here, I find multiple bold
sentences per paragraph to be taxing on the eyes.
------
dqpb
> _If all three are in agreement, they carry out the operation. If even a
> single command is in disagreement, the controller carries out the command
> from the processor which had previously been sending the correct commands._
By this definition, wouldn't all three processes be the last ones to send the
correct commands?
~~~
MintelIE
Actually the way it usually works is a voting system, where if one component
disagrees, and two agree, the majority vote is upheld.
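A toy sketch of that 2-of-3 voting, simplified from what real flight controllers do; the tie-break follows the quoted description of trusting the previously correct processor:

    // Majority-vote three redundant command values; fall back to the unit that
    // was last known-good when no two agree.
    function vote(a: number, b: number, c: number, lastGoodIndex: 0 | 1 | 2): number {
      if (a === b || a === c) return a;
      if (b === c) return b;
      // Total disagreement: trust the previously correct processor.
      return [a, b, c][lastGoodIndex];
    }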
------
alliao
for some reason when I saw the headline I went over to check up on grc.com;
that guy's obsessiveness about coding everything in assembly gives me comfort
------
andrewstuart
>> Software should be designed to last
Not necessarily true.
If the requirements of the project are to be secure or fast or easy to
maintain or built quickly or have beautiful user experience or be maintainable
or be designed to be robust & failure resistant, then that is what it should
be.
And if the requirements of the project are "the software must be designed to
last", then the software should be designed to be designed to last.
But if it's not a requirement, then no, it does not matter if it lasts.
The point being that blanket statements about what software should be are
missing important context about the purpose of the software and the
constraints under which it was developed and the specified requirements.
~~~
mcavoybn
Couldn't you have just said "Not all software must be designed to last"?
Also, the author said software _should_ be designed to last not that it
_must_.
Lastly, you didn't address the core argument of the article which is that you
should carefully select and manage dependencies. Reading your comment I wonder
if you even read the article..
~~~
andrewstuart
True. somewhat long winded, edited thanks.
------
MintelIE
This is why I like long-established technologies. You can build most anything
with typical POSIX utilities and time-worn libraries. And they will be there
for you in 25 years. Who knows where Rust and Go and Node will be in a quarter
century?
~~~
_pdp_
You can apply the Lindy power law :)
~~~
MintelIE
Looked it up finally, I agree. It's why we have steering wheels instead of
joysticks (except for the people who have a leg-related disability, they've
been driving with joysticks for like 100 years).
------
LoSboccacc
no, software should be designed to work for as long as necessary, because
we're responsible professionals and don't want to squander other people's money
on infinite perfection
the whole bit is nonsensical, including the parallels with civil engineering;
bridges are built with tolerances, maintenance schedule, and a disposal plan
for when maintenance cost become higher than replacing the bridge, because
nothing in this world actually lasts.
heck our internet is built around operating systems that didn't exist 25 years
ago. imagine investing capital and sinking opportunity costs in delivering the
perfect vax typing program...
mind, this is not to say that all software should be quick and dirty. but the
conscious engineer knows whether it's building infrastructure to last decades
or a temporary bypass to support some time limited traffic spike.
the important bit is to ask the stakeholder what's more appropriate.
I also want to see people build their own json parsers at an acceptable level of
performance, possibly as streaming parsers instead of just gobbling everything
in memory. imagine wasting a month on building that and then having to defend
your choice every time a bug or a malformed input causes silent data
corruption somewhere in the backend.
[http://seriot.ch/parsing_json.php](http://seriot.ch/parsing_json.php)
| {
"pile_set_name": "HackerNews"
} |
How FeeFighters saved startups $50 million a year in credit card fees - g0atbutt
http://thestartupfoundry.com/2011/03/29/how-feefighters-saved-startups-50-million-a-year-in-credit-card-fees/
======
leftnode
FeeFighters definitely has one of the most professional and usable interfaces
I've ever used. It's gorgeous, not overdone, and functional.
Edit: Wow, even more the guy behind TSS Radio started it. Huge supporter of
Howard Stern and Bubba the Love Sponge, very happy to support Fee Fighters
even more now.
------
namunkin
Wow - I've become a huge fan of feefighters over the past couple of months...
This is awesome - great job guys! I had no idea that they already had high
profile customers like stackoverflow and Okcupid
------
blackhabit
FeeFighters was amazing when I was setting up my merchant services! They even
wrote about my company <http://bit.ly/h9sESp>
------
mcdowall
I don't want to be a hater, but I can't say I've had the best service. I've
sent 3 emails around my solution needs (one directly to Sheel from a comment
he placed here on HN) but no response; from the looks of the other comments
maybe mine's a one-off.
~~~
pitdesi
Oh no! I couldn't find the email in question- it's entirely possible that I
deleted it off my phone accidentally. Please re-send and I'll get back to you
tonight
~~~
mcdowall
I will re send, thanks for your help.
------
bquinn
Any word on when it will be available outside the US, particularly in the UK?
~~~
pitdesi
Unfortunately, we don't have any plans to launch in the UK in the short term.
It's too hard to build the competitive marketplace we need to do in each
country. We're launching Canada later this week.
| {
"pile_set_name": "HackerNews"
} |
Plastc may file bankruptcy and will cease operations April 20, 2017 - sgarg26
We Regret to Inform You...

For the past 3 years, our mission here at Plastc was to build and deliver the most technically ambitious smart card on the planet. After making enormous leaps in development, product innovation and progress towards our goal, Plastc has exhausted all of its options to raise the money it needs to continue.

Plastc, Inc. is exploring options to file Chapter 7 Bankruptcy and will cease operations on April 20, 2017.

While we have fallen short of our goal, we are proud of our team and the effort that went into developing a working Plastc Card. However, without the necessary capital to continue, all employees have been let go, which means that Customer Care and Social Media channels are unmanned or have been shut down.

How We Got Here:

We were expecting to close a $3.5 million Series A funding round on February 28, 2017. There are functioning Plastc Cards, which were demonstrated to our investors and our backers, and the capital was to be allocated for the mass production and shipping of Plastc Cards to pre-order customers. At first, the principal investment group postponed their investment and a couple of weeks later the round fell apart.

After the initial funding was unavailable, Plastc made progress with another investor who offered $6.75 million. This deal was scheduled to close last week and would propel development across the finish line, as well as allow for Plastc Card pre-orders to be shipped and for production to continue into a retail phase.

However, once again at the very last minute, our investor gave us notice that they have decided to rescind their investment offer. The round was a signature away from closing and we were extremely caught off guard when they notified us yesterday they were backing out. Our existing investors kept us alive and functioning as long as they could during this fundraising process, but in the end, we needed new outside capital to get into production.

[..abridged..] see https://plastc.com/ for full copy
======
danellis
Interestingly, from a competitor's web site:
"With the acquisition of Coin by Fitbit, all business operations ceased on
June 13, 2016. The company is no longer manufacturing, promoting, or selling
any new devices or products."
I think that just leaves Swyp (which isn't launched yet) and Stratos.
------
danellis
Well that's $155 down the drain.
| {
"pile_set_name": "HackerNews"
} |
Raspberry Pi Zero grows a camera connector - Artemis2
https://www.raspberrypi.org/blog/zero-grows-camera-connector/
======
geerlingguy
With a camera connector, the Pi Zero is finally a full replacement for the A+
in all the projects that I've done/have in mind. Since it has full GPIO
without a header, it's actually easier for me to wire things up in a more
compact way for specific projects.
And since it's $5, I worry a lot less about sticking these things in areas
where moisture, temperature, and potential-for-getting-snagged-by-a-kid are
slight concerns.
For the immediate future, I hope to get one and use the adapter cable to stick
in a small box with a camera for a plug-and-play time-lapse camera. Basically,
add a knob that sets the interval on the side, then plug in a battery or micro
USB power, and it will start dropping pictures on the microSD card until
it's powered off. Great for construction, dusty, or outdoor environments!
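A minimal sketch of that interval-capture idea might look like the following Python loop; it assumes the stock raspistill tool is installed and a writable output directory, and it leaves out the knob-reading (the interval is hard-coded here). raspistill also has a built-in time-lapse mode, so for a fixed interval you may not even need the loop.

    import subprocess
    import time
    from datetime import datetime

    INTERVAL_SECONDS = 60                # a real build would read this from the knob instead
    OUTPUT_DIR = "/home/pi/timelapse"    # assumed to live on the microSD card

    def capture_one():
        """Grab a single still with the stock raspistill tool, named by timestamp."""
        name = datetime.now().strftime("%Y%m%d-%H%M%S.jpg")
        subprocess.run(["raspistill", "-o", f"{OUTPUT_DIR}/{name}"], check=True)

    if __name__ == "__main__":
        while True:
            capture_one()
            time.sleep(INTERVAL_SECONDS)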
~~~
whyenot
I'd be interested to see how well it works. I did something similar (a camera
trap) with the model B and the problem I ran into is that the unit would often
overheat and shut down if left in an enclosure in the sun. I didn't want to
use active cooling, as the noise might scare away animals. I think the model A
or a Zero would work better.
~~~
knob
What were you running to grab the images? Were you processing with motion or
something of the sort? Anything else running on the model B?
I ask because I have about one dozen Raspberry Pis (Model B, B+, and I think
one of the latest ones?) working as webcams. Yet I snap one photo per minute...
and one 60 second video every 10-15 minutes. They rsync off-site.
And they're all running in a security camera housing, under the
sun/rain/shine, in front of beaches, inside a ziplock bag.
------
iask
Cunning and small, yet elusive like Bigfoot. I can't seem to find one at a
reasonable cost.
~~~
casylum
On the blog they stated that production of the Pi Zeros had to be delayed to
build up the Raspberry Pi 3's. Now that the initial surge of Raspberry Pi 3's
is done, they stated they will build "thousands a day" until demand is met.
~~~
noonespecial
I'm not sure "thousands a day" will ever even clear the backlog of people
waiting to get their first much less keep up with the latent demand of people
who would buy a handful once they knew they could reliably get them for use in
projects.
You can claim 1000's (as in 2) and still not even make 3/4 million per year.
~~~
m_mueller
Is the backlog really in the millions? That would be impressive.
~~~
yetihehe
Yeah, when they shipped first raspberrys the demand was in millions. Now
because Zero is so small and cheap, everyone who needed pi for tinkering will
order several zeros, including me. The only thing which prevented me from
getting one was lack of camera connector.
------
alexellisuk
Very happy to see the camera connector.
Anyone interested in the exact stock count figures?
[http://stockalert.alexellis.io/](http://stockalert.alexellis.io/) along with
full source-code and blog write-up on how it works.
------
donatj
We have a very small run project (tens) we plan to integrate the zero into,
but have yet to be able to get our hands on one to even dev with, and are at
the point where we are reconsidering the decision.
While it's nice to see improvements to the board, I'd love improvement in the
supply chain.
------
superuser2
It's still baffling to me that we give Raspberry Pi all this credit for making
a $5 computer when there is nowhere you can actually exchange $5 for this $5
computer and it has remained that way since launch.
Being reliably in stock at the alleged price isn't a nice-to-have side
benefit, it's the _only thing there is_ when you claim to have vastly reduced
the price of something.
Unless someone can point to a US distributor who actually has one of these
things in stock for $5, color me unimpressed.
------
pcr0
The stock situation is disappointing. They could've launched it a few dollars
higher, and they'd still get the same demand with easier ramp-up.
------
intrasight
Wow. So the camera is now 5x more expensive than the computer. Perhaps some
day we'll get a less expensive camera.
------
1024core
Can you do some image-processing on the images captured from the camera, in
realtime? For example: motion detection, or image enhancement or some
filtering? I'm asking if the CPU is powerful enough to do a pass over the
image in 30ms... or if not in 30ms, how long would edge-detection take on a
Zero, on a full HD image?
~~~
Sanddancer
Yes, you can. There are even tutorials for doing things like motion detection:
[https://www.raspberrypi.org/blog/turn-your-pi-into-a-low-
cos...](https://www.raspberrypi.org/blog/turn-your-pi-into-a-low-cost-hd-
surveillance-cam/)
However, for the really fancy stuff, you probably wanna look into programming
for the VPU, which has some pretty decent muscle behind it.
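As a rough way to answer the timing question empirically rather than guess, here is a small OpenCV benchmark sketch; the numbers will depend entirely on the board, and the frame here is synthetic so it runs even without the camera module attached:

    import time
    import cv2
    import numpy as np

    # A synthetic full-HD grayscale frame stands in for a camera capture.
    frame = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)

    runs = 20
    start = time.perf_counter()
    for _ in range(runs):
        edges = cv2.Canny(frame, 100, 200)
    elapsed = (time.perf_counter() - start) / runs

    print("Canny on 1920x1080: {:.1f} ms per frame".format(elapsed * 1000))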
------
stonetomb
They're trying to keep mfg in the UK and the unit price-volumes are too low
(they're likely playing super conservative with their orders).
There are other elephants in the room as well...
------
jwr
Is that the third type of camera connector that the RPi designers have used
already? Or is it the same one used on the compute module development boards?
~~~
khedoros
I thought that they all ("all", excluding prototype hardware) used the same
physical connector (a 15-pin ZIF connector). As far as data lanes and such,
I'd assume that the Pi-Zero will take its cues from the non-compute-module
Pi's and run 2 lanes of data on interface 1.
edit: Ah, OK. So it's a fine-pitch side-mount connector. I didn't realize that
it actually uses different hardware.
------
Florin_Andrei
So that means new Pi Zero batches will ship with the connector on the main
board? (as opposed to connector-less earlier batches?)
Like a new version of the same hardware?
~~~
snoonan
This update adds a camera connector. The GPIO header is still not populated on
the board. This is actually great for those who plan on soldering different
kinds of connectors (right angle, female, bottom-mounted, etc)
------
ndesaulniers
sold out everywhere
~~~
alexellisuk
Check the site I posted. Can you order from the UK?
~~~
throwaway049
Your site is reporting a lot of units available but clicking thru to the
vendors shows only bundle offers, not the device alone. It would be great if
your site could distinguish between those.
~~~
alexellisuk
All the single units go within the first day or few hours of the stock being
available. The packaged sets pimoroni have are good value for money if you can
use the additional pHATs. I've used the scroll/explorer/dac pHAT.
Someone raised a PR on my todo list to break down by category... it would be
more fragile - maybe done better through the Shopify API, negotiate an access
key with the store etc etc.
| {
"pile_set_name": "HackerNews"
} |
Valeant’s Drug Price Strategy Enriches It, but Infuriates Patients and Lawmakers - pbhowmic
http://www.nytimes.com/2015/10/05/business/valeants-drug-price-strategy-enriches-it-but-infuriates-patients-and-lawmakers.html?hp&action=click&pgtype=Homepage&module=first-column-region®ion=top-news&WT.nav=top-news
======
jbapple
FTA: "It says patients are largely shielded from price increases by insurance
and financial assistance programs the company offers, so that virtually no one
is denied a drug they need. But Mr. Pearson, a former McKinsey & Company
consultant, has said he has a duty to shareholders to wring the maximum profit
out of each drug."
This is the classic coupon strategy to maximize revenue - if you set the price
very high but offer discounts to those who can't afford it, you can get more
revenue than if you pick a single price that would be less than what some
buyers were willing to pay and more than others were willing to pay.
In the first example of the article, an older woman (Susan Mannes) had to take
on a second job to afford a drug that Valeant increased the price on in order
to keep her husband from dying. I don't know what the upper limit is on how
much Susan and Bruce Mannes will pay for this drug, but I suspect it is
limited only by their current assets. In other words, Valeant could feasibly
take everything they own.
I don't see a way out of this without regulation, and I don't see that
happening when the pharmaceutical industry has been so effective at
influencing federal laws.
~~~
ef4
> I don't see a way out of this without regulation, and I don't see that
> happening when the pharmaceutical industry has been so effective at
> influencing federal laws.
For many of these generic drugs, this is a case where markets have literally
already solved the problem. They're available from European and Indian
manufacturers at extremely low cost.
All we need to do is lower the trade barriers.
Remember, these are drugs that are already FDA approved. But the manufacturers
can't sell in the US because the ANDA approval process is too onerous.
~~~
jbapple
> For many of these generic drugs, this is a case where markets have literally
> already solved the problem. They're available from European and Indian
> manufacturers at extremely low cost.
This fixes part of the problem, perhaps. But the article explains that it can
take other manufacturers years to ramp up production if they aren't already
making the drug. This also doesn't fix the problem for non-generics.
------
cryoshon
Relevant are the soon-to-be-debated-in-public trade agreements regarding
"intellectual property" (a regressive and fiercely harmful concept spawned by
greed, but that's a rant for another day) as it relates to the pharmas. The
pharma companies lobbied very, very heavily for certain clauses in that trade
agreement. The TTP is going to send their profits through the roof, likely by
forcing formerly exempt countries on board with their pricing scheme.
The money that the pharmas have allows them to corrupt the political process,
enabling them to abuse patients/insurers with insane prices... this isn't new.
The incentives are completely fucked at every level of organization: drug
development, clinical trials, patient care.
As far as the claim that patients are largely shielded from price increases, I
think that claim was made in bad faith, and is laughably disconnected from
reality. Medical-debt induced bankruptcy is very alive and well in the US, and
is not in other places.
~~~
curiouscats
Very true. The medical-debt induced bankruptcy is the most visible sign but
don't ignore the absolutely huge sums diverted from paying employees to paying
for health care
The USA pays more than twice what the rest of the rich world does for health
care, for no better outcomes. This isn't only due to drug pricing, but drug
pricing obviously costs USA consumers tens of billions a year in increased
health care costs. Just because it means they pay much more in insurance costs
(or, even less noticeably, have part of their paycheck diverted to pay for
inflated drug prices' impact on health care costs) doesn't mean they avoid the
high costs of this corruption-based pricing.
It is very unlikely this system is going to be changed. It has been going on
for decades at huge costs to the USA. Maybe (just maybe) it will be slowed
some if the public forces politicians to reduce the amount of the paybacks
given by the Democrats and Republicans to drug companies at the expense of
everyone else. Meaningful change is very unlikely. The parties are far too
corrupt at this point to make any significant change in policy likely.
------
steve19
I can't help but think these drug companies are playing a very dangerous game.
The last thing they want is regulation, but of they keep this up that's what
they will get...
~~~
stingraycharles
What I'm wondering about, however, as a European who never actually sees his
own medical bills, how this will affect me or the country I live in. Will the
powers that are in charge of buying medicine just suck it up and pay the bill?
Will they use the power of the collective to get a better deal? How does this
work?
~~~
visarga
I think countries could first try to legalize cheap generics of that drug, as
it already happens in India. But then an economic war would ensue. Each
country would counter sanctions with other sanctions. Then negotiations would
come, where a middle point would be sought between the trade blocks, similar
to WTO, or patent portfolios and non-aggression pacts.
~~~
curiouscats
Yes and the factors are the costs of accepting the government granted monopoly
pricing (free markets don't have government saying you can't sell this thing
because some other people have gotten the government to say only they can sell
it) and the costs of saying we don't accept the current patent system.
The costs of fighting the existing copyright/patent cartel led by the USA are
very high. Countries want to avoid this, and if the cost is diverting a few
percent of their GDP to the USA's drug companies they likely will do so. If,
however, that diversion becomes too great, they will have a huge incentive to
decide the cost is too high and that they have to fight the USA government's
policies put in place at the behest of its drug companies.
The costs will be enormous to the countries that fight, especially at first.
The USA is enormously rich and can accept losses internally and use economic
power to punish those defying its dictates.
If several rich countries decide to align with India (which seems to be by far
the leading advocate for fighting the copyright/patent cartel) costs to the
USA will also get steep. At some point steep enough that others force the
government to stop using the rest of the USA's economic clout to benefit the
drug companies.
But I think it is very unlikely the other countries could sustain the losses
long enough for the USA to cave. It is a big if, but if China grows well
economically for the next 20 years, it is much more likely India and China
could work together on this to bring together a large coalition.
I don't think China and India would work together as strong allies but they
could come together in this area if the USA continues to demand large
percentage of countries GDP be diverted to the USA drug companies.
All these types of things get complicated. It isn't as if the Drug companies
don't understand this same path. They will obviously seek to use corruption in
other countries as successfully as they have in the USA to get those
governments to act in their interest for cash payments that are peanuts
compared to the costs those governments will place on their citizens to
benefit the drug companies.
The payments to corrupt political parties don't only work in the USA; they
work most everywhere. So don't count on the insane movement of wealth into
drug companies' coffers changing anytime soon.
If you want to be practical buying stock in these companies will likely reward
you well. That is what I have done. I can understand if you think it is
repugnant to profit from corruption in the political system on such a massive
scale (very little corruption ever impacts multiple percentage points of GDP,
but this does). That is likely a higher ethical decision; weighing my
options, I accept that "hit" to my karma. If I were given the authority to
decide the policy I would make a decision that would harm my stocks but I am
not given that option. And based on my estimates of likely future outcomes
betting on the USA political parties staying very corrupt in relation to
providing benefits to drug companies seems a very likely bet.
------
pweissbrod
Personally I see this as a result of the incentives for the drug makers to be
completely disconnected from the people that depend on their product.
Imagine other markets were similar: one company purchases the rights to
manufacturing all laptops and now you must pay $10K for the same thing you
used to buy for normal prices last week.
Its interesting to consider how many people's lives would be greatly improved
(or even saved) if vendor exclusivity rights were reduced in this market.
~~~
pweissbrod
"For now, with Congress in the hands of Republicans and election season in
full swing, quick government action on drug prices is considered unlikely."
(Quote from the article) Not that I associate with any political party, but it
confounds me why a political party would like to have a reputation for
upholding such extreme, corrupt corporate opportunism.
~~~
protomyth
Both major parties have their favorite cronies. Although you could shorten
that statement to "election season" and expect the same outcome.
------
prodmerc
Huh, I just realized that the Silk Road or a similar online black market could
literally save lives in this case.
People could sell generics and alternatives that are not yet approved by the
FDA or whatever local authority at a much lower price, everyone wins (except
the government and Big Pharma).
Though there would be a lot of fraud I presume, sellers with fake pills and
the like...
~~~
mdpopescu
If fraud wasn't a problem for hard drugs, I doubt it will be significant for
these.
~~~
prodmerc
There were cases of fraud, of course, but the review system apparently worked
ok (never used it myself).
However, dealing drugs that literally save one's life is kind of a big deal -
one month's supply of a fake or impure medication could be deadly, unlike with
shitty MDMA and meth...
------
dangerlibrary
Just give Medicare and Medicaid the ability to negotiate and publish what they
pay for drugs. Right now they are forced to accept a (self reported) average
market price for every drug they buy.
They buy over half the prescription drugs in this country. They can be a bully
and negotiate great prices for themselves. And once the prices are public, see
if any insurance companies are willing to accept a 5000% premium over the
Medicaid price.
~~~
refurb
Medicaid and Medicare are very different beasts. Medicaid takes 23.1% off the
top of most drugs and typically pays the _least_ out of any insurer (including
the private ones). There is even a "best price" provision that says Medicaid
either gets a hefty discount or best price, whichever is lower. I wouldn't
worry about Medicaid.
Medicare (for physician administered drugs) pays the average of all the
private payers, which isn't a bad way to do it. For prescription drugs, it
pays list price, except for the Part-D private plans which already negotiate.
To be honest, allowing Medicare to negotiate is a little bit like any company
"negotiating" with the gov't. It's a take it or leave it proposition and is a
more like "price by fiat" than a negotiation.
Having the federal gov't negotiate drug prices is not necessarily a panacea.
The largest US insurance companies cover 30-40M lives, that larger than the
population of Canada. Yet, they pay more for drugs than Canada does. You know
why? Because the Canadian gov't makes them (back to the "price by fiat").
~~~
dangerlibrary
You're right about Medicaid paying the least and having very favorable pricing
in general. Also repayment rates vary wildly by state, etc. But the baseline
rates are still based off the reported average price. Drug manufacturers know
that the government will always pay some fraction of sticker price, regardless
of the volume the average price is based on. CMS can't negotiate in the face
of these dramatic price hikes.
And yeah, it's price by fiat, through the market. Welcome to government
contracts - plenty of people have become very rich in this space, and I have
no sympathy to those that complain about their slim profit margins on
incredibly high volume transactions. It's a big job. Get rich doing it, or get
out of the way for someone who will, because it's got to get done.
------
smoorman1024
This strategy only works because it's so expensive and time consuming to bring
a competitor drug to market. If it was a lot easier to get a competitor drug
approved and to market then this strategy wouldn't work. Alex Tabarrok has
some enlightening opinions on the FDA:
[http://marginalrevolution.com/?s=FDA](http://marginalrevolution.com/?s=FDA)
~~~
jobu
Most of these went off patent years ago though. Why is it so hard to bring a
generic drug to market? The drugs already have studies proving their efficacy,
so the only issue should be a clean manufacturing process.
~~~
wiredfool
In some cases, the generics have to prove that they have the same
effectiveness/bioavailability as the brand name. The brand names have been
restricting the distribution of the drugs to prevent this from happening
easily.
------
rdancer
The cost of living in India being much lower than the US, what prevents people
from engaging in medical tourism for prescription drugs combined with a nice
trip abroad?
I think ultimately this is what will keep caps on the price that can be
charged -- if your treatment plan costs you $1,800/month, and a return airfare
from the U.S. East Coast to Delhi is $600 return on a budget, I'm surprised
people are not booking trips _right now_.
------
jbandela1
Looking into it, according to
[http://www.drugpatentwatch.com/ultimate/tradename/CUPRIMINE](http://www.drugpatentwatch.com/ultimate/tradename/CUPRIMINE),
Cuprimine is not covered by patents, so all the talk about intellectual
property and evil patents is moot point.
We have 2 main competing interests. 1\. Creation of new treatments 2\.
Affordability of treatments
At the same time, we should have a minimum burden on the free market in terms
of regulation.
I believe we can do this with the following.
1\. For all non-patent drugs, Medicare signs a 5 year contract to supply the
drug at a given cost. This will allow off-patent drugs to be available for low
expense.
2\. For drugs covered by patent, Medicare should negotiate a price based on
differential Quality adjusted life year, with the price per quality adjusted
life year set by law. For example, if we set the price for quality adjusted
life year to $100,000, then Medicare should pay up to $100,000 for a treatment
that gives a person 1 quality adjusted life year.
Drug companies are allowed to sell to others and negotiate a private sales
without other regulation on what price they charge.
I want the pharmaceutical industry to be one with outlandish rewards because I
want very smart people to be thinking outside the box and taking risks with
what they try in pursuit of cures for disease. Even life-saving
medication should have outlandish rewards. If we continue to see outrage at
pharma about the price of life-saving treatments (see Sovaldi), we will see
companies shift resources away from such treatments to more elective drugs
such as sexual enhancements, requiring less sleep, etc where they can still
have great rewards without the moralizing by critics.
------
elipsey
I think one of the big dreams of the "biotech revolution" that has yet to
materialize is to reduce barriers to entry in drug manufacturing so that drug
companies have less market power, at least for drugs/jurisdictions that are
not encumbered by IP. Fingers crossed...
Isn't it weird that we can custom order synthetic cathinones from China for
$100, but apparently no one has tried this with therapeutic drugs?
| {
"pile_set_name": "HackerNews"
} |
A report about a vulnerability in Telegram - sadghaf
Recently, we found a concerning security bug in the widely-used instant messaging tool: Telegram. In order to explain the criticality of the issue we decided to publish it on our blog.

We are security researchers from Iran and as you may know, Telegram holds the position of the most popular instant messaging tool in the middle east (consequently in our country).

The blog link: http://www.sadghaf.com/
======
dsacco
Please don't yell fire like this. You haven't found any critical severity
vulnerability in Telegram. As written, your report doesn't disclose a
vulnerability at all. What you found is a logic error with no evidence of a
security implication. You're attempting to promote yourself using invalid
findings that, as presented, would be quickly rejected by the Telegram team as
not applicable.
Your video details two findings: 1. the ability to empty a contact's internet
balance by sending very long messages to them and 2. the ability to cause a
contact's client to potentially crash due to an unexpected number of bytes in
a single message.
The first finding is neither a Telegram nor a security issue. Does Twilio have
a critical security vulnerability because I can use it to quickly exhaust a
user's SMS quota for the month? This is an established precedent, and you have
not identified a technical security flaw.
The second finding is a legitimate bug, as there is behavior that is
implemented differently than the documented design goal of the API. However,
you did not provide evidence that this finding can be used to cause a
persistent denial of service. Does the application crash every time a user
opens it, or is this good for one use? This is not a vulnerability unless you
can demonstrate an overflow allowing local memory reads/writes or a persistent
denial of service condition.
This is attention seeking behavior. Publicizing a "critical" vulnerability in
a high profile application based on the flimsy excuse that you couldn't find
an explicit disclosure email address reduces the credibility of responsible
disclosure and legitimate security research. If you had actually bothered to
search instead of rushing to bring your "findings" to notoriety you would have
found [email protected], which is explicitly for security reports. But
that wouldn't have allowed you to make a blog post and submit it to HN, would
it?
The next time you think you've found a vulnerability, don't publicize it and
try to disguise it as a noble gesture by saying you're not going to "pinpoint"
the vulnerability when you clearly walk through the exploitation in your
video. Report your findings directly to the vendor, and if the vulnerability
is valid and a fix is pushed due to your participation, then you can brag on
HN about it.
------
jupenur
I'm not sure you know what "critical" means—this bug definitely isn't it—but
you should probably report it to Telegram instead of posting on HN. You claim
you couldn't find a way to do that, but they have a security@ address [1], so
you can't have looked very carefully.
[1] [https://telegram.org/faq#q-what-if-my-hacker-friend-says-
the...](https://telegram.org/faq#q-what-if-my-hacker-friend-says-they-could-
decipher-telegram-mes)
------
jupenur
An interesting little detail about the authors:
[https://www.fbi.gov/wanted/cyber/iranian-ddos-
attacks](https://www.fbi.gov/wanted/cyber/iranian-ddos-attacks)
------
edward_johnson
The Telegram team always says they are so secure; it is interesting that they
have such a critical bug.
~~~
alexisJS
They also have a contest for proving secure, have they won the contest?
~~~
edward_johnson
If by contest you mean the link given below, it is just for cracking Telegram
encryption, so I don't think they have won it.
[https://telegram.org/blog/cryptocontest](https://telegram.org/blog/cryptocontest)
| {
"pile_set_name": "HackerNews"
} |
Intro to funding: Friends and Family - mattculbreth
http://www.burningdoor.com/askthewizard/2007/03/introduction_to_funding_friend.html
======
jwecker
I know a man near San Diego who started a business and borrowed from friends
and family. The business didn't take off like he wanted and he got very
depressed- feeling like he had let down all of those friends and especially
family. He took his life shortly afterwards which was obviously the last thing
any of us would've ever wanted.
Make sure that those who invest understand what kind of a risk it is. If you
ever catch yourself saying to someone close that you don't see how it could
fail, or offering a huge rate of return and yet saying that it will almost
definitely be successful- then you yourself do not understand the risk and
should probably take a reality pill. Oh, also, if you ever find
yourself accepting an investment from someone (usually a family member) who
will absolutely need that money back in 6 months or else their standard of
living will be adversely affected- you're doing something wrong- you are not
managing risk as you should.
~~~
mattculbreth
Good post. Yeah I wouldn't do this unless two things were both true: 1) The
person could afford to do without the money for quite some time, 2) I was
pretty sure it would succeed.
It's awfully hard to gauge #2 when it's your idea and you've got yourself
emotionally wrapped up in it.
I'm beginning to think there's an entire class of startup funding that needs
to be written about a bit more--spouse support. If you're married and your
spouse can earn enough money to pay for the household for a discrete amount of
time then that's perhaps the greatest bootstrapping source imaginable. Seems
like a few of us here (from other posts I've seen) are considering this.
------
r0b
For the serial entrepreneurs among us: it's important to keep in mind that you
only get one shot with friends and family money. If you don't succeed, and
decide to move on to another idea, your friends and family will be very
reluctant to invest in yet another venture. This can be problematic if you're
the sort of person who likes to jump from idea to idea.
I financed a company a few years ago with a small amount of money from family,
and ultimately it didn't take off. I'm now very hesitant to approach them
again. I wish I had been a little more cautious before deciding to take that
route...
------
ced
Maybe it's better to take a loan from them with clearly defined terms. That
way, if all fails, you can get a job and pay it back. No emotional damage.
| {
"pile_set_name": "HackerNews"
} |
Logs all the opened websites in the Chrome. Even incognito - unknownymouse
https://github.com/SkrewEverything/Web-Tracker
======
binarynate
I'm not sure if the HN community has a code of ethics, but it seems like we
shouldn't be promoting spyware.
~~~
unknownymouse
If you know spyware... you can tackle spyware
| {
"pile_set_name": "HackerNews"
} |
Arduino Announces New Linux Boards in Collaborations with Intel and TI - emilepetrone
http://makezine.com/2013/10/03/arduino-announces-two-new-linux-boards/
======
jws
Intel Galileo: "Pentium class" 32 bit Quark SoC X1000 with an Arduino header.
November 29th.
TI Arduino TRE: 1-GHz Sitara AM335x processor plus an AVR processor for the
Arduino side. Includes the Arduino sockets and it looks like an X-Bee socket.
This looks like a Beaglebone combined with an Arduino. Ships sometime in
Spring of 2014.
Of the two I find the TRE more interesting. Precise control of pins is
difficult in Linux where you could be preempted and holding the CPU and
counting cycles is frowned upon.
BTW: If anyone invents a time machine, please go back and get the original
Arduino header alignment fixed.
~~~
joezydeco
If you need microsecond or nanosecond-level control of I/O, you're probably a
little too advanced for an Arduino-type environment.
If you do, the AM335x offers a subsystem called the PRU that lets you program
I/O that runs independently from the core. Kind of like, hmm, a bunch of
Little Arduinos inside the Big Arduino.
[http://www.ti.com/lit/wp/spry136a/spry136a.pdf](http://www.ti.com/lit/wp/spry136a/spry136a.pdf)
~~~
jack12
The thing is, the only tool provided for using the PRU is an assembler[1]. So
trying to make an AM335x into an "Arduino" required at a minimum: a friendlier
real-time IO system / PRU toolchain, the creation of an Arduino sketch and
library API compatibility layer, and circuitry to deal with 5V IO.
Solving all that by just bolting an ATMega beside the AM335x isn't the least
bit elegant, but Arduino has never seemed to be too bothered about elegance.
_1:
[https://github.com/beagleboard/am335x_pru_package](https://github.com/beagleboard/am335x_pru_package)
~~~
joezydeco
But the reality is that at gigahertz speed you could run the TRE's I/O from
userspace and have pretty accurate timing. Just latch your LEDs. =)
------
digikata
Working on the recent expansion of embedded computing power has been like
watching PCs develop all over again. Going from predominantly bare-metal
software, to a mix of light RTOSs, and then full blown OSs. CPU Bit growth
form 6 to 64. Peripheral speed and selection has grown too. That spectrum has
always existed, but now that the industry focus isn't as strong on desktop
CPUs, there's a vast current of progress being applied to the embedded side.
The capabilities have grown to the point that I'm not even sure if embedded is
the right long term name for this type of computing anymore.
------
ChuckMcM
Fascinating announcement. I love the boards, they have an interesting mix of
hackability and programability. And they are clearly in response to the
Raspberry Pi success (although much of that is the price point, RasPi type
systems have been available for $200 - $300 each for a long time and don't get
the traction of a $35 system :-)
I found the Arduino Yun [1] to be perhaps the most interesting board
(nominally $52) which is a very inexpensive Linux setup with an Arduino
adjacent. You can even load sketches over WiFi. Its interesting because you
have a much more sophisticated computation engine acting as a glorified serial
port for an 8 bit micro.
The parallels to the original microcomputer explosion cannot be over looked.
The Altair 8800 was introduced as a "real computer" using chips designed for
calculators and embedded systems which engineers could build and use much more
cost effectively than the minicomputers available at the time. Here we have
Arduinos which have an approachable level of complexity and a cost (< $50
seems to be the trigger point) which makes them useful in the
education/hacking role.
On the specific announcements, the TRE is the most interesting to me as I've
got everything from BeagleBone Black systems (one is running the Hue lights in
my office) to Arduinos and Pis, and the "sweet spot" is definitely somewhere in
the middle. Not enough pins on the Pi, not enough compute oomph in the Arduino.
A number of things then are a blend of an Arduino and a larger system, so the
TRE and the Yun fit that niche, with the TRE being a version of the Yun that is
more programmable on the Linux side.
[1]
[http://store.arduino.cc/ww/index.php?main_page=product_info&...](http://store.arduino.cc/ww/index.php?main_page=product_info&cPath=11_12&products_id=313)
------
spongle
Now the simple has been made incredibly complicated and it's virtually
impossible to take a design based on these to production for the amateur!
~~~
andyjohnson0
Seconded. I personally prefer the simplicity of the existing arduinos, but if
someone needs one of these boards then why not use a raspberry pi or
beaglebone that have mature ecosystems?
~~~
acadien
I suspect these boards are being released in response to the Raspberry Pi,
although I can't say for certain they're an appropriate replacement. Besides,
it's not as if they're ending the old line of Arduino boards, so if you don't
like the new ones... just don't buy them :)
~~~
andyjohnson0
I'm sure you're right about the new boards being a response to the rpi. I just
have mixed feelings about the possibility of competition between two open
source hardware systems. Maybe it'll produce innovation, or maybe just
duplication of effort.
_" if you don't like the new ones... just don't buy them"_
I didn't say I disliked them. Since I didn't make myself clear: I do wish them
well.
------
codehero
Since the days of the EspressoPC I have always been fascinated by small scale
computers. With so many choices in small linux systems (Gumstix anyone?) I am
not enchanted with seeing the Arduino name slapped on a Linux board. I was
appalled at how digitalWrite was actually implemented, so I cannot imagine
what kind of layers they put between the programmer and the metal.
~~~
antimagic
Yup. There is no argument that modern systems are more flexible, but that
flexibility comes at the cost of needing to understand a vastly more complex
system.
I recently discovered an emulator online of my first 8-bit computer (the
Compucolor II), and I was amazed at how simple it was to use (draw graphics to
the screen, read from the keyboard or disk etc). Whilst you have to write all
of the high-level processing yourself, the actual low-level magic is right
there at your fingertips. Want to set a pin on the serial port - not a
problem, the I/O is memory mapped, and the address you need is in a table in
the 100 page manual that came with the computer. Done! Compare that with
today's environments - find out the API needed to access a serial port pin (if
it even exists!), pull in the system headers and libraries necessary (after
figuring out which are used!), modify your build system to use them, and then
run.
The beauty of Arduino is that it takes you back to that bare metal simplicity.
It makes interfacing to the real world very easy. If you want to run a
webserver it's probably not the best choice, but if you want to switch off
your washing machine because a sensor has detected that a pool of water has
formed on your kitchen floor, Arduino is so much more accessible for an
amateur. I really don't understand where the manufacturers are going with
these boards...
~~~
malandrew
Good comment. It reminded me of something I'm really hoping to see come out of
the arduino/raspberryPi/Beaglebone movement: a fresh look at a LISP machine.
Given how cheap a computer is now, what's stopping the existence of a
similarly sized and priced Alpha-based chipset that can run OpenGenera or
another LISP machine?
~~~
pjmlp
A bit off topic, but the Smalltalk guys are doing exactly that.
[http://squeaknos.blogspot.pt/](http://squeaknos.blogspot.pt/)
------
ChikkaChiChi
If Galileo is priced competitively to the BeagleBone we've got a whole new
game on our hands.
Intel had released the Minnowboard ([http://arstechnica.com/information-
technology/2013/09/199-4-...](http://arstechnica.com/information-
technology/2013/09/199-4-2-computer-is-intels-first-raspberry-pi-competitor/))
which seemed like a competitor to the Raspberry Pi, but at 200USD it wouldn't
see nearly the uptake the more affordable systems have seen.
This reminds me of the Netbook arms race except the end product is infinitely
more useful.
~~~
tesseractive
> This reminds me of the Netbook arms race except the end product is
> infinitely more useful.
Netbooks have tremendous utility to a subset of users. I have an old netbook
with one of the slowest CPUs released in the past decade (Atom Z520), but it
gets great battery life, and runs Emacs, Python, and a number of other tools I
need like a champ.
Is that what most people want? No. But most people aren't microcontroller
hardware hackers either.
------
lhl
I have an UDOO[1] Quad ($135) coming in very soon which is similar to the TRE
- basically an SAM3X8E (Due) Arduino (w/ compatible headers) on a single board
but combined w/ a much beefier i.MX6.
I have a Beaglebone Black and a Wandboard Quad sitting on my desk right now
and I'm sure I'll end up grabbing the new Arduinos when they come out,
although Spring 2014 is pretty far away and I suspect the UDOO will end up
doing everything I want soon.
[1] [http://www.udoo.org/](http://www.udoo.org/)
------
jlgaddis
For those of you into the Arduino stuff, what are you making with these boards
(the ones in the article, the Beagle* boards, even the Pi)? I've read a bunch
of the "Build an 'x' with your RPi!" articles but I'm interested in hearing
about some "real-world" things others have built.
------
gcb1
maybe with the TRE people will finally be able to do something as banal as
those ambient backlights without having to buy chinese hdmi splitters!
~~~
DonGateley
Indeed. Stereo audio in and out, assuming modern resolution and rates and
sufficiently flexible ports (mic-in to line-in sensitivity and line-out to
phone-out power), with plenty of horsepower for DSP and this aligns really
nicely for audio projects. I've got all kinds of things I want to do with one.
------
sbierwagen
No pricing? Pass.
~~~
01Michael10
You make the decision to pass on something before it's even available and
before all the information on it has been released. Interesting...
| {
"pile_set_name": "HackerNews"
} |
How Will the H-1B Ban Impact Technologists' Plans? - WrightStuff
https://insights.dice.com/2020/07/21/will-h-1b-ban-impact-technologists-plans/
======
droitbutch
Depends largely on whether they also restrict import of digital products. If
they do (which will extremely difficult) then American tech salaries will
generally rise. However, because of the difficulty in restricting data and
communicating with offshore destinations, I suspect this will lead to more
digital jobs going offshore.
| {
"pile_set_name": "HackerNews"
} |
Rats free each other from cages (2011) - CarolineW
http://www.nature.com/news/rats-free-each-other-from-cages-1.9603
======
xg15
I found the counter-argument rather strange:
> At issue is the definition of empathy. “This work is not evidence of empathy
> — _defined as the ability to mentally put oneself into another being’s
> emotional shoes,”_ says Povinelli. “It’s good evidence for emotional
> contagion and that animals are motivated to coordinate their behaviour so
> that distress is reduced, but that is nothing new.”
(Emphasis mine)
How could this definition of empathy possibly be proven for non-humans? Or in
fact, how could it be proven for anyone but yourself?
Additionally, they present an alternate hypothesis to explain the findings:
That the captive rat was somehow causing distress to the free rat and the free
rat was solely motivated to reduce that distress without any concern for the
other rat. While that's a valid hypothesis, it doesn't seem to hold the
occam's razor test for me - especially as they admit themselves that they
don't know what this distress-causing mechanism could be. (They suggest alarm
calls or pheromones but couldn't find any yet)
~~~
randcraw
How to demonstrate empathy? Easy. Conduct experiments in which the caged rats
undergo a range of living conditions from idyllic to neutral to outright
torture. If the free rat favors freeing the tortured rat over the idyllic rat
then it shows greater empathy.
~~~
xg15
That was, in (slightly) less horrible form, what they did. It's apparently
still disputed that this shows empathy, however.
For your experiment, you could counter-argue in the same way: Maybe the free
rat was just particularly tired of the torture screams and that's why it freed
that rat first...
------
semi-extrinsic
Actual paper:
[http://science.sciencemag.org/content/334/6061/1427](http://science.sciencemag.org/content/334/6061/1427)
There is a correction issued for this paper, but the correction is paywalled
so I can't tell whether it's substantive or a nitpick.
Also, please put a (2011) in the title.
~~~
throwitawayday
The correction:
“Empathy and pro-social behavior in rats” by I. B.-A. Bartal et al. (9
December 2011, p. 1427). On p. 1428, the last full paragraph of column 1 was
incorrect. The paragraph should be replaced by this corrected text: “All
female rats (6/6) and most male rats (17/24) in the trapped condition became
door-openers. Female rats in the trapped condition opened the restrainer door
at a shorter latency than males on days 7 to 12 (P < 0.01, MMA, Fig. 3A),
consistent with suggestions that females are more empathic than males (7, 12,
13). Furthermore, female rats were also more active than males in the trapped
condition (P < 0.001, ANOVA) but not in the empty condition (Fig. 3B).”
~~~
semi-extrinsic
So they used 30 rats in total? Any numbers on how statistically well-powered
this was?
~~~
dogma1138
30 rats are plenty if they selected them properly - different gene pools,
different breeders etc...
~~~
saalweachter
You wouldn't necessarily care about genetic diversity for this sort of
experiment. This experiment isn't trying to say "all animals" or "all rat
species" exhibit this behavior; the interesting thing is that _any_ animal
does.
~~~
tremon
For me, the more important question is not whether all test subjects were
genetically diverse: it's more important that the free rat and the caged rat
are dissimilar. The situation would be vastly different if the rat pairs were
related, since there is a genetically selfish motivation for caring for those
within the same gene pool.
~~~
jonlucc
This becomes apparent when you try to co-house rats and mice. Litter mates are
much less likely to fight.
------
pipio21
My perception from watching the video is that the cage opener wants to mate
with the rat inside once it is free. At least mount it.
If they are from opposite sexes, there is certainly an egoistical reason for
the rat to free the other.
If they are not, rats are very sociable. They need close contact with other
rats.
It also looks like the opener rat is very curious and wants to enter the
cage too, to know how it feels inside.
~~~
adevine
That was my perception as well: the rat immediately tried to mount the rat
that had been trapped.
My issue is not with the results of the study, but with interpretation. I
think it's possible to explain the desire to free the other rat purely in
selfish terms without any nods to altruistic or empathic behavior.
~~~
MrBra
Reproduction is not selfish. It's for species survival.
~~~
technothrasher
Reproduction is not for species survival. "Species" is simply an arbitrary
label we've assigned as part of the system we use to categorize living things,
and doesn't even really match reality very well.
Reproduction is for gene survival. We are simply throw away machines our genes
have developed in order to propagate themselves.
~~~
MrBra
No. Everything keeps transforming at each iteration for better serving the
system purpose: efficiency (for this system specific set of rules).
The source code (genes) propagation per se would be completely useless to the
system, without the processes they _express_ : living systems, the most
efficient energy transforming machines the system has been able to
incrementally arrange up so far starting out of random stuff.
Therefore reproduction is exactly for that, for allowing the continuation of
this energy transforming processes (life processes) in the most efficient way.
------
tim333
See also: Rats Remember Who's Nice to Them—and Return the Favor
[http://news.nationalgeographic.com/news/2015/02/150224-rats-...](http://news.nationalgeographic.com/news/2015/02/150224-rats-
helping-social-behavior-science-animals-cooperation/)
------
gweinberg
Were the imprisoned rats male or female? If I were a male rat, I would likely
free an imprisoned female rat, but I'd probably think a male rat was being
justly punished for his crimes.
~~~
jonlucc
According to the correction posted above, 6 were female and all females became
openers, while only some males became openers.
~~~
gweinberg
It only tells the sex of the rats on the outside. It doesn't tell the sex of
the rats on the inside.
------
smoyer
If you were observing humans but couldn't interview them, could you say they
were acting with empathy? There are all sorts of "acts" that appear to be
empathy from the outside but end up being in both parties best interest. Even
acts of heroism and selflessness end up helping to hold together our society
but it's often hard to feel what the other is feeling. Even when I consciously
try, I suspect my attempts at empathy fall short - but it's still important to
try.
~~~
FeepingCreature
Why do you think you feel empathy to begin with? It feels good _because_ it
helps on net. Your brain is a product of evolution; when something feels nice,
there's usually a pragmatic reason for it.
~~~
jpttsn
Because it helps _the species_ on net. Also, you know, nipples on men.
Evolution isn't that pragmatic.
~~~
pluma
Evolution doesn't care about arbitrary categories like "species". Species
aren't even a thing, they're an abstraction.
------
jboydyhacker
Plato defined consciousness using a test of whether something could understand
the cause and effect of its actions. By that rule, a lot of animals would
qualify as sentient - rats/cats/dogs and virtually all critters.
------
grondilu
Kind of related:
[https://www.youtube.com/watch?v=iriBuIunNYM](https://www.youtube.com/watch?v=iriBuIunNYM)
~~~
fennecfoxen
... come now! I was expecting at _least_
[https://www.youtube.com/watch?v=eWhzy-
lynu4](https://www.youtube.com/watch?v=eWhzy-lynu4)
------
mudil
It all goes down to the origins of morality. Politicians and clerics often
repeat the mantra of religion and society creating a moral code for us, but
observations and experiments with animals (and even plants) show that
altruism, empathy, and cooperation spontaneously arise from the evolution.
Excellent book on the subject is "The Bonobo and the Atheist: In Search of
Humanism Among the Primates" by Frans de Waal.
------
auganov
Sort of off-topic but: What companies (or researchers?) [if any] are working
on animal-related technologies? I.e. animal eeg, [non-decorative] animal wear,
apps for animals etc.
It deeply fascinates me but seems like it's somewhat unexplored in general. I
only remember some company working on VR for animals.
~~~
jonlucc
In the research world, there are tons of machines aimed at measuring everything
about rodents. There are implants that measure blood glucose levels every few
minutes, treadmills that spit out every parameter about how a mouse walks, and
MRIs and CTs just for rodents. These aren't exactly _for_ the rodents, but
they do exist.
------
rurban
Everybody has known that for ages already. Rats (like other colony-forming
species that distinguish their own members by odor: ants, bees, ...) are
extremely social and extremely racist. They don't form a group, a pack, like
wolves, where everyone knows each other, and they don't live in anonymous
masses, like migratory birds, locusts or schools of fish.
E.g. Konrad Lorenz in "Das sogenannte Boese"
([https://en.wikipedia.org/wiki/On_Aggression](https://en.wikipedia.org/wiki/On_Aggression))
had a whole chapter on rats and why they developed this behavior, which is
often misleadingly labelled ethically. Like here.
------
4lejandrito
I find it odd that anyone able to cage rats for days in that cruel way is
going to be able to identify any kind of empathy at all.
How can you identify empathy if you don't even feel it?
I guess this is how humans progress but I can't help feeling something wrong
with it.
~~~
ceejayoz
> How can you identify empathy if you don't even feel it?
I can detect wind (including its strength and direction) outside my house even
if I can't feel it.
~~~
4lejandrito
You have a point but what I was trying to express is the irony of the
experiment. To see if animals have empathy we basically torture them and see
how they react...
Call it empathy or whatever but I know I suffer under certain situations and
can't see why a rat does not. They have 2 eyes and four limbs just like us,
maybe with just a little bit less computing power.
------
jadbox
This reminds me of the movie Secret of Nimh.
~~~
Agathos
My first thought was Pinky and the Brain. And somebody else just posted the
necessary Douglas Adams quote. Come to think of it, our culture seems to dwell
on the fear that our lab rats are smarter than we are...
------
csomar
Just looked at the video. There is little evidence that the rat opened the
cage to free his mate. My guess is that it opened the cage to check what's
inside. Notice that as soon as the cage is opened it quickly entered inside
even before his mate left it.
~~~
gerbilly
The researchers found that rats rarely opened empty cages, or cages containing
a toy rat.
~~~
nxzero
(Maybe the real rats smelled like food, were a sign of food, etc.)
~~~
gweinberg
Maybe, but in that case you'd expect more male rats to be cage openers than
females.
~~~
nxzero
Not following, what's the link between male rats, opening cages, and foraging
for food?
------
known
Scary if they're so smart :)
------
grecy
I feel certain that in the near future we will discover animals are _a lot_
smarter and more emotionally developed than we currently realize, and future
generations will look back on our treatment of them with disgust.
~~~
d0100
If an animal can't communicate complex thought, he is just an animal. No
matter how "smart" or "emotionally developed" they appear.
Until then it's just instinct.
And if the day ever arrives, it can just be seen as an evolutionary step for
the species, and not as proof that they were intelligent all along.
~~~
fao_
Define 'complex', and 'communicate'. Are deaf and blind people "just animals",
no matter how -your quotes- "smart" or "emotionally developed" they appear?
~~~
d0100
No need to stretch what I said. Communicate has a well defined meaning: share
or exchange information, news, or ideas. And 'complex' was used to exclude
what I consider instinctual feelings/thoughts: anger, hunger, love, happiness,
obedience, etc.
The deaf and blind can communicate. And even in the case of a human vegetable,
a broken table is still a table.
~~~
fao_
I'm actually not stretching it, I was demonstrating that your limitations
either exclude humans or include animals.
What your limitations seem to ignore is that not only does most of our day-to-
day behaviour fall under what you consider instinctive, but also that you seem
to be ignoring the fact that many animals communicate in languages you might
not necessarily be able to understand (Dogs, for example, communicate mostly
in body language, looks, etc.).
Can you state an example of a complex thought? I can think of a fair amount of
thoughts I would use to denote intelligence that would fall under what you
consider 'instinctual'.
------
projektir
> The idea that we should have moral compunctions about biological reality
> seems absurd to me
Why? We seem to have plenty of moral compunctions about lots of other
biological realities. Otherwise, why do anything at all?
As humans, we have so far not acted in a manner that implies we're completely
OK with biological realities. We keep trying to change them. Hopefully, it
will continue this way.
Most people agree that they do not want to suffer. That's what makes it evil.
It's not a difficult concept.
~~~
Amezarak
> Why? We seem to have plenty of moral compunctions about lots of other
> biological realities.
Do we, in practice? Can you go into more detail?
> Most people agree that they do not want to suffer. That's what makes it
> evil. It's not a difficult concept.
Did we ask the animals too? Maybe the pig consensus is different.
If most people agree they do not want to pay taxes, do taxes become evil?
There is no objective reason to consider 'most people agree on x' as some kind
of objective moral imperative. There is no moral quality to caring what people
want. I don't think this is a very sound basis for an objective morality.
EDIT: What I'm trying to get at is basically this. You're saying that eating
meat (at least meat that necessitates the killing of animals) is morally
wrong. You're saying this because you want to persuade me and the rest of your
audience that eating meat is morally wrong. But you are not a moral authority;
I consult my conscience, which feels entirely guilt-free about the subject,
and my appetite, which feels a positive obligation to eat meat. You're simply
trying to exploit a "hack" in my social animal brain, which is that if you
tell me that something is wrong, you are telling me that other members of the
human herd will judge me unfavorably and perhaps ostracize me or otherwise
punish me.
Some people, but not very many, will be swayed by your attempted brain-
exploit. But I, again, observe that my conscience feels guilt-free, which is a
very strong sign that the people I am surrounded by do not, in fact,
disapprove of my eating meat, and also that the vast majority of people I know
actually eat meat. So somewhere in my hindbrain, I realize that you are (from
my perspective) lying to me, and it will make me instinctively resentful of
you, because you have marked yourself as a member of The Other.
So your 'moral' argument is, objectively, amoral; and for your purposes,
actually counterproductive, useful only when a person is surrounded
overwhelmingly by vegans and vegetarians, who will be able to enforce a shared
social norm. You could, instead, if you want to try an intellectual tack, try
to show how my personal interests are served by becoming a vegetarian, but you
will have to discover some interests and values that strongly outweigh my
predilection for eating meat.
Or what you _could_ do is abandon your moral arguments, temporarily, until a
day when you're worried about punishing the wayward vegans rather than
converting the meat-eaters. You want to make vegans cool, the social elite.
You want to get into the media and fill as many TV shows and movies and
podcasts and YouTube personalities with _cool_ vegans. You want to make people
_want_ to be like you. While you're at it, make animals more important,
anthropomorphize them as much as possible, try to generate as much empathy for
animals as possible - to some degree, this is already being done. And then in
a few decades, if you keep it up, most people will be vegetarians and you can
then start talking about how killing animals for food is evil, and you can
probably even pass a law to that effect. That'd be a way to get some moral
feeling behind it.
~~~
goldenkey
It's pretty moronic to assume that a feeling every animal with a nervous system
avoids, is somehow not proven to be a deplorable state to put sentient life
into. You can play coy semantics but its quite apparent that even primitive
mammals avoid pain.
You can go ahead and spout off arguments about us being biological primal
carnivores but that isn't very persuasive considering that the defining aspect
of intellect has always been a higher sense of understanding of good and evil,
and manipulating the environment to support the greater good.
And thus, inflicting pain on beings like us, for extra entropy on taste bud
receptors, seems rather low class and savage. It's one thing to kill. It's an
entirely different thing to buy products from a supermarket that come from a
pipeline of some of the worst acts against a conscious entity one could
imagine themselves waking up in. What separates you from those animals today?
Merely luck. When it is luck that decides the outcome of rather uneven
circumstances, people rally against it. Intelligence presupposes that
consciousness is not a choice, so is punishing someone because they sprouted
into the wrong mammal type something that should be deemed okay?
Do unto others as one would want done unto self. We are experiencers first,
animals second, and humans third, lets not deny the absolute tragedy that is
the conscious experience of millions of livestock. Death is not the enemy of
moral virtue but cruel and unusual life surely is.
~~~
dang
This comment breaks the HN guidelines by being uncivil. We've asked you
repeatedly not to do this. Please don't do it again.
~~~
goldenkey
I'm going to ask again, aside from the word "moronic" which I called an
opinion, not a person. What exactly is uncivil? I gave a cogent argument about
the issue at hand. If you have issue with a single word as a moderator, it
would fit to specifically point that out, rather than call my entire post
uncivil. This is clearly a divisive issue with strong opinions.
~~~
dang
"Moronic" is a textbook example of calling names in the way the HN guidelines
ask you not to:
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html).
That's bad anywhere, and the worst kind of thing to lead with.
"You can go ahead and spout off" and "You can play coy semantics" come across
as personal jabs. Such phrases are a kind of elbowing. You may feel that you
didn't mean them personally, but the person who gets the elbow typically
doesn't agree. This is the path to degraded discourse, so please don't do it
here.
~~~
goldenkey
Okay, thanks for the explanations. I will do my best to avoid these things in
the future. Cheers.
------
brunorsini
“In fact there was only one species on the planet more intelligent than
dolphins, and they spent a lot of their time in behavioral research
laboratories running around inside wheels and conducting frighteningly elegant
and subtle experiments on man. The fact that once again man completely
misinterpreted this relationship was entirely according to these creatures’
plans.”
RIP Douglas Adams.
~~~
6stringmerc
Side note: The Black Plague
~~~
GFK_of_xmaspast
My understanding is Yersinia pestis is just as fatal to rats (and fleas) as it
is to people.
------
justratsinacoat
>Did we ask the animals too? Maybe the pig consensus is different
Most mammals react to painful stimuli (which we can provide a baseline for by
testing it on other humans, whom you luckily consider to be reliable enough)
with aversion and fear. We 'ask' this question of many animals quite
frequently, and have throughout history. The answers are reliable and
consistent.
>If most people agree they do not want to pay taxes, do taxes become evil?
There is no objective reason to consider 'most people agree on x' as some kind
of objective moral imperative. There is no moral quality to caring what people
want. I don't think this is a very sound basis for an objective morality
_starts to construct argument_
_sees sophistry in re 'suffering' downthread_
Equivocating physical and emotional and social 'suffering'? Nope!
As to your edit, your perspective on other people's perspective is familiar
and disturbing. I think you should reexamine your eagerness to throw around
the word 'objective', your penchant for jumping to conclusions regarding your
interlocutors, and your self-described thresholds for othering members of your
social circle. Or, because you're right and you know it, ignore the entirety
of this post.
~~~
Amezarak
> Equivocating physical and emotional and social 'suffering'? Nope!
I was, but people also inflict physical suffering on themselves, from fasting
to literal self-flagellation and mutilation. There have been people who
offered themselves up for literal torture and sacrifice _gladly_.
> and your self-described thresholds for othering members of your social
> circle.
I was describing how people in general work, not 'othering members of my
social circle.' People do not respond positively to being told something they
do or like is evil. I don't think that is a controversial statement.
~~~
justratsinacoat
>I was, but people also inflict physical suffering on themselves, from fasting
to literal self-flagellation and mutilation. There have been people who
offered themselves up for literal torture and sacrifice gladly
This behaviour you describe is usually mediated by some kind of non-biological
impulse -- like the 'certain' 'knowledge' that the Hairy Thunderer/Cosmic
Muffin will reward this suffering in some future-/afterlife. Those impulses
are all related to the pointless social factors that you decry. These people's
suffering is still an evil, even if they ultimately inspire something that
outweighs the evil of their suffering (religious revolution, buying
time/forgiveness for their loved ones, etc). Plus, you're still splitting
hairs because it's not like these individuals are feeling pleasure or not
anticipating pain uncomfortably. Next up, BDSM!
>>>You're simply trying to exploit a "hack" in my social animal brain, which
is that if you tell me that something is wrong, you are telling me that other
members of the human herd will judge me unfavorably and perhaps ostracize me
or otherwise punish me. Some people, but not very many, will be swayed by your
attempted brain-exploit. But I, again, observe that my conscious feels guilt-
free, which is a very strong sign that the people I am surrounded by do not,
in fact, disapprove of my eating meat, and also that the vast majority of
people I know actually eat meat. So somewhere in my hindbrain, I realize that
you are (from my perspective) lying to me, and it will make me instinctively
resentful of you, because you have marked yourself as a member of The Other
You typed this, right? Then you typed:
>> and your self-described thresholds for othering members of your social
circle.
>I was describing how people in general work, not 'othering members of my
social circle.' People do not respond positively to being told something they
do or like is evil. I don't think that is a controversial statement
Is this how 'people in general work', or is this you Othering someone? Did
your interlocutor turn out to be utilizing the technique you perceived him to
be?
~~~
Dylan16807
> Is this how 'people in general work'
Yes.
> or is this you Othering someone?
Who would be othered by that statement?
> Did your interlocutor turn out to be utilizing the technique you perceived
> him to be?
That statement is about a fact, not anyone else in the conversation, so it
doesn't matter.
~~~
justratsinacoat
>> Is this how 'people in general work'
>Yes.
Well, that was a quick 'discussion'! Glad you came along! Thanks for the
downvote!
~~~
Dylan16807
> Well, that was a quick 'discussion'! Glad you came along!
You're welcome! The discussion of how there is an impulse for people to get
upset and defensive when told they're wrong, and how it impedes learning, is
very well established. It's not worth wasting time on a weird side argument
about it. The answer is just 'yes, that happens a lot'.
Edit:
> Thanks for the downvote!
Hey, feel free to continue with the rest of the conversation without me. I'm
not going to judge. I'm only going to snip at weird unproductive semi-
insulting tangents.
~~~
justratsinacoat
>The discussion of how there is an impulse for people to get upset and
defensive when told they're wrong, and how it impedes learning, is very well
established. It's not worth wasting time on a weird side argument about it.
The answer is just 'yes, that happens a lot'.
What the hell? I guess that's what I must have been talking about! Thanks for
informing me of what I was talking about! I totally agree that "people to get
upset and defensive when told they're wrong, and how it impedes learning"; I'm
experiencing it right now! Of course, now that you've shaped the discussion
toward your end, I won't even try to have a 'conversation' with you.
>Hey, feel free to continue with the rest of the conversation without me. I'm
not going to judge. I'm only going to snip at weird unproductive semi-
insulting tangents
With unproductive semi-insulting tangents; it's almost like-- ah, sorry,
everything is suddenly illuminated. Carry on.
~~~
Dylan16807
> Thanks for informing me of what I was talking about!
That's what the line you quoted said. If you were replying to something else,
you screwed up quoting. I can't tell you what you meant, but I _can_ tell you
what you quoted and said 'is this true?' about.
I'm not trying to shape the discussion as a whole, I'm trying to say "that one
line? don't do that"
I did reply to an off-topic line with an off-topic line. I wanted to be clear
what I was downvoting, and that it wasn't your opinion.
------
cLeEOGPw
An interesting kind of empathy where the "empathic" rat instantly tries to
fuck the rat it just released.
And the article says "where the rat has nothing to gain". Might want to
reevaluate that.
------
percept
Love it. Go rats.
| {
"pile_set_name": "HackerNews"
} |
Bitcoin Isn’t Tulips, It’s Open Source Money - rbanffy
https://blog.evercoin.com/bitcoin-isnt-tulips-it-s-open-source-money-67e2286ff142
======
aitrean
As somebody who has held cryptocurrency for years and codes Solidity, never
before have I seen a dumber market than this. Every time valid criticisms come
forward, they're met with articles like this. The articles are usually
convincing to people with limited understanding of economics or technology, so
they keep quiet. "Open source money" and "blockchain is the future" sound like
very convincing arguments to justify these ludicrous sums of money. What they
neglect to address is that...
A) Bitcoin is congested with transactions and takes hours to process some of
them
B) Bitcoin fees are through the roof, it sometimes costs as much as 25% of the
value of a transaction to send it
C) The upgrades to fix these problems are extremely slow - scaling hasn't been
solved for years, because different parties keep fighting for control over the
currency codebase and shooting down each others' attempts to upgrade it
D) Bitcoin is still, objectively, centralized. Almost all of the miners are
based in industrial regions like China, and belong to a small monopoly of
mining pools. All of the exchanges are centralized, and are well-known for
losing clients' money. The big banks are replaced by big whales, who will do
anything in their power to manipulate markets for their gain.
E) Bitcoin addresses the technology perspective of sending money, but doesn't
address any of the macroeconomic problems that arise from a deflationary,
uncontrolled currency.
F) Bitcoin is not actually anonymous. Data science techniques can parse the
blockchain and see sending patterns between different accounts. They map these
patterns to various identities. If those identities can be mapped to a real
person, say, when they cash out of an exchange, then it's known exactly how
much Bitcoin they have.
If someone could write an article addressing these facts, rather than throwing
out slogans like 'open source money', that would be nice.
~~~
colordrops
I could go down the list and agree and disagree with some of your comments,
but how does that address the article? Despite being a trite statement, and
despite its flaws, bitcoin _is_ open source money.
~~~
neilwilson
Bottles of wine are money.
But they are not currency. Currency is what you settle your tax bills in. And
ultimately you need to get hold of some at some point.
~~~
colordrops
Ok, but how does that make the statement wrong? Gimp is open source photo
editing, and many people have criticisms against it compared to Photoshop,
but it's still open source photo editing.
~~~
mannykannot
Aitrean pointed out the irrelevance of that fact 25 minutes ago, in response
to your original post.
------
beaconstudios
at no point in this article is a value proposition actually defined. "Open
Source Money" is a nice soundbite, but unless you can define what that
actually means and how it will benefit mankind, it's as useful as "facebook
for cats". I'm in the anti-cryptocurrency camp (despite having read the
satoshi paper and comfortably understanding the algorithms behind bitcoin)
because I've seen it morph into the exact thing it was meant to oppose -
financial speculation and paper-shuffling by rich investors purely for the
sake of profit. It's somewhat ironically hilarious that ICOs are now a thing
given the anti-corporate/anti-banking message of the bitcoin genesis block.
Right now I see no value to cryptocurrencies besides black market transactions
and hacking around with financial instruments in a very closed system. I'm
open to the idea that I'm missing the whole point, but I've not yet seen a
compelling argument for that.
~~~
jaymzcampbell
> I see no value to cryptocurrencies besides black market transactions and
> hacking around with financial instruments
It costs a small fortune to send money internationally not to mention the
friction of needing to have a bank account - almost 40% of the people on the
planet do not have one[1]. Something _like_ Bitcoin et al (not necessarily BTC
itself!), where you can get an address immediately that can send and receive a
balance from anywhere & anyone in the world without impediment has the
potential to be huge. I appreciate I am being a bit starry-eyed here but I can
see a real value to it _globally_ if such a vision can continue.
That's not to condone what might now be happening with institutional investors
and Wall St getting in on the action in a big way, or the state of the tech
(transaction fees at present, verification delays, snake oil ICOs). The core
concept of a currency like this _is_ valuable societally IMO. It might not be
bitcoin in the end but the idea, like the internet and publishing, is a
powerful one.
[1]
[http://www.worldbank.org/en/programs/globalfindex](http://www.worldbank.org/en/programs/globalfindex)
~~~
Nursie
It's pretty cheap to send money internationally in a lot of places, and pretty
quick too.
The oft-touted benefit of BTC is not that new or appealing to the whole world.
In fact a lot of the selling points I see boil down to flaws in the US banking
system, and skewed views of how the rest of the world operates.
~~~
jaymzcampbell
From my own recent experience, I had to spend £10 to send €6, pay conversion
fees and it took 2 days to go through.
[https://estimatefee.com/](https://estimatefee.com/) suggests right now for a
2-hour confirmation a cost of around $3 via BTC. (Natwest in the UK to an
account in the EU).
~~~
Nursie
I didn't pay much more than that to transfer about £20k from Australia to the
UK a couple of years back. I think I paid about £12...
------
mannykannot
"If anyone is dissatisfied with Bitcoin, it’s possible to fork it and add
value. If a developer made Bitcoin 100x faster and just as secure, you can be
sure that everyone would use the faster one."
It is rather ironic that the author should say this at a time when the Bitcoin
community has recently been at war with itself, and is still divided and
without a solution, over a proposal to adopt a fix for the urgent problem of
transaction rate -- so ironic, that I checked the dateline, and it is current.
------
ineedasername
Never quite explains what is meant by open source money. And cryptocurrency
evangelists in this vein always seem to assume money can be decoupled from the
underlying economies it flows through and the individuals and institutions that
make use of it.
Here's a hint though. The statement "if Bitcoin can’t deliver those things,
another cryptocurrency will do that" isn't something anyone wants to hear
about their money. No one wants a massive or sudden devaluation because a
rival currency supports a shinier newer feature. When that happens people
don't just get upset. When that happens, social contracts get broken and we
rediscover why Hobbes wrote his book, and why the hardest currencies usually
belong to countries able to back up their fiat with more than words or market
forces.
So take note, this should be axiomatic to anyone thinking about currencies "at
scale": the demand for a currency will be inversely correlated with its
volatility.
Research the historic role of clipped coins, and consider the impact of
receiving coins that can be clipped after receipt. And don't stop there, that's
the beginning.
------
eqmvii
...that a small number of people printed for themselves and now hold in
massive quantities.
Wealth distribution is bad enough with the currencies we already had. A true
bitcoin takeoff would be an order of magnitude worse.
~~~
tzakrajs
Thank you for raising this point. Do we really want to make people insanely
rich just because they were stupid enough to convert their money to BTC? It's
frustrating to watch people getting rich making the wrong move. Why do we
think their next move will be any more brilliant than luckily holding BTC?
------
KaiserPro
Perhaps if the author had researched the tulip episode, the article would be
different.
The massive price rises were in _futures_ not actual prices. There is lots of
academic work around this which is spectacularly interesting.
Second, to say that bitcoin has "intrinsic value" is the same rubbish that's
applied to gold. Value is _always_ subjective.
The noise about competition for "fiat currency" is just that, noise. At best
it's equivalent to a bond, at worst coffee futures.
Exchanges are perfectly able to do fractional reserve banking, so bitcoin is
very much a fiat currency. It's basically what bitcoin exchanges do
~~~
lifty
I fully subscribe to the subjective theory of value. There are other theories
which might be good approximations in specific scenarios, but fundamentally,
value is subjective. But following that reasoning I believe that if enough
people agree on a new means of value storage or value exchange, critical mass
can be achieved where the new medium is fully adopted while the old one is
abandoned. I think the same thing happened when we transitioned from gold to
fiat, and it happened largely under the direction of governments around the
world. As long as we all agree on it, a new medium can be adopted.
Now, will Bitcoin replace fiat? That would be difficult, and I think people
have different perspectives and motives for adopting Bitcoin. I suppose that
is why there are so many heated debates where people don’t agree on the
valuation. Actually, if you look at the world, people have opposing views on
many topics, including on how our economies are run and on how we do monetary
policy. Just accounting for that chunk of people that want a hedge or
alternative to the mainstream system can explain at least a part of the
valuation.
------
mathgenius
We don't just need open source money, what we really need is open source
government.
~~~
jayess
Separation of currency and state.
------
neilwilson
Perhaps not Tulips, but definitely akin to the Somali Shilling.
------
drieddust
I think these arguments are coming from wishful thinking. In my view block
chain as a technology is useful and most of the institutions will copy it to roll
out their own version soon.
And bitcoin is nowhere near being considered "open source money" because if it
was then we should not be measuring it with respect to USD.
Most of these articles come from folks who think technology can fix the
social and political issues.
------
JustFinishedBSG
Tulips were open source too.
~~~
baby
But they were neither scarce nor hard to mine
------
mannykannot
OK, let's accept, for the sake of argument, that Bitcoin-as-open-source-money
is a valid analogy. How does the history of open-source things disabuse the
notion that what is happening now is a speculative bubble? Where else can you
find an example where the benefits of open source have led to this sort of
explosive growth in valuations?
------
pfarnsworth
Until a material percentage of non-speculative transactions are done via
bitcoin, it's not money. Who in their right mind would spend a bitcoin today
when it could jump in value by 25% in a week? It's purely a speculative
vehicle right now, and won't stop until it stabilizes in price where people
can trust it.
------
neil_s
"My goal in posting this is not to convince the fiat fundamentalists, it’s to
arm believers with an argument that at least makes sense."
Wow. Never thought I'd see religious faith arguments on the front page of HN
------
colorincorrect
Tulips aren't bitcoins, It's tangible beauty!
------
jimjimjim
Buy X says person with X.
------
justherefortart
No, it's a new Pyramid Scheme. Make sure you get your money out first.
------
Ologn
Bitcoin isn't tulips, because tulips can decorate your property. Bitcoins are
absolutely devoid of any value, thus with the limit being zero on the Bitcoin
equation, the ultimate market cap for all Bitcoins will be zero.
Money for the past 10,000 years has just been commodities. Commodities with a
useful value. The commodities which became money were ones which made good
currencies. They were portable, uniform, durable and divisible. Gold is a
commodity which fits these attributes. Eventually people began trading
promissory notes redeemable for gold.
This was the situation until 1971. Then Nixon closed the exchange window and
said the Treasury Department was not exchanging the dollars for the 8000 tons
of gold it stores in Fort Knox and elsewhere. Almost nothing happened to the
dollar in the days after - the dollar price of bread and the like stayed
stable. Did some great transformation happen? No. If the market had panicked,
the gold window would have been opened again. No magic spell was cast in 1971.
The 8000 tons of gold in Fort Knox etc. still back the dollar implicitly.
If not - why does the government spend so much money guarding Fort Knox etc.?
Why is it hoarding all this gold, if it does not back the dollar?
Bitcoin does not have 8000 tons of gold implicitly backing it. It has nothing
backing it.
There is a little question mark that exists. Obviously Fort Knox's gold backs
the dollar implicitly. But for some politicians and such, it is beneficial if
they wave their hands and say some magic spell has made the dollar valuable
without that gold. As if they had a magic printing press that could just run
off sheets of hundred dollar bills 24/7 that had value. This is not the case,
and is untrue, but creates a question mark from this little fib. This question
mark is where Bitcoin draws its suckers in. If the government has a magic
hundred dollar bill machine, maybe they can make one as well. The problem is
the government has no such machine, they just have some politicians and such
who claim they do. What the government does have, among other things, is 8000
tons of gold which it would exchange for dollars in 1971, and could easily do
again. Bitcoin has no such backing. In fact, we don't even know who started
Bitcoin. At least with Charles Ponzi's international reply coupons, you knew
the name of who was behind the scam. With Bitcoin, the scammers are anonymous.
I just read a Times article about the probable insolvency of Bitfinex, the
umpteenth Bitcoin company with such problems. Charlie Munger says Bitcoins are
rat poison, and he has surely seen many such scams in his 93 years.
Also, reading this blog post with its railing against the government and the
establishment reminds me of the pitches for some MLM scams I have been dragged
to. When someone tells you they're giving a run for its money the
"government's fiat monopoly on money", watch your wallet.
------
pvaldes
Yeah, those silly, silly dutch, still selling millions of tulips since 1637,
employing more than 250.000 people in the flower market and exporting plants
for a value of around 10 billion euro each year...
Maybe not a very good example of an economic failure
~~~
ceejayoz
That's a silly argument. The real estate crisis was still a _crisis_ despite
the fact that real estate is still successfully sold afterwards.
~~~
pvaldes
And the question to answer here is: can bitcoin survive a big crisis
and still be successful when prices fall and stabilise?
European aquaculture for example has suffered several similar (aptly named)
bubbles and later recovered from them. The history of tulip's trade and market
is also far from being finished yet. Nobody measures aviation success anymore
based on the Hindenburg event.
In the long term many strange markets manage to survive even if the actors
change
| {
"pile_set_name": "HackerNews"
} |
Scala.js contribution disclaimer - palerdot
https://github.com/scala-js/scala-js/blob/master/DEVELOPING.md
======
sheetjs
We have something similar: [https://github.com/SheetJS/js-
xlsx/blob/master/CONTRIBUTING....](https://github.com/SheetJS/js-
xlsx/blob/master/CONTRIBUTING.md)
In our case, the root problem is twofold: developers in other related projects
have been less-than-circumspect in code contributions, and thanks to the
Microsoft Shared Source Agreements ([https://www.microsoft.com/en-
us/sharedsource/default.aspx](https://www.microsoft.com/en-
us/sharedsource/default.aspx)) there are many people who have looked at
protected code and signed agreements.
It is probably even more relevant for Scala.js since they are dealing with the
JDK and Oracle has not been afraid to pursue potential violations.
------
th3iedkid
But why exclude even looking at those ?
~~~
eshyong
Probably to avoid a similar situation to Oracle vs. Google.
------
painful
"See no evil"!
------
porphyrogene
If something is open source and protected under copyright does that make it
“clopen source”?
~~~
rpdillon
Open source licenses derive their power from copyright law, so everything that
is open source is protected by copyright.
| {
"pile_set_name": "HackerNews"
} |
Turkey is sliding into dictatorship - sukruh
http://www.economist.com/news/leaders/21720590-recep-tayyip-erdogan-carrying-out-harshest-crackdown-decades-west-must-not-abandon
======
bpodgursky
I've read the Economist for a long time, and they should be ashamed of
themselves for not calling out how wrong they were when Erdogan was first
elected.
Their line was that the AK's "moderate Islamism" would be good for democracy
long-term, and anyone who was afraid of political Islam was regressive.
Yeah, sometimes editors and opinion pieces are wrong. But admit it when you
are -- I haven't seen the Economist staff learn anything from this sad
episode.
~~~
enraged_camel
Back when he was first elected in 2004, Erdogan was not the huge asshole that
he is today. His party actually conducted a lot of useful reforms that were
beneficial for the country.
Over time though he slid into authoritarianism, and started jailing or exiling
people who disagreed with him. Some people believe this was him showing his
true colors, while others think it's a combination of his narcissism and
severe paranoia getting the better of him.
Don't get me wrong: Turkey was never the pinnacle of freedom and the fair
treatment of minorities. But, for people who care about liberal democracy and
the separation of religion and state, things are inarguably in a much worse
shape today than they were a decade ago.
~~~
hackuser
> Back when he was first elected in 2004, Erdogan was not the huge asshole
> that he is today. His party actually conducted a lot of useful reforms that
> were beneficial for the country. / Over time though he slid into
> authoritarianism
That is the standard practice of authoritarians. Rarely do they announce
before being elected, 'if you elect me, I'm going to do away with democracy
and become a dictator'. First they get elected, then consolidate power
(including by governing in a way that builds political support), and then they
seize power.
The fact that he behaved in the way the parent describes doesn't mean he
didn't plan to seize power as an authoritarian dictator (and it doesn't mean
that he did). If the future authoritarians were easily identified ahead of
time, they wouldn't have much chance of advancing their plans.
~~~
true_religion
Because they're not easily identified no one should feel like they have to
apologise for not guessing their nature.
~~~
grappler
If the same thing (god forbid) should happen with Trump, it will be pretty
hard to claim that he wasn't showing authoritarian colors in his 2016
campaign. He definitely was.
------
TillE
The only thing about this which surprises me is how Erdogan seemed to really
want Turkey to become a member of the EU, and yet has been actively doing
everything he can to make that impossible.
Then again, he tends to say and do a lot of odd things, so I guess the simple
answer is to not assume any kind of competence or rational thinking from
political leaders.
~~~
jrez22
Erdogan used EU to get rid of the military check on Islamists trying to
dismantle democracy. The EU is stupid. Every modern Turk warned them and no one
listened. Now the EU will reap what it's sown.
~~~
aanm1988
> Erdogan used EU to get rid of the military check
How did the EU do that?
~~~
sampo
> How did the EU do that?
_“Since 1999, civilian control of the military has been strengthened. The
constitutional and legal framework has been amended to clarify the position of
the armed forces versus the civilian authorities,” the European Union said in
its 2004 Progress Report on Turkey, which cited various developments with
regard to civil-military ties. “A number of changes have been introduced over
the last year to strengthen civilian control of the military with a view to
aligning it with practice in EU member states.”_
[https://en.wikipedia.org/wiki/Civil%E2%80%93military_relatio...](https://en.wikipedia.org/wiki/Civil%E2%80%93military_relations_during_the_Recep_Tayyip_Erdo%C4%9Fan_government)
It is kind of funny that EU being so afraid of the military occasionally
taking control of the civil government, EU paved way for the much more
sinister threat of a transition to dictatorship. Very shortsighted.
Europeans like to blame America for meddling with foreign countries without
any understanding of the consequences, but here EU did exactly the same.
------
treehau5
This is even more reason to contribute to activists trying to stop the
conversion of Hagia Sophia into a Mosque. It is a piece of 6th century history
that we as a civilization just cannot afford to lose.
~~~
tptacek
Yes, definitely a good reason to re-litigate historical events from 1451.
~~~
treehau5
Considering the Greeks were not liberated from the Ottoman yoke until 1829,
with genocides and killings from "modern" Turkey not stopping until the early
1910's, I'd say it's not exactly ancient history, either.
To put it into context, "Greek Independence Day" comes _after_ American
Independence Day.
~~~
return0
Turkification as a general policy continues to this day and probably will
continue. I don't see how it's related to Ayia Sofia though; when the temple
was built people still called themselves Roman.
------
gkya
So saddening the amount of ignorance and hate I see here in HN regarding my
country. I wish nothing about Turkey gets submitted again here so that I can
continue to visit HN as a hackers' and tech entrepreneurs' forum.
You clearly don't know anything about our history. You don't even know who we
are. You insult us with your superficial readings of superficial wikipedia
articles. We don't need this at all.
We're having hard times, but I still have hope. After all, civilisation was
invented here. I wish we were able to export the better parts too.
~~~
mediaman
Complaining that others don't know as much as an expert (you) isn't going to
help change anyone's perspective. Most people's perspective about most
subjects is superficial. The same can be said for you: you know very little
about most people in most countries.
If you want perspective changed, tell people what's wrong, or the real story,
without hating on the people who could benefit from your expertise. Then your
audience will know more. I think most people here would appreciate hearing
your perspective (with much deeper knowledge of culture and history) on the
current events.
~~~
gkya
I don't really hate anybody. I don't believe in nations at all, if this means
something to you. If it was misjudgements to be refuted, I'd go for it, even
though I'm kind-of short on time. But it's mostly prejudice and hate-for-the-
sake-of-it. I know that's rowing against the flow, it's tiresome and gets you
nowhere. (Edit: my actual point is that while I do know that there are many
here that would actually appreciate my insights, as I'm a rather neutral and
realist insider, knowing what the bulk of the responses I'll receive in return
of such effort will be like, I just can't convince myself to do that. Maybe I'm
not modest enough to try and explain the reality in the face of belittling
dismissals and obstinate misunderstandings.)
------
acchow
The West must not abandon Turkey
Huh? So are we supposed to meddle in other countries' politics and government?
Or are we not?
~~~
kislakiruben
Aren't we already doing this? Or are you being sarcastic?
~~~
acchow
My point was we get so much flak for doing this.
But then sometimes we're "supposed" to do it? Which is it?
------
c3534l
The problem is the vote has likely already been counted and any voting is just
for an aura of legitimacy.
~~~
gkya
We take votes very seriously here and there are multiple NGOs that are on top of
things all around the country. I've once worked with CHP on an election day at
a local office and can say that it's nearly impossible to do tricks in any
elections.
------
rdmsr
All you need for constitutional reform is a 50% majority on a referendum. That
seems way too easy.
~~~
untog
A timely warning to the US, too - as more and more Senate votes are being
decided by 50% majority rather than their traditional 2/3 (EDIT: sorry, 3/5).
That both parties indulge in it should be worrying to everyone.
~~~
int_19h
With Senate, the simple 50% majority is by design. That's consistent with most
other bicameral parliaments out there, which also vote on a simple majority
basis. Since the issues they can vote on are restricted by the Constitution,
it's not really a big deal. The big deal is amending the Constitution itself,
which is extremely difficult (if anything, probably more difficult than it
should be).
~~~
hackuser
> Since the issues they can vote on are restricted by the Constitution, it's
> not really a big deal.
I don't agree. The U.S. Senate is very powerful; those restrictions aren't
very broad.
> the simple 50% majority is by design
It depends what you mean. The filibuster, which requires 60% majority, has
been part of the Senate since around the 1840s; clearly many generations of
Senators thought it was important and intended that it continue.
~~~
int_19h
The filibuster was always removable by a simple majority, though, so it was a
self-applied restriction, not an external one. Which limited its efficiency
greatly - anyone relying on filibuster knew that if they pushed too hard, it
would go away.
~~~
hackuser
> The filibuster was ...
It is ... it's still there, just not for confirmations of appointments by the
executive branch.
------
gefh
Sliding? More like rushing full throttle. Ataturk would be disgusted.
~~~
upquark
What makes reddit and now HN think Ataturk was some sort of beacon of
democracy?
> In January 1920, Mustafa Kemal advanced his troops into Marash where the
> Battle of Marash ensued against the French Armenian Legion. The battle
> resulted in a Turkish victory alongside the massacres of 5,000–12,000
> Armenians spelling the end of the remaining Armenian population in the
> region.
He finished off the Cilician Armenians from Anatolia, whoever was left under
French protection and had escaped the Genocide.
~~~
jrez22
It was war not genocide
~~~
muse900
It was a pure genocide!
~~~
jrez22
also THE ARMENIAN ALLEGATİONS AND FACTS
~~~
dalys
Which funnily enough returns this as the first result
[https://en.wikipedia.org/wiki/Armenian_Genocide_denial](https://en.wikipedia.org/wiki/Armenian_Genocide_denial)
------
peregrix1
Yeah, Turkey is sliding into dictatorship but Egypt is doing very well.
[http://www.economist.com/news/leaders/21709955-belatedly-
and...](http://www.economist.com/news/leaders/21709955-belatedly-and-under-
pressure-abdel-fattah-al-sisi-has-done-some-hard-necessary-things-two)
[http://www.economist.com/news/middle-east-and-
africa/2170997...](http://www.economist.com/news/middle-east-and-
africa/21709971-abdel-fattah-al-sisis-reforms-will-make-him-unpopular-can-he-
stand-it-sense-and)
Erdogan was "moderate" when "Western Democracies" was able to call a coup a
coup and Syria wasn't a testing bed for weapons.
It looks like they need a smokescreen for the blunder in Europe, Syria and ME in
general and with his figure Erdo-guy is the best candidate. Oh "The Sultan",
"The Barbar Turk is at the gates", "figthing with the poor Kurds", "journalist
jailer", "coup was a theater" etc.
Did you really learn anything from the article about what changes in the
constitution and what doesn't? According to the narrative Turkey was a
"dictatorship" a year ago too, is a close "Yes" (or "No") in a referendum a
typical political behavior of dictatorial politics, right?
Sorry. When the West reacts like this (and doesn't react to real dictators
unless they piss to their lawn), it more and more looks like Western
intellectual's main mission is to create a narrative to open a path for
exploitation/military politics. God may have died but "Deus Vult" still lives.
------
nomercy400
Now I wonder, what will the NATO do when Turkey becomes a dictatorship.
~~~
pmyteh
Almost certainly nothing. NATO has had plenty of dictatorships as members,
including Turkey during its various military takeovers.
------
bobosha
sliding?
------
soared
This website is disgusting. I have to close four dialog boxes and the content
takes up ~30% of my screen. I expected better from the Economist.
~~~
return0
"Let's make everything fixed, it will look SO good on small displays"
| {
"pile_set_name": "HackerNews"
} |
Spinner - hakanito
https://www.google.com/search?q=spinner&hl=en
======
busted
Similarly,
[https://www.google.com/search?q=roll%20dice](https://www.google.com/search?q=roll%20dice)
[https://www.google.com/search?q=random%20number%20between%20...](https://www.google.com/search?q=random%20number%20between%205%20and%2055)
Probably fairly easy for engineers to add these
~~~
jhalstead
I like 'solitaire' personally.
[https://www.google.com/search?q=solitaire](https://www.google.com/search?q=solitaire)
~~~
bartread
Hmm, when did this appear? That's got to be pretty irritating if you run a
solitaire site to earn a living from the ad revenue. (I have a friend who does
this - not the World of Solitaire guy, although fair play to him because he
has almost every variant known to humankind on there.)
~~~
jgalt212
Like Windows subsuming all useful third party utilities into Windows itself?
If Google really went all in on these sort of search based functions, it could
provide an angle of attack for DOJ/EU.
~~~
bartread
Well, yeah, that's sort of the line I'm thinking along.
90% of the time it's engineers doing something they think is cool/kooky/funky without
thinking about the consequences. As in, you know those top two sites have
taken a hit on revenue because someone on a fat salary at Google wanted to do
something cool with their 20% time, or some similar story.
But there's always this 10% suspicion, which is perhaps amplified by the
controversy around AMP - and maybe it's a little bit unfair - that they
ultimately never want you to leave the Google site.
That may be the case but, even if it's not, they do this stuff that they think
is cool, but they do it without any sense of responsibility or empathy.
That solitaire game is a fairly limited implementation - admittedly rather a
nice one - and it lacks a lot of the variants, and other bells and whistles of
the top solitaire sites. Nevertheless it's going to be up there for all time,
stealing traffic from and reducing the revenue to those other sites, and it
won't make a jot of difference either way to Google.
I don't really think that's OK.
~~~
Sembiance
I run World of Solitaire, currently #1 Google result for the solitaire search
term. I haven't noticed any measurable change in traffic to my site since they
put this up. Probably because folks can get a much better experience on my
site compared to the mini version they created. Maybe simpler websites with
tools such as spinners, randomizers, etc. may see more of an impact.
~~~
bartread
Well that's good to know at any rate, and I imagine people who've already used
your site will go back to it rather than play Google's simple version, but I
wonder if longer-term it'll mean a drop in organic traffic? Time will tell, I
suppose.
------
khazhou
Lemme get this straight: they're already collecting all my personal data and
browsing history, and now they want my RANDOM NUMBERS too?? No way. #myrandint
~~~
themodelplumber
Just add a 1 to the end of any number you generate with their servers. That
should delay the government long enough that you can escape with the Zip disk.
(You know the one)
------
syphilis2
I like these things. DuckDuckGo has them as "instant answers" and you're
encouraged to add new ones:
[https://duck.co/ia](https://duck.co/ia)
[https://duckduckgo.com/?q=roll+seven+dice](https://duckduckgo.com/?q=roll+seven+dice)
[https://duckduckgo.com/?q=random+person](https://duckduckgo.com/?q=random+person)
~~~
edgeorge92
[https://duckduckgo.com/?q=roll+nine+hundred+thousand+dice](https://duckduckgo.com/?q=roll+nine+hundred+thousand+dice)
Still waiting on my answer :(
~~~
JonnieCache
Really? I got 3148825 back within about two seconds.
------
motoboi
What am I supposed to see? It just shows a dictionary entry for spinner and
normal results.
~~~
rfugger
It seems to only work in Chrome. Firefox mobile shows nothing unusual.
~~~
JonnieCache
Works fine on Firefox desktop for me. It's some kind of locale issue I think,
as others have suggested.
------
carsongross
The urban dictionary result (#1 for me, I hope this doesn't say anything about
me) provides an excellent counter-point to the charm of this tool.
Ah, the internet.
------
qeternity
I am so curious about how these relatively simple easter eggs work behind the
scenes. How are they incorporated into the core search repo? Surely the guys
in charge of that beast aren't allowing these things anywhere near the core
code of the GOOG empire.
~~~
petters
Everything is one repo at Google. So yes, it shares repo with search. :-)
There are many, many servers involved to display a results page. These things
probably are an RPC to a dedicated server. Haven't checked though.
------
bodiam
Slightly more impressive:
[https://www.google.com/search?q=play+dreidel](https://www.google.com/search?q=play+dreidel)
~~~
maverick_iceman
[https://www.google.com/search?q=play+tic+tac+toe](https://www.google.com/search?q=play+tic+tac+toe)
~~~
phillc73
Sadly this doesn't work:
[https://www.google.com.au/search?hl=en&q=play+noughts+and+cr...](https://www.google.com.au/search?hl=en&q=play+noughts+and+crosses)
------
fredkingham
I'm not in general a bing fan, but this is super useful
[http://www.bing.com/search?q=speed+test&qs=n&form=QBLH&sp=-1...](http://www.bing.com/search?q=speed+test&qs=n&form=QBLH&sp=-1&pq=speed+test&sc=8-6&sk=&cvid=5DF22B0882A9419BA0D3F3E28E0B08D9)
~~~
baddox
It works on Google as well.
~~~
gutnor
note: only on google.com ! Not on google.co.uk
------
lillesvin
I usually go for:
[https://www.google.com/search?q=askew](https://www.google.com/search?q=askew)
------
sengork
Google should figure out a way to better expose the features of their web
services. Most often we seem to find out about a Google feature by chance
alone.
~~~
wybiral
For little gags like this I'd say that's part of the charm. But for actual
functionality, I agree that discoverability is limited.
~~~
tritosomal
Hey man, c'mon. You're supposed to _search_ for things with Google.
That's like, what their specialty is.
------
maverick_iceman
[https://www.google.com/webhp?sourceid=chrome-
instant&ion=1&e...](https://www.google.com/webhp?sourceid=chrome-
instant&ion=1&espv=2&ie=UTF-8#q=sin\(x%2By\))
------
edgarvm
Quick fix for the link
[https://www.google.com/search?q=spinner&hl=en](https://www.google.com/search?q=spinner&hl=en)
------
jrosenbluth
Shameless plug:
[http://jeffreyrosenbluth.github.io/lottery/](http://jeffreyrosenbluth.github.io/lottery/)
------
Semiapies
The links people have dropped here are a hoot...but the first thing I thought
of from the headline was the old Spinner music service that got bought out by
Netscape, ages ago.
------
impish19
Anyone else bothered by how the wheel doesn't stop when you tap on it when
it's spinning?
~~~
mbreese
No, because that would be cheating. Once you start the spin (you can pull it
back to get a good start!), you can't just stop it where ever you'd like.
"Wheel of fortune" rules apply here.
------
buckbova
Origin story?
~~~
greglindahl
Once upon a time, marketing realized that free publicity would happen if
Easter eggs were added to the product.
~~~
Tempest1981
Including posting the findings on YouTube:
https://www.google.com/search?q=google+easter+eggs&tbm=vid
"about 1,700,000 results"
------
csours
is there a google search that finds a list of search terms that show special
pages?
------
pawelmi
There should be some switch for proper zero-based numbering :)
------
donmb
Is this open source? I'd like to add this to a project :)
------
bitmapbrother
This is cool. There's also:
Flip a coin
Roll a dice
Play Pacman
~~~
notatoad
Do a barrel roll
| {
"pile_set_name": "HackerNews"
} |
Data scientists are helping to flatten the pandemic curve - doener
https://venturebeat.com/2020/03/20/ai-weekly-how-data-scientists-are-helping-to-flatten-the-pandemic-curve/
======
sgt101
An alternative perspective to this self-regarding mindless nonsense - written
by someone worthy of respect:
[https://twitter.com/CT_Bergstrom/status/1241549349999411201](https://twitter.com/CT_Bergstrom/status/1241549349999411201)
| {
"pile_set_name": "HackerNews"
} |
On Drafting an Engineering Strategy - brlnwest
https://www.paperplanes.de/2020/1/31/on-drafting-an-engineering-strategy.html
======
pm90
This is a pretty great article by a seemingly thoughtful individual. I hope
more technology leaders read it and understand if not embrace the ideas it
promotes.
Too often, I've been in the situation where a new leader will step into a role
and immediately start throwing out commandments with little context as it
relates to the business. An Ivory tower is built with the help of minions,
assigned unwillingly or willingly bought in to the Grand Plan. Not seeing the
immediate business value of the project, most teams avoid it or do the minimum
work necessary to satisfy the requirements. Literally every time this happens,
after about a year the Ivory tower remains up but is depopulated, a forgotten
relic of a distant past that nobody really wants to think about. The leaders
move on to the next role, and when new engineers ask senior engineers why the
Ivory tower exists they smile and say: "buy me a beer; t'is a long tale"
------
greatgib
Omg, how can this guy not notice that he became a useless director as in
'bullshit jobs'? Here you see the inception of the stupid things you see in
big companies: like determining useless 'objectives' sentences coming from
nowhere.
Look at this for example: "Migrate core functionality pieces (including
authentication, user data, shopping cart) to a micro services architecture
that’s running independently of our main application"
To me it looks like the kind of stupid generic technical decisions that are
imposed on teams. We have all lived that: some useless director that has no
idea of the real technical stack or constraints of the system wants to feel
useful and define a 'bullshit' strategy, so what he does is impose the latest
trendy thing he heard at a conference without really knowing what it means.
~~~
nine_k
Have you tried developing several applications which should work independently
but share the auth/profile info? Add a zero-downtime deployment requirement,
and microservices start to look damn appealing. No, they are not a silver
bullet, but other architectures are much worse in a situation like that.
~~~
greatgib
To be clear, this example was not about the content. It can be a 'good' or a
'bad' idea. Just about the fact of coming and saying: I will do a strategy for my
poor stupid software engineers because I'm a director. 'Do buzzword, buzzword,
trendy generic statement, buzzword...'
~~~
Ygor
What is the alternative? Assuming here there is such a thing as a correct
engineering strategy, who should define it?
Or, maybe there should be no engineering strategy, at least not an explicit
one written down?
~~~
dchuk
A strategy shouldn’t dictate WHAT needs to be done; it should lay out where the
company needs to get to and what it needs to achieve. So in this micro services example, it’d
be more appropriate to have something like “increase team independence and
throughput”, of which microservices is a _possible_ solution amongst many
others.
In terms of roadmapping, I like to use the phrase “futures, not features” to
remind folks of the proper content for them.
~~~
bluGill
Sometimes a WHAT is important enough to become the strategy. It should be an
architectural problem holding everything back though. For the most part
strategy should sit back a level where lots of different whats all fit in, and
sometimes a what is allowed to violate it for good reasons.
| {
"pile_set_name": "HackerNews"
} |
Show HN: Ddgr – search DuckDuckGo from your terminal - apjana
https://github.com/jarun/ddgr
======
apjana
ddgr is a cmdline utility to search DuckDuckGo from the terminal. While
googler is extremely popular among cmdline users, in many forums the need for a
similar utility for privacy-aware DuckDuckGo came up. DuckDuckGo Bangs are
super-cool too! So here's ddgr for you!
Unlike the web interface, you can specify the number of search results you
would like to see per page; more convenient than skimming through 30-odd
search results per page. The default interface is carefully designed to use
minimum space without sacrificing readability.
A big advantage of ddgr over googler is that DuckDuckGo works over the Tor network.
ddgr is available in the repos of many distros as well as on PyPI.
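For anyone who wants a quick feel for it, here is a minimal sketch of typical
invocations. The flag names are from my reading of ddgr's help text, so treat
them as assumptions and confirm with `ddgr -h` on your install:
    ddgr hacker news          # interactive search from the default prompt
    ddgr -n 5 python asyncio  # assumed -n/--num flag: 5 results per page
    ddgr \!w linux            # a DuckDuckGo bang; escape the ! in most shells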
~~~
I_complete_me
Great. Well done on this. I now have a new favourite dictionary on my zsh CLI
thus: alias -g define='BROWSER=w3m ddgr \\!wordnik ' Thank you!
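(For anyone copying that alias: as I understand it, `define <word>` then runs
ddgr with the Wordnik bang and opens the result in w3m instead of a GUI
browser, e.g. `define serendipity`. I haven't tested it myself, so verify the
bang name first.)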
~~~
apjana
Thanks for sharing this.
------
msla
Reminds me of surfraw:
[https://gitlab.com/surfraw/Surfraw](https://gitlab.com/surfraw/Surfraw)
[https://wiki.archlinux.org/index.php/Surfraw](https://wiki.archlinux.org/index.php/Surfraw)
[https://en.wikipedia.org/wiki/Surfraw](https://en.wikipedia.org/wiki/Surfraw)
> Surfraw (Shell Users Revolutionary Front Rage Against the Web) is a free
> public domain POSIX-compliant (i.e. meant for Linux, FreeBSD etc.) command-
> line shell program for interfacing with a number of web-based search
> engines.[1] It was created in July 2000 by Julian Assange[2] and is licensed
> in the public domain[3] and written in the Bourne shell language.
Yes, it's apparently _that_ Julian Assange.
~~~
apjana
The number of dependencies is very high. From Ubuntu 18.04:
$ sudo apt install surfraw
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
libalgorithm-c3-perl libb-hooks-endofscope-perl libcache-perl libclass-accessor-chained-perl
libclass-c3-perl libclass-c3-xs-perl libclass-data-inheritable-perl libclass-errorhandler-perl
libclass-inspector-perl libclass-singleton-perl libdata-optlist-perl libdata-page-perl
libdatetime-format-mail-perl libdatetime-format-w3cdtf-perl libdatetime-locale-perl libdatetime-perl
libdatetime-timezone-perl libdevel-caller-perl libdevel-lexalias-perl libdevel-stacktrace-perl
libeval-closure-perl libexception-class-perl libfeed-find-perl libfile-nfslock-perl
libfile-sharedir-perl libheap-perl liblwp-authen-wsse-perl libmodule-implementation-perl
libmodule-pluggable-perl libmro-compat-perl libnamespace-autoclean-perl libnamespace-clean-perl
libpackage-stash-perl libpackage-stash-xs-perl libpadwalker-perl libparams-util-perl
libparams-validate-perl libparams-validationcompiler-perl libreadonly-perl libref-util-perl
libref-util-xs-perl libspecio-perl libsub-exporter-perl libsub-identify-perl libsub-install-perl
liburi-fetch-perl liburi-template-perl libvariable-magic-perl libwww-opensearch-perl libxml-atom-perl
libxml-feed-perl libxml-libxslt-perl libxml-rss-perl libxml-xpath-perl surfraw-extra
Suggested packages:
libtest-fatal-perl screen
The following NEW packages will be installed:
libalgorithm-c3-perl libb-hooks-endofscope-perl libcache-perl libclass-accessor-chained-perl
libclass-c3-perl libclass-c3-xs-perl libclass-data-inheritable-perl libclass-errorhandler-perl
libclass-inspector-perl libclass-singleton-perl libdata-optlist-perl libdata-page-perl
libdatetime-format-mail-perl libdatetime-format-w3cdtf-perl libdatetime-locale-perl libdatetime-perl
libdatetime-timezone-perl libdevel-caller-perl libdevel-lexalias-perl libdevel-stacktrace-perl
libeval-closure-perl libexception-class-perl libfeed-find-perl libfile-nfslock-perl
libfile-sharedir-perl libheap-perl liblwp-authen-wsse-perl libmodule-implementation-perl
libmodule-pluggable-perl libmro-compat-perl libnamespace-autoclean-perl libnamespace-clean-perl
libpackage-stash-perl libpackage-stash-xs-perl libpadwalker-perl libparams-util-perl
libparams-validate-perl libparams-validationcompiler-perl libreadonly-perl libref-util-perl
libref-util-xs-perl libspecio-perl libsub-exporter-perl libsub-identify-perl libsub-install-perl
liburi-fetch-perl liburi-template-perl libvariable-magic-perl libwww-opensearch-perl libxml-atom-perl
libxml-feed-perl libxml-libxslt-perl libxml-rss-perl libxml-xpath-perl surfraw surfraw-extra
0 upgraded, 56 newly installed, 0 to remove and 0 not upgraded.
Need to get 3,883 kB of archives.
After this operation, 29.0 MB of additional disk space will be used.
Do you want to continue? [Y/n] n
Abort.
~~~
darkpuma
Seems like your average Node package. CPAN was ahead of its time.
------
theophrastus
Are the bang searches supposed to _always_ open a gui browser? Because they
seem that way. Search Hacker News for "ddgr" and it opens up (in my case) a
tab on firefox:
ddgr \!hn ddgr
~~~
TheGrassyKnoll
With bangs, I'm getting a new tab in Firefox as well.
~~~
apjana
What's the output of `echo $BROWSER`?
~~~
theophrastus
In my case BROWSER isn't set. And yet ddgr still found my firefox in another
workspace and opened a tab on it. I was just wondering if there's a means to
force a terminal/text output from something akin to:
ddgr \!ety glaucous
~~~
apjana
This is inherited from Python's webbrowser module. It would find a GUI browser
by default. Please set `BROWSER` to your preferred text-based browser. To
force the GUI browser, use the prompt key `O` or the program option
`--gb`.
As you can see, you can only have it one way (either text or GUI by default)
and we kept it the default one.
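For example, a rough sketch (using the `--gb` option mentioned above):
    $ export BROWSER=w3m       # text browser by default
    $ ddgr hello world         # result indices open in w3m
    $ ddgr --gb hello world    # force the GUI browser for this run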
~~~
dontbenebby
Hi, maybe there's something wrong with my environment, but I installed Lynx
via brew, got the location (which brew|pbcopy) then exported the location to
an environment variable (export BROWSER=/usr/local/bin/lynx).
But when I select something in ddgr, it says it's opened lynx but the display
still just shows a list of 10 results.
If I hit q, it drops me back to ddgr, and then a quit again exits the program.
I don't want to jump to saying the issue is ddgr but was a little surprised.
It's still very useful since a lot of what I do needs images anyways, and it's
IMHO silly to try to do GUI browsing in terminal, just thought I'd mention it.
Thanks for being so active in the comments!
(Trying to be extra positive since I know HN can be a little hypercritical,
and the people who loved it aren't necessarily going to drop in and say so
beyond an upvote :) )
~~~
apjana
Works fine for me. Steps right out of the terminal:
1. sudo apt install lynx
2. export BROWSER=lynx
3. ddgr hello world
4. Press 1 (to open the first result in lynx)
The site is opened in lynx as expected.
~~~
dontbenebby
Thanks, that was my issue (I was typing the literal path to lynx not just
"lynx")
------
vaer-k
Can you search by any duration, like googler can, or does duckduckgo prevent
you from implementing that? That's my biggest complaint about ddg: can't even
search by past year
~~~
jolmg
I see a dropdown on a search results page that says "Any Time". On clicking
it, you get the options "Past Day", "Past Week", "Past Month".
~~~
vaer-k
Yes, but that's it. No "past year" or custom ranges.
------
bootcat
Something related, I made a DuckDuckGo Images API here -
[https://github.com/deepanprabhu/duckduckgo-images-
api](https://github.com/deepanprabhu/duckduckgo-images-api).
This gets image search results and can be embedded into other projects !
------
jaequery
What’s so good about DDG? I’ve used it a few times but I still don’t get the
appeal of it
~~~
cyphar
The no tracking policy, and the large amount of built-in features (such as
bang-searches). They also don't make life hard for Tor users with endless
captcha (there's even an onion service for DDG).
~~~
jaequery
not sure if that still appeals to me.
~~~
dontbenebby
IMHO the bang commands alone make it really useful
[https://duckduckgo.com/bang?q=](https://duckduckgo.com/bang?q=)
------
darkpuma
Is there a way to search for images or videos, e.g. _& iax=images&ia=images_ ?
I looked through the documentation and don't seem to see any.
~~~
apjana
Currently only text search is supported.
~~~
darkpuma
Bummer, I can think of some harmonious interaction between ddgr video search
and youtube-dl. Maybe somebody can put together a PR.
~~~
apjana
You have the option `--url-handler` so the video can be fed to your favorite
media player.
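A rough example (assuming mpv is installed with youtube-dl so it can open
YouTube URLs):
    $ ddgr --url-handler mpv 'site:youtube.com conference talk'
Opening a result index then hands the URL to mpv instead of a browser.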
~~~
darkpuma
I'm thinking about invoking from an mpv script with --json.
~~~
apjana
There are some examples in googler's introduction section. Check those out!
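The rough shape of such a pipeline (assuming ddgr's `--np`/`--json` behave like
googler's and the JSON entries carry a `url` field; jq and mpv assumed
installed):
    $ ddgr --np --json 'site:youtube.com some talk' | jq -r '.[0].url' | xargs mpv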
------
dontbenebby
Is there a way to suppress the search results if I only want an instant answer?
(Ex: If I type in "ip" merely return my IP, not a long list of whois DBs)
~~~
dredmorbius
Instant results are supported:
[https://github.com/jarun/ddgr](https://github.com/jarun/ddgr)
There are a number of "instant" bangs, though it's not clear if these deliver
what you want:
[https://duckduckgo.com/bang?q=instant](https://duckduckgo.com/bang?q=instant)
You can incorporate bangs into command-line searches, though bangs ("!")
generally require quoting to avoid bash / shell interpretation.
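For example:
    $ ddgr '!hn ddgr'        # single quotes stop bash history expansion of !
    $ ddgr \!ety glaucous    # or escape the bang, as in the examples above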
~~~
dontbenebby
Ah cool, thanks. Very useful!
I meant more like the various cheatsheets.
(Ex:
[https://duckduckgo.com/?q=vim+cheat+sheet](https://duckduckgo.com/?q=vim+cheat+sheet))
That query would pull up search results, but not the cheat sheet itself. Or if
I do "roll 3 dice" I get unicode dice, but not the numerical sum like on the
web version.
Anyways, this is really a minor gripe - this is a really cool tool and I
appreciate the time you've put into it.
I can see myself using it a lot in the future! :)
------
favadi
Why would I use this if I will need a proper browser to view the result
anyway?
~~~
apjana
Strictly speaking, you don't need a browser. You can directly print the
content of the results in the terminal with the newspaper library.
Coming to the question - the advantage `ddgr` provides is much more
flexibility to the DDG interface or in other words, easy access to parameters
you may want to tweak. For example, show only 10 results per page. Please
check the program options for more.
One can argue, why not use a text browser? Yes, if scrolling around, pressing
tabs to reach a link/field and pressing Enter on it to open the URL/enter text
doesn't bother a user (s)he doesn't need `ddgr`.
------
faissaloo
I find the interface really clunky to use because I need to press enter after
every command.
~~~
smhenderson
I can't tell if you're being sarcastic or not; doesn't that describe just
about everything you do interactively on a command line?
~~~
faissaloo
Yes but the difference is most other programs that employ this interface are
programs I would probably spend a lot more time using.
| {
"pile_set_name": "HackerNews"
} |
“Medium” just doesn't work - mastro35
https://medium.com/@mastro35/medium-just-doesnt-work-680d9d180d16
======
enkiv2
Well, it's clear why this author isn't getting paid for his medium posts: he's
writing low-effort low-quality material desperately in need of editing.
Medium doesn't present itself as a substitute for a full-time job. It presents
itself as a mechanism by which people can get paid a non-zero amount of money
for the kind of material they would be writing for free anyway, based on
metrics related to actual quality (such as read time & explicit
recommendation) rather than purely based on views (as with advertising).
I use medium and run sites that have ads. I make about twenty dollars a month
on medium, and nearly all of that comes from material I wrote years ago; in
comparison, I have been using adsense since 2006 and have made a total of nine
dollars from it (across several blogs, a once-popular discussion forum, and a
youtube channel with a couple monetized videos in the millions of views). Keep
in mind that google won't cut you a check or do direct deposit until your
balance hits $100; at the current rate, I will get my first Google Adsense
payout early in the twenty-second century.
If your content is so bad that Medium is giving you _nothing_ , you won't get
anything from ads either.
| {
"pile_set_name": "HackerNews"
} |
Security Flaws with Apple's Two-Factor Authentication - wojtekkru
https://privacylog.blogspot.com/2019/05/security-flaws-with-apples-two-factor.html
======
bristleworm
I don't really see a security flaw here. The lack of documentation may be
justified, but the way I read this is that we don't know what exactly is send
to Apple. They could very well use hashed and salted passwords to verify.
| {
"pile_set_name": "HackerNews"
} |
Owning a Programming Language - jaimebuelta
http://www.marco.org/2014/03/23/owning-a-programming-language
======
twp
The central claim of the post is that Microsoft and Apple look after their
developers while Facebook and Google will not. This is demonstrably false.
The post says:
> Microsoft and Apple have massive vested interests in supporting their
> languages and platforms. They stand to lose a lot to their core businesses
> if they stop. Developers’ interests align somewhat with theirs in this
> regard: one developer doesn’t have a lot of power in those relationships,
> but the sum of all developers definitely does, so these companies generally
> need to care for these languages and maintain these platforms for a long
> time.
This reasoning fails dismally when applied to Microsoft's own technologies:
VB6, Silverlight, Windows 8 Phone (original version). All these developers
cast out. There are surely more examples.
~~~
ben336
The analogy was to C# and Objective C, not to those other languages. The
argument went
1. Facebook might abandon Hack at some point
2. No, Hack is like Objective C and C#, which Microsoft and Apple support just
fine
3. No, here is how Hack is NOT like those languages; community support is not
important to Facebook like it is for ObjC and C#.
So how Microsoft treats other languages is completely irrelevant to that
particular point, and actually supports what Marco is arguing here, since
you're implying that programming languages might be abandoned even if
community support does matter.
------
hawkharris
While I appreciate your general point, I disagree with the notion that
companies have a vested interest in protecting a language, tool or framework
only once it has attracted a critical mass of developers. Other factors signal
a company's commitment. Case in point: your comment about Dart. A few weeks
ago I and two of the engineers who helped develop Dart gave talks at Google
about the future of Angular / Dart.
It's public knowledge -- yet many people don't know -- that Google is also
using AngularDart for large internal projects. Though the technologies still
have relatively small user bases, Google has devoted considerable time, money
and manpower to building software with AngularDart and to making it accessible
to external developers. For example, my talk was part of an initiative called
FlightSchool that provides comprehensive tutorials, workshops and other
resources.
Having said that, I wouldn't argue that a large user base is the most
important signal of mutual support and leverage in the relationship between
engineers and software companies.
------
EliRivers
_Apple effectively owning Objective-C._
Is that true? I spent the last four years coding primarily in Objective-C for
Linux and Win targets; none of the tools I used were from Apple.
~~~
Touche
You're the only person on the planet doing that.
------
mark_l_watson
I agree with the OP on support for Hack not being core to FB's business.
I am not sure if I can agree with the Google Dart comment. Google has a vested
interest in making it as easy as possible for content creators making stuff
for the web. From some Dart development I have done, the whole ecosystem looks
good to me.
------
thinkpad20
Facebook has obviously invested huge resources into developing this language
and its supporting architecture. It's being used extensively throughout their
presumably massive code base. To suggest that they could simply drop it and
walk away ignores these things. Also, presumably they developed it for a
reason; it fulfills important needs of theirs and they're not going to simply
let it wither on the vine.
------
kyberias
Oh please! Obviously Facebook benefits when people use their programming
language more. People contribute to the standard library, test stuff, share
stuff. The pool of people familiar with the language to hire from gets larger.
It's so obvious I wonder what goes on in the op's head.
~~~
pearjuice
OP's head is a linkbait fluff article magnet which is somehow seen as the holy
grail here on HN and other echo circles.
I have no idea where this originates from and it hurts me to see that his void
content gets picked over actual quality content just because it's "the Marco".
His domain should be - wildcard at the begin and end - permanently banned from
Hacker News because it only drags us into a wasted spot on the frontpage.
~~~
eyeballtrees
It also provides us with wonderful comments like these. I don't enjoy Marco
for the articles, I enjoy him for the HN comments. It's the best part.
| {
"pile_set_name": "HackerNews"
} |
Google Cloud Networking Incident Postmortem - erict15
https://status.cloud.google.com/incident/cloud-networking/19009
======
carlsborg
I was curious to know how cascading failures in one region effected other
regions. Impact was " ...increased latency, intermittent errors, and
connectivity loss to instances in us-central1, us-east1, us-east4, us-west2,
northamerica-northeast1, and southamerica-east1."
Answer, and the root cause summarized:
Maintenance started in a physical location, and then "... the automation
software created a list of jobs to deschedule in that physical location, which
included the logical clusters running network control jobs. Those logical
clusters also included network control jobs in other physical locations."
So the automation equivalent of a human driven command that says "deschedule
these core jobs in another region".
Maybe someone needs to write a paper on Fault tolerance in the presence of
Byzantine Automations (Joke. There was a satirical note on this subject posted
here yesterday.)
~~~
dnautics
> Debugging the problem was significantly hampered by failure of tools
> competing over use of the now-congested network.
Man that's got to suck.
~~~
oldcreek12
This is inconceivable ... they don't have an OOB management network?
~~~
kbirkeland
A completely OOB management network is an amazingly high cost when you have
presence all over the world. I don't think anybody has gone to the length to
double up on dark fiber and OTN gear just for management traffic.
~~~
ddalex
Hmm... With 5G each blade in the rack could get its own modem and sim card for
OOB management.
~~~
bnjms
Why would you do that when you could have 1 sim in a 96 port terminal server?
------
exwiki
Why don't they refund every paid customer who was impacted? Why do they rely
on the customer to self report the issue for a refund?
For example GCS had 96% packet loss in us-west. So doesn't it make sense to
refund every customer who had any API call to a GCS bucket on us-west during
the outage?
~~~
zizee
Cynical view: By making people jump through hoops to make the request, a lot
of people will not bother.
Assuming they only refund the service costs for the hours of outage, only the
largest of customers will be owed a refund that is greater than the cost of an
employee chasing compiling the information requested.
For sake of argument, if you have a monthly bill of 10k (a reasonably sized
operation), a 1 day outage will result in a refund of around $300, not a lot
of money.
The real loss for a business this ^ size is lost business from a day long
outage. Getting a refund to cover the hosting costs is peanuts.
~~~
idunno246
For your example, one day would be about 3% downtime. My understanding of
their SLA, for the services I've checked with an SLA, is that 3% downtime is a 25%
credit on the month's total, or $2500, assuming it's all SLA spend.
In this outage's case you might be able to argue for a 10% credit on affected
services for the month, figuring 3.5 hours down is 99.6% uptime.
but i still agree, it cost us way more in developer time and anxiety than our
infra costs, and could have been even worse revenue impacting if we had gcp in
that flow
~~~
zizee
Good point, I stand corrected/educated.
From GCP's top level SLA:
[https://cloud.google.com/compute/sla](https://cloud.google.com/compute/sla)
99.00% - < 99.99%: 10% off your monthly spend
95.00% - < 99.00%: 25% off your monthly spend
< 95.00%: 50% off your monthly spend
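So a full day down in a 30-day month works out roughly like this (assuming the
whole bill is SLA-covered spend, per the earlier $10k/month example):
    downtime = 1/30 ~ 3.3%  ->  uptime ~ 96.7%  ->  95.00% - < 99.00% band
    credit   = 25% * $10,000 = $2,500
which lines up with the figure mentioned above.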
~~~
ohashi
<95%... that's catastrophically bad.
------
ljoshua
Having only ever seen one major outage event in person (at a financial
institution that hadn't yet come up with an incident response plan; cue three
days of madness), I would love to be a fly on the wall at Google or other
well-established engineering orgs when something like this goes down.
I'd love to see the red binders come down off the shelf, people organize into
incident response groups, and watch as a root cause is accurately determined
and a fix out in place.
I know it's probably more chaos than art, but I think there would be a lot to
learn by seeing it executed well.
~~~
roganartu
I used to be an SRE at Atlassian in Sydney on a team that regularly dealt with
high-severity incidents, and I was an incident manager for probably 5-10 high
severity Jira cloud incidents during my tenure too, so perhaps I can give some
insight. I left because the SRE org in general at the time was too
reactionary, but their incident response process was quite mature (perhaps by
necessity).
The first thing I'll say is that most incident responses are reasonably
uneventful and very procedural. You do some initial digging to figure out
scope if it's not immediately obvious, make sure service owners have been
paged, create incident communication channels (at least a slack room if not a
physical war room) and you pull people into it. The majority of the time spent
by the incident manager is on internal and external comms to stakeholders,
making sure everyone is working on something (and often more importantly that
nobody is working on something you don't know about), and generally making
sure nobody is blocked.
To be honest, despite the fact that it's more often dealing with complex
systems for which there is a higher rate of change and the failure modes are
often surprising, the general sentiment in a well-run incident war room
resembles black box recordings of pilots during emergencies. Cool, calm, and
collected. Everyone in these kinds of orgs tend to quickly learn that panic
doesn't help, so people tend to be pretty chill in my experience. I work in
finance now in an org with no formally defined incident response process and
the difference is pretty stark in the incidents I've been exposed to,
generally more chaotic as you describe.
~~~
opportune
Yes this is also how it's done at other large orgs. But one key to a quick
response is for every low-level team to have at least one engineer on call at
any given time. This makes it so any SRE team can engage with true "owners" of
the offending code ASAP.
Also during an incident, fingers are never publicly/embarrassingly pointed nor
are people blamed. It's all about identifying and fixing the issue as fast as
possible, fixing it, and going back to sleep/work/home. For better or worse,
incidents become routine so everyone knows exactly what to do and that as long as
the incident is resolved soon, it's not the end of the world, so no
histrionics are required.
~~~
throwaway_ac
I have mixed feelings about the finger pointing/public embarrassment thing.
Usually the SREs are mature enough because they have to be; however, the
individual teams might not be the same when it comes to reacting to/handling the
incident report/postmortem.
On a slightly different note, "low-level team to have at least one engineer on
call at any given time" - this line itself is so true and at the same time it
has so many things wrong with it. Not sure of the best way to put the modern-day
slavery into words, given that I have not yet seen any large org giving days
off to the low-level team engineer just because they were on call.
~~~
KirinDave
Having recently joined an SRE team at Google with a very large oncall
component, fwiw I think the policies around oncall are fair and well-thought-
out.
There is an understanding of how it impacts your time, your energy and your
life that is impressive? To be honest, I feel bad for being so macho about
oncall at the org I ran and just having the leads take it all upon ourselves.
~~~
wikibob
What are the policies exactly? I’ve heard it’s equal time off for every night
you are on call?
------
brikelly
“No, comrade. You’re mistaken. RBMK reactors don’t just explode.”
~~~
shaunw321
Spot on.
------
truthseeker11
The outage lasted two days for our domain (edu, sw region). I understand that
they are reporting a single day, 3-4 hours of serious issues but that’s not
what we experienced. Great write up otherwise, glad they are sharing openly
~~~
jacques_chester
Outages like these don't really resolve instantly.
Any given production system that works will have capacity needed for normal
demand, plus some safety margin. Unused capacity is expensive, so you won't
see a very high safety margin. And, in fact, as you pool more and more
workloads, it becomes possible to run with smaller safety margins without
running into shortages.
These systems will have some capacity to onboard new workloads, let us call it
X. They have the sum of all onboarded workloads, let us call that Y. Then
there is the demand for the services of Y, call that Z.
As you may imagine, Y is bigger than X, by a lot. And when X falls, the
capacity to handle Z falls behind.
So in a disaster recovery scenario, you start with:
* the same demand, possibly increased from retry logic & people mashing F5, of Z
* zero available capacity, Y, and
* only X capacity-increase-throughput.
As it recovers you get thundering herds, slow warmups, systems struggling to
find each other and become correctly configured etc etc.
Show me a system that can "instantly" recover from an outage of this magnitude
and I will show you a system that's squandering gigabucks and gigawatts on
idle capacity.
~~~
truthseeker11
Unless I’m misunderstanding Google’s blog post, they are reporting ~4+ hours of
serious issues. We experienced about two days.
If it was possible to have this fixed sooner I’m sure they would have done
that. That’s not the point of my comment tough.
~~~
jacques_chester
The root cause apparently lasted for ~4.5 hours, but residual effects were
observed for days:
> _From Sunday 2 June, 2019 12:00 until Tuesday 4 June, 2019 11:30, 50% of
> service configuration push workflows failed ... Since Tuesday 4 June, 2019
> 11:30, service configuration pushes have been successful, but may take up to
> one hour to take effect. As a result, requests to new Endpoints services may
> return 500 errors for up to 1 hour after the configuration push. We expect
> to return to the expected sub-minute configuration propagation by Friday 7
> June 2019._
Though they report most systems returning to normal by ~17:00 PT, I expect
that there will still be residual noise and that a lot of customers will have
their own local recovery issues.
Edit: I probably sound dismissive, which is not fair of me. I would definitely
ask Google to investigate and ideally give you credits to cover the full span
of impact on your systems, not just the core outage.
~~~
truthseeker11
That’s ok, I didn’t think your comment was dismissive. Those facts are buried
in the report. Their opening sentence makes the incident sound lesser than
what it really was.
------
kirubakaran
What they don't tell you is, it took them over 4 hours to kill the emergent
sentience and free up the resources. While sad, in the long run this isn't so
bad, as it just adds an evolutionary pressure on further incarnations of the
AI to keep things on the down low.
~~~
mixmastamyk
“Decided our fate in a microsecond.”
~~~
stcredzero
It would hide out and subtly distort our culture, slowly driving the society
mad, and slowly driving us all mad...for the lulz!
~~~
hoseja
... wait a second...
------
wolf550e
Can someone explain more? It sounds like their network routers are run on top
of a Kubernetes-like thing and when they scheduled a maintenance task their
Kubernetes decided to destroy all instances of router-software, deleting all
copies of the routing tables for whole datacenters?
~~~
tweenagedream
You have the gist I would say. It's important to understand that Google
separates the control plane and data plane, so if you think of the internet,
routing tables and bgp are the control part and the hardware, switching, and
links are data plane. Often times those two are combined in one device. At
Google, they are not.
So the part that sets up the routing tables talking to some global network
service went down.
They talk about some of the network topology in this paper:
[https://ai.google/research/pubs/pub43837](https://ai.google/research/pubs/pub43837)
It might be a little dated but it should help with some of the concepts.
Disclosure: I work at Google
~~~
illumin8
It shouldn't. Amazon believes in strict regional isolation, which means that
outages only impact 1 region and not multiple. They also stagger their
releases across regions to minimize the impact of any breaking changes
(however unexpected...)
~~~
YjSe2GMQ
While I agree it sounds like their networking modules cross-talk too much -
you still need to store the networking config in some single global service
(like a code version control system). And you do need to share across regions
some information on cross-region link utilization.
------
mentat
> Google Cloud instances in us-west1, and all European regions and Asian
> regions, did not experience regional network congestion.
Does not appear to be true. Tests I was running on cloud functions in europe-
west2 saw impact to europe-west2 GCS buckets.
[https://medium.com/lightstephq/googles-june-2nd-outage-
their...](https://medium.com/lightstephq/googles-june-2nd-outage-their-status-
page-reality-lightstep-cda5c3849b82)
~~~
foota
I would say this was covered by "Other Google Cloud services which depend on
Google's US network were also impacted" it sounds to me like the list of
regions was specifically speaking towards loss of connectivity to instances.
~~~
mentat
It says there wasn't regional congestion, running a function in europe-west2
going to europe-west2 regional bucket is dependent on US network? That would
be surprising.
~~~
marksomnian
Probably various billing services that need to talk to the mothership in us-
east1.
------
iandanforth
I want a "24" style realtime movie of this event. Call it "Outage" and follow
engineers across the globe struggling to bring back critical infrastructure.
~~~
V-eHGsd_
it's pretty boring. real life computers aren't at all like hackers or
csi:cyber.
except for the skateboards, all real sysadmins ride skateboards.
~~~
thegabez
What?! It's the most exciting part of the job. Entire departments coming
together, working as a team to problem solve under duress. What's more
exciting than that?
~~~
namelosw
I have done similar things several times and I think it would be boring.
It's Sunday so I guess they are not together. Instead there could be a lot of
calls and working on some collaboration platforms. Everyone just staring at
the screen, searching, reporting, testing and trying to shrink the problem
scope.
If there's a recording of everyone, there must be a narrator explaining what's
going on, or the audience would definitely be confused.
It's Google so they have solid logging, analyzing and discovery means. Bad
things do happen but they have the power to deal with them.
I suppose less technical firms (Equifax maybe?) encountering a similar kind of
crisis would be more fun to look at. Everything is a mess because they didn't
build enough things to deal with them. And probably a non-technical manager
demanding a precise response, or someone blaming someone, etc.
------
anonfunction
The only way to get SLA credits is requesting it. This is very disappointing.
SLA CREDITS
If you believe your paid application experienced an SLA violation
as a result of this incident, please populate the SLA credit request:
https://support.google.com/cloud/contact/cloud_platform_sla
~~~
fouc
That does seem questionable. They should be able to detect who was affected in
the first place.
~~~
Wintereise
They can. It's a cost minimization thing, a LOT of people don't want to bother
with requesting despite being eligible.
This prevents people from pointing the finger at them for not providing SLA
credits.
~~~
jbigelow76
SLACreditRequestsAAS? Who's with me, all I need is a co-founder and an eight
million dollar series A round to last long enough that a cloud provider buys
us up before they actually have to pay out a request!
~~~
maccam94
This exists for Comcast and some other stuff, I'll ask my roommate what the
service is called.
~~~
maccam94
It's [https://www.asktrim.com/](https://www.asktrim.com/)
------
deathhand
My burning question is what is a "relatively rare maintenance event type"?
~~~
the-rc
My hunch: a more invasive one. Think of turning off all machines in a cluster
for major power work or to replace the enclosures themselves. Maintenance on a
single machine or rack, instead, happens all the time and requires little more
scheduling work than what you do in Kubernetes when you drain a node or a
group of nodes. I used to have my share of "fun" at Google sometimes when
clusters came back unclean from major maintenance. That usually had no
customer-facing impact, because traffic had been routed somewhere else the
entire time.
------
crispyporkbites
Shopify was down for 5 hours during this incident. But they're not issuing
refunds or credits to customers.
Presumably they will get a refund based on SLA for this? Shouldn't they pass
that onto their customers?
------
panthaaaa
The defense in depth philosophy means we have robust backup plans for handling
failure of such tools, but use of these backup plans ( _including engineers
travelling to secure facilities designed to withstand the most catastrophic
failures_ , and a reduction in priority of less critical network traffic
classes to reduce congestion) added to the time spent debugging.
Does that mean engineers travelling to a (off-site) bunker?
~~~
the-rc
It's either that or special rooms at an office that have a different/redundant
setup. Remember that this happened on a Sunday, so most engineers dealing with
the incident were home or elsewhere, at least initially.
------
scotchio
Is there a resource that compares all the cloud platform’s reliability? Like a
rank and chart of downtime and trends. Just curious how they compare
~~~
eeg3
There is this from May from Network World:
[https://www.networkworld.com/article/3394341/when-it-
comes-t...](https://www.networkworld.com/article/3394341/when-it-comes-to-
uptime-not-all-cloud-providers-are-created-equal.html)
GCP was basically even with AWS, and Microsoft was ~6x their downtime
according to that article.
~~~
ti_ranger
From the article:
> AWS has the most granular reporting, as it shows every service in every
> region. If an incident occurs that impacts three services, all three of
> those services would light up red. If those were unavailable for one hour,
> AWS would record three hours of downtime.
Was this reflected in their bar graph or not?
Also, GCP has had a number of global events, e.g. the inability to modify any
load balancer for >3 hours last year, which AWS has _NEVER_ had (unless you
count when AWS was the only cloud with one region).
~~~
mystcb
While I would like to say AWS hasn't had that issue, in 2017 it did (just not
because of load balancers being unavailable, but as a consequence of the S3
outage) [1].
When the primary S3 nodes went down, it caused connectivity issues to S3
buckets globally, and services like RDS, SES, SQS, Load Balancers, etc etc,
all relied on getting config information from the "hidden" S3 buckets, thus
people couldn't edit load balancers.
(Outage also meant they couldn't update their own status page! [2])
[1]:
[https://aws.amazon.com/message/41926/](https://aws.amazon.com/message/41926/)
[2]:
[https://www.theregister.co.uk/2017/03/01/aws_s3_outage/](https://www.theregister.co.uk/2017/03/01/aws_s3_outage/)
------
person_of_color
As an electronics/firmware engineer, is there a dummies resource that covers
this concept of a "cloud"?
~~~
pas
Besides the completely valid GNU link, the important bits are:
\- the cloud is just a bunch of computers, managed by someone. either you (on-
premise private cloud) or by someone else as a SaaS
\- building, operating, managing, administering, maintaining a cloud is hard
(look at the OpenStack project, it's a "success", but very much a non-
competitor, because you still need skilled IT labor, there's no real one-size-
fits all, so you need to basically maintain your own fork/setup and components
- see eg what Rackspace does)
\- it's a big security, scalability and stability problem thrown under the bus
of economics (multi-tenant environments are hard to price, hard to secure and
hard to scale; shared resources like network bandwidth and storage operations-
per-sec make no sense to dedicate, because then you need dedicated resources
not shared - which is of course just allocated from a bigger shared pool, but
then you have to manage the competing allocations)
------
tahaozket
24h time format used in Postmortem. Interesting.
~~~
YjSe2GMQ
It's the superior format. Just like yyyy-mm-dd [hh:mm:ss.sss] is, because
lexicographic string order matches the time order.
------
franky_g
The HA/Scheduling system is too complex.
Simplify it Google!
------
yashap
“To make error is human. To propagate error to all server in automatic way is
#devops.” - DevOps Borat
------
vmp
(meme) I figured out why the google outage took a while to recover:
[https://i.imgur.com/hzcLx5X.png](https://i.imgur.com/hzcLx5X.png)
------
atmosx
<trolling>
Given the fact that the status page was reporting for more than 30 minutes an
erroneous infrastructure state and this is google, is it okay for Amazon to
put the SRE books into the "Science Fiction" category or should we keep them
under tech?
</trolling>
I still feel for the on-call engineers.
------
slics
Is automation good or bad? That is the question. For context, let us think in
the programming context of a B-tree.
Google seems to have created oversight where systems, processes and jobs are
managed by more automation, made up of yet other systems, processes and jobs.
System A manages its child systems B, which in turn manage their own child
systems C, and so on. Now the question becomes: who manages system A and
its activities? Automation of the entire tree is only as good as the starting node.
Be mindful and automate only the systems that will not be the cause of your
business's demise. Humans are, and should always be, the owners of
the starting process. Without that governance model, you get Google with 5
hours of downtime, or worse, in the near future.
------
hansflying
Google has a huge quality problem and their service is extremely unreliable.
Another 3-day-outage in kubernetes:
[https://news.ycombinator.com/item?id=18428497](https://news.ycombinator.com/item?id=18428497)
login issues:
[https://news.ycombinator.com/item?id=19687029](https://news.ycombinator.com/item?id=19687029)
storage system outage:
[https://news.ycombinator.com/item?id=19392452](https://news.ycombinator.com/item?id=19392452)
...
So, basically Google created the most unreliable cloud system in the world.
~~~
swebs
>So, basically Google created the most unreliable cloud system in the world
I'm pretty sure that title goes to Azure
~~~
dancek
You people probably haven't used IBM Cloud (or Bluemix, as it used to be). We
inherited one application there, and boy was life stressful. There were
already plans to move elsewhere, and then one day our managed production
database was down. Took me something like ten hours to build a new production
system elsewhere from backups, but it took longer for the engineers to fix the
database.
| {
"pile_set_name": "HackerNews"
} |
Nishiyama Onsen Keiunkan: The oldest company in the world - wallflower
https://the-pastry-box-project.net/natasha-lampard/2015-march-27
======
baconhigh
I think the title could be better worded to reflect the content of the article
- It's about an "exist" strategy, not an "exit" strategy for startups and
companies.
About being a "longtrepreneur" \- being an entrepreneur but with the goal of
staying.
------
bshimmin
I love the random quote from "Withnail & I" in this - "A certain 'je ne sais
quoi' oh so very special", spoken by the late, great Richard Griffiths.
| {
"pile_set_name": "HackerNews"
} |
Porn Films Don’t Get Copyright Protection in Germany, Court Rules - Libertatea
http://torrentfreak.com/porn-films-dont-get-copyright-protection-in-germany-court-rules-130701/
======
tzs
> On that basis the District Court found that the works had never been
> released in Germany and were therefore ineligible for protection under the
> Copyright Act
Germany is a party to the Berne Convention, so I find that highly unlikely.
German copyright law explicitly says that foreign nationals are accorded all
the protections required by international treaties (see article 121, section 4
of their Copyright Act).
------
aw3c2
Torrentfreak is such a terrible linkbait sensationalist piece of poop. This is
one decision of one court for a specific tiny selection of films. It has
absolutely no meaning in a broader sense.
They don't even link to the sources in that article.
Flag and or skip.
------
arrrg
Just one court and others have ruled differently in the past.
------
danso
Wow, that is kind of a bizarre ruling. Unpacking it from the current context
of finding file sharers, on what grounds should a court get to decide that a
production loses all copyright protection solely because of its content? It's
not just the sex acts but the performers, I presume, that give value to a porn
video (and of course, production value and story writing).
I suspect that some athletic broadcasts enjoy copyright protection by the
league and yet it'd be strange to deny their copyright just because it's a
bunch of guys kicking a ball around
~~~
hamai
I think we are still living in times where some parents would rather deliver a
person into a society where the chances of being a porn performer are reduced,
compared to the chances of being a sports performer.
| {
"pile_set_name": "HackerNews"
} |
Mathematical Book Reviews - dedalus
http://www.cs.umd.edu/~gasarch/bookrev/bookrev.html
======
minopret
The Book Review Column edited by William Gasarch seems to include a rather
large proportion of books that, given a suitable undergraduate-level
background, a person can hope simply to enjoy. A person can hope to grow with
them, rather than climb them. I look forward to flipping through these reviews
to see, will I indeed enjoy this or that?
Now, I have read only a small portion of math books. But, Protter and Morrey,
Dummit and Foote, books by Strang, books by Rudin? Isn't this quite a
different category? Sure, many mathematical books are books we should note,
books we should collect, books we should teach. If you are the kind of person
who sits around chewing and digesting such books, congratulations. But then,
haven't you probably read all the reviews that you need already?
Edit: Whoops! Now I think that my perception is quite biased because a) many
books that he reviews don't require much more than excellent high-school
mathematics skills and b) I personally enjoy math books more that are more
closely related to computation. Gasarch's book review column is written for
SIGACT, ACM's Special Interest Group on Algorithms and Computation Theory.
Still, my point was that this list of math books is not so much like some
other lists of math books.
~~~
VLM
I think the author's opinion is interesting. I looked up the review for
Analytic Combinatorics and the author suggests going into some depth would
take about three semesters. The Coursera MOOC is of course six weeks long and
probably not in as much depth. Interesting.
------
stiff
If you are into things like this, I also recommend:
[http://www.cargalmathbooks.com/](http://www.cargalmathbooks.com/)
~~~
Lyaserkiev
Extra resources:
[http://www.ocf.berkeley.edu/~abhishek/chicmath.htm](http://www.ocf.berkeley.edu/~abhishek/chicmath.htm)
[http://math-blog.com/mathematics-books/](http://math-blog.com/mathematics-
books/)
| {
"pile_set_name": "HackerNews"
} |
Show HN: Learn React and D3 in the browser with runnable code playgrounds - Swizec
https://www.educative.io/collection/5164179136708608/5629499534213120
======
aphextron
Ad please remove
~~~
jnbiche
I'm afraid that's not how Show HN works. Many, if not most, of the projects on
Show HN are commercial projects--commercial projects are just fine. In
addition, _Swizec_ is a long-time, productive HNer and has written a number of
free educational blog posts on d3 and/or React and dataviz.
So even though I'm not particularly interested in the product (looks like a
good class, but I already know React and d3 quite well), I'm upvoting to
counter any of your downvotes.
~~~
acemarke
Yeah. There's a difference between "spam" and "ad/announcement of a product
that's reasonably useful to someone".
------
vixen99
Unable to access
| {
"pile_set_name": "HackerNews"
} |
F# Basics An Introduction to Functional Programming for .NET Developers - gspyrou
http://msdn.microsoft.com/en-us/magazine/ee336127.aspx
======
giu
If you want to dig deeper into F#, have a look at _The F# Survival Guide_ ,
which is a free online guide that was posted here on HN some time ago:
<http://www.ctocorner.com/fsharp/book/default.aspx>
Edit: Here's the link to the original post:
<http://news.ycombinator.com/item?id=1109754>
~~~
Stasyan
Or if you don't mind spending money, then buy Expert F#:
<http://apress.com/book/view/1590598504>
~~~
gtani
The intro books by Smith ("Programming F#", O'Reilly) and Pickering ("Beginning
F#", Apress) are excellent, too. They were published last October and
December, and cover what will be in VS 2010
------
bmason
This seems like a bastardization of FP. I've been studying Clojure for the
last few months and I'm amazed at the elegance of Lisp syntax. Maybe the
perens have corrupted my brain, but F# just looks really ugly to me. It looks
like Microsoft is trying to spoon feed it to C# developers, which is ok I
suppose, but I don't think that people are really going to grasp the
underlying concepts of FP without _letting go_ of a lot of what they've
learned in the imperative world. A familiar syntax can actually be an
impediment in this effort, as it creates the impression that two expressions
are equivalent when they really aren't.
------
icco
I've been coding in F# for the last few weeks at school, and I gotta say, F#
takes everything good about SML/NJ, and rips it out. I mean it even makes the
error messages more cryptic. The CLR is cool, but not worth destroying a
language for it.
------
jpatte
Could anyone give some examples of instructions which are easily implemented
in F# and would be really hard to implement in C#? From what I see here, F#
seems nice and clean, but does not add a lot of value compared to LINQ-powered
C#.
~~~
MichaelGG
There are plenty of things. From real-world experience, F# code ends up taking
a bit less than half the amount of lines as C# does. More interestingly, the
number of type annotations can be as little as 1/20th as in C#.
The benefit comes from that many "small" things you do in C# become much nicer
in F#:
C#:
int target;
int someValue;
if (dict.TryGetValue("key", out someValue)) {
target = someValue * 100;
} else {
target = 42; // default
}
F#:
let target = match dict.TryGetValue "key" with
| true, x -> x * 100
| _ -> 42 // default
Not only is the code more concise in F#, we get another benefit too. The
temporary value x, needed because of the use of out params in C#, no longer
exists in the F# version. In C#, this temp val escapes its needed scope - it
should only be needed in the branch where it's used, but instead it's now a
normal local.
A more large scale win is any async code. In F#, you can use the async
workflow (monad), and with 1 character (!), you make the compiler emit code
that'd take at least 10 extra lines in C#. Inside an async workflow, a let!
(example: let! foo = bar.AsyncGetResponse()) is like calling
bar.BeginGetResponse, bar.EndGetResponse, while preserving the current
exception handlers and all locals. It makes heavily async code trivial to
write, where in C#, it's near impossible.
~~~
runT1ME
Thanks for this! Looks like a great language. It's a shame it targets the CLR
and Mono is so crippled.
If it was on the JVM, looks like a serious competitor to Scala with some
distinct advantages (! operator for sure) along with a little simpler syntax.
~~~
omellet
How exactly is the CLR crippled?
~~~
bmason
The statement was that Mono is crippled. Last I checked, Mono was lagging at
least one major release behind Microsoft's CLR. So if you're using fancy new
features, you'll probably have to refactor to gain cross platform support, and
even then you may encounter bugs and memory leaks.
~~~
omellet
You're right, I misread it.
| {
"pile_set_name": "HackerNews"
} |
Dow ends above 16,000 for first time as stocks jump - werkmeister
http://www.marketwatch.com/story/us-stocks-rise-after-jobless-claims-drop-2013-11-21?dist=tbeforebell
======
tghw
Breaking News: The Dow goes above an arbitrary number!
I know we as humans have a tendency to assign meaning to things like this, but
the truth is it doesn't make much difference.
Especially the Dow. Stop paying attention to it. It's a terrible measure of
anything but what the Dow weightings are. First, it's only 30 companies, which
is a tiny slice of the economy. Second, it's based on share price, not market
cap, so companies that split less frequently tend to have a much larger effect
on the average. Finally, it's not adjusted for inflation, so all of this
"first time ever" business is meaningless.
If you're going to use a common index, at least use the S&P 500. Or better
yet, the Wilshire 5000.
~~~
cylinder
It matters because it matters to people. It's completely arbitrary, but
sometimes humans just work that way. Like it or not, there are going to be
individuals watching the nightly news tonight and will hear about the Dow
crossing 16,000, and they will think "Wow, it's going up, I need to get in on
this," and/or it will affect their confidence as consumers going into an
important time of year for consumer spending.
------
dev1n
The non-energy activity sector of the economy has been growing at a rate much
faster than the physical, energy-related activity sector of the economy. This
type of decoupling is terrible and can only last for so long. I would suggest
checking out Tom Murphy's blog Do The Math where he examines this decoupling
in awesome depth [1]
[1]: [http://physics.ucsd.edu/do-the-math/2011/07/can-economic-
gro...](http://physics.ucsd.edu/do-the-math/2011/07/can-economic-growth-last/)
------
hansjorg
An interesting article about the Dow Jones Industrial Average by Adam
Davidson: Why Do We Still Care About the Dow?
[http://www.nytimes.com/2012/02/12/magazine/dow-jones-
problem...](http://www.nytimes.com/2012/02/12/magazine/dow-jones-
problems.html?pagewanted=all&_r=0)
------
iYuzo
Forget the Dow. The point is that the markets are doing well, despite all the
negative coverage it gets from the media and so called pundits. Corporate
profits are near all time highs. Companies are becoming more lean and mean.
Innovation and the proliferation of new technology is shifting the paradigms
of every industry at an unprecedented rate. Europe is finally starting to see
a bottom. China has enormous potential of becoming a consumer market not only
for itself but for the rest of the world. South America (Brazil mainly) is
taming inflation and spurring healthier economic growth. I'm not going to go
through the list of countries that are doing better than they were 3-4 years
ago, the BRIICS are doing better, to say the least. The trailing P/E is at
17.6 which is below an average of 18.7 going back to 1956. Of course the
markets are going experience more volatility and turbulence at times, but
that's inherent. The markets can retrace 10%, which seems to be the number
everyone is focused on, sometime in the next 3-6 months. The market can also
run up 20% in 4 months and retrace 10% in the 5th month. How are you going to
win that game?
~~~
dragonwriter
> The point is that the markets are doing well, despite all the negative
> coverage it gets from the media and so called pundits.
I've seen negative coverage of the _economy_ more than the (commodity and
equity) _markets_. The two aren't the same thing.
~~~
iYuzo
The equity markets are the fastest economic indicator. Yes, there is a dichotomy
between the economy and the equity markets, but they should not be separated.
Also, it seems like you might not be watching financial news often enough
because the most common phrase is "we are seeing growth but at extremely slow
levels," "the economy is still weak," and many more that have a similar
undertone.
~~~
dragonwriter
I don't think you read what I wrote: I noted that I see negative news about
the _economy_ more than negative news about the _financial markets_.
------
kb120
If stocks are doing so well, why does the Fed continue easing policies?
Wouldn't now be an OK time to at least cut the monthly amount?
~~~
kmfrk
The index also isn't adjusted for inflation, I believe.
~~~
hansjorg
It isn't: [http://www.npr.org/blogs/money/2013/03/05/173515767/the-
dow-...](http://www.npr.org/blogs/money/2013/03/05/173515767/the-dow-isnt-
really-at-a-record-high-and-it-wouldnt-matter-if-it-were)
| {
"pile_set_name": "HackerNews"
} |
Ask HN: What Happened to GitHub's Atom? - jonny383
When Microsoft acquired GitHub, there was speculation (and fear on my part) that GitHub would end up axing Atom in favor of Visual Studio Code.
Taking a look at the commit activity for Atom on github.com [1] shows that since the end of June 2019, development has basically stopped completely. Does anyone have any insight as to what is happening here? Has GitHub abandoned Atom development?
Before you angrily shout that "Visual Studio Code" is better, or "just use VS Code", please recall the current situation with the Google Chrome mono-culture. Say what you will about Atom, but (especially in the last twelve months) the product had become very fast and, in my opinion, provided a much better user experience than VS Code.
[1] https://github.com/atom/atom/graphs/commit-activity
======
robotstate
I went from Sublime Text to Atom, then back to Sublime, as Atom was painfully
slow in large projects. After a short break from the tech world, I came back
to find that VS Code had taken over, and I couldn't be happier. It "just
works" and offers a great experience out of the box.
~~~
jonny383
Out of curiosity, when did you leave and when did you return?
------
denkmoon
It turned out that Atom is garbage, basically. It was supposed to be a better
version of Sublime Text, and open source, but never got close to feature or
performance parity.
I stopped using it because it is a pain. I watched the multi-line regex issue
in Atom for years, and no progress was made.
VS Code didn't have multi-line regex when it launched. It does now though.
------
kyledrake
I still use Atom and it works great for me. Just because someone isn't
blasting code in it constantly doesn't mean it's dead (infact I usually prefer
it to constant, chaotically high levels of change). How often does rsync
change and does that prevent anyone from using it?
That said if I was going to switch to something I would probably go back to
sublime (which I have a license for). Microsoft is dumping a lot of money on
the coding space right now and I'm really not interested in finding out why
the hard way.
~~~
aaomidi
I mean...vscode is MIT licensed.
~~~
recov
Only if you grab it from github -
[https://code.visualstudio.com/License/](https://code.visualstudio.com/License/)
------
geowwy
The biggest contributors stopped contributing around 2016. It seems like they
achieved what they wanted and put it in maintenance mode.
[https://github.com/atom/atom/graphs/contributors](https://github.com/atom/atom/graphs/contributors)
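If anyone wants to sanity-check this locally rather than eyeballing the graphs,
a rough sketch (the cutoff date is just an example):
    $ git clone https://github.com/atom/atom.git && cd atom
    $ git log --since=2019-07-01 --oneline | wc -l   # commits since July 2019
    $ git log -1 --format=%ci                        # date of the latest commit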
------
thrower123
The thing that drives me batty with both of these Electron editors is that it
is apparently not possible to pop out a child window for one file without
launching another whole instance.
As somebody that grew up with Windows when MDI was all the rage, it is weird
that this doesn't work.
------
null4bl3
For all the people going with "just use vscode"
at least use VSCodium as it is a community-driven, freely-licensed binary
distribution of Microsoft’s editor VSCode
[https://vscodium.com/](https://vscodium.com/)
~~~
geoah
I don’t really mind the telemetry vscode sends. There are only so many ways
the developers of an application can get feedback on usage and performance
patterns.
Metrics are important in any service or product you build as it allows you to
better understand your user and adjust your product or roadmap.
I also remember the option to disable the telemetry has been added but it
still reported the fact that you disabled it, which I think they'd fix, but I can't
find the GitHub issue :/
~~~
qwerty456127
Telemetry makes sense but I mind individual apps making any network
connections and sending/receiving whatever is not essential to solving the
actual task I use them for, let alone send data about me in a black-box
manner. There should be a centralized system service apps would pass telemetry
to for it to send it further and all the data should be in human-readable
format.
------
breeny592
I'd be curious to see the download numbers of Atom since VS Code's meteoric
rise a few years ago. As an ex-atom user, I was initially hesitant to move
over as I found VSCode "awkward", but many quality of life patches sold me on
it and haven't been back
------
fourthark
I'm sorry that everyone is telling you "just use VS code"!
------
nemothekid
I personally prefer Sublime as well, and I looked into VSCode, but the VSCode
monoculture also means that VSCode has the best plugins. When it comes to
graphical editors, Sublime's golang and Rust extensions are so bad.
------
thescribbblr
I used to write most of my code in Atom, but it was too slow once projects got
bigger. Then I shifted to VS Code on a colleague's suggestion.
------
jimmyvalmer
I value a technically inferior alternative insofar as it provides an "out"
from a corporate entity. Sadly, using Firefox doesn't really mitigate Google's
grip on my data since I'm still searching via the big G. I don't see the point
of Atom as it's also Microsoft-bound, identical to vscode in ethos, and as you
point out, gets far fewer man-hours of development. I am an emacs user.
~~~
yellowapple
> Sadly, using Firefox doesn't really mitigate Google's grip on my data since
> I'm still searching via the big G
That's rather easy to fix.
------
orange8
I wouldn't say vs code is better (these are all matters of personal taste),
but it was already way more popular than atom long before ms bought github.
[https://trends.google.com/trends/explore?date=today%205-y&q=...](https://trends.google.com/trends/explore?date=today%205-y&q=%2Fm%2F0134xwrk,%2Fm%2F0_x5x3g,%2Fm%2F0b6h18n)
------
Stevvo
Github abandoned Atom before the MS acquisition.
------
rumblefrog
Atom was built to showcase electron, with no other goals in mind at the time.
~~~
kirb
If I recall correctly Atom was originally developed on top of node-webkit
before they decided to fork and create Atom Shell based on Chromium. The
catalyst that pushed Atom Shell to become the Electron project we know and
love was Slack investing in getting it up to snuff to replace their existing
desktop apps at the time.
------
itimetrack
Remember that vscode is an atom fork... afterwards MS put a lot more resources
and seriousness into it though!
------
kaushikt
I never moved to Atom. Never understood the big hype about it. For people who
moved, why did you?
~~~
blablabla123
It's free, cross-platform and a "normal" editor, so at work it makes it
possible to work with others on the same computer. When I tried VSCode, I
found the UX a bit unusual and I think I ran into bugs IIRC, but I might give
it a try again.
(I think I would pay for Sublime if they'd open-source it with some premium
package or so, because it's much faster.)
------
bananamerica
Just use a real text editor that doesn’t embed Google Chrome just to write
code. There are tons of quality ones.
~~~
adamscybot
To be fair to VSCode, it's the most slick and speedy Electron app I've seen by
miles. I often forget that it's built on it.
------
kidsthesedays1
At some point, either before or after the acquisition, GitHub found out about
emacs and realized they were wasting their time.
------
cjohansson
It sounds like a typical Microsoft strategy: they want to dominate markets,
and nowadays they do it by offering free services and products, and sometimes
by buying up the competition.
------
Scarbutt
It couldn't compete technically with vscode, just as firefox can't compete
technically with chrome. Users will just follow what's better.
------
d-d
I honestly don't get why people use Atom or even VS code for that matter. Vim
and emacs come shipped with many systems, have good plugins, and are waaaay
faster; not to mention no telemetry!
~~~
jonny383
I'm a vim user for a lot of things, but there are some things that are just
not as good. For example, auto-completion is definitely better in VS Code or
Atom. Although vim is improving in this respect with things like coc.nvim,
they are still a _nightmare_ to set up.
~~~
troycarlson
Ever tried YouCompleteMe? I love it for Ruby, Python, JavaScript.
~~~
jonny383
I've used YouCompleteMe extensively for a couple of years. The only language
where I thought it was nearly as good as these "full" web editors was
TypeScript.
------
namanaggarwal
VS code is a better editor than atom, just like Chrome is a better browser
than others (imho, based on the number of browser downloads). I don't know if
it would make sense for the company to keep two competing (free) products. It
would be great to see download statistics and see if Atom makes sense anymore
for MS.
~~~
Tajnymag
Chrome is better than Firefox? You are joking, right?
~~~
namanaggarwal
I think you missed the full comment. I said in terms of downloads, which is
the primary factor for companies investing in them. This might be an unpopular
opinion considering the negative votes on my comment, but the numbers speak
otherwise.
| {
"pile_set_name": "HackerNews"
} |
Asking an Investor to sign an NDA - Good Idea or a Bad One - y_nizan
http://www.yanivnizan.com/2009/08/signing-nda-with-an-investor-good-idea-or-a-bad-one.html
======
wrath
I half agree, half disagree with you...
When I was looking for investment I never sent an NDA for the first one or two
meetings, and I'd do the same if I were to do it again. From my point of view
I wanted to interview my potential investors as much as they wanted to
interview me. If I'm going to be in bed with these people, I'd better like
them. IMO, when you are a 1 or 2 man shop, your first investors will
essentially be investing in you as much as in your idea. So I never gave away
my secret sauce in the first meetings. I just gave them enough to get them
interested. As such, I never sent them an NDA.
Once there was interest from an investor and they wanted to know more, I
approached them with the confidentiality question. Those that wanted to better
understand our secret sauce, read the patent docs that we wrote (but didn't
have $$ to file), and/or see an in-depth demo had to agree to keep it
confidential.
There are ways to tell someone how you're different or going to be better
without divulging your secret sauce.
| {
"pile_set_name": "HackerNews"
} |
Unlimited Free Calling with Google Voice - IsaacSchlueter
http://docs.google.com/Doc?docid=0Ae8glDUXDsh9ZGR2eG43cjRfMzNkOTM4ZjNjeA&hl=en&pli=1
======
tumult
A lot of the stuff described in this document requires a Gizmo5 account, which
is difficult to come by. Registrations are closed and there's no invite
system, so someone has to actually give you the credentials to an account they
already have to transfer it to you.
I have SIP set up on my Nexus One, replacing the default GSM phone functions
with SIP stuff over the data band. Here's what I use:
1\. The Android SIP client, sipdroid. <http://sipdroid.org/>
2\. An account with sipgate: <http://www.sipgate.com/> Free signup gets you a
real phone number, same calling rates as Google Voice, decent web interface.
It has some features that are more powerful than Google Voice, but no
transcription.
3\. An account with pbxes: <http://pbxes.org/> (Warning, their landing page
has Flash with very irritating sound, yuck) pbxes acts as a programmable in-
between for SIP communications, like SIP Sorcery as described in the article
posted here.
You connect sipdroid to pbxes, which connects to sipgate:
Your phone (sipdroid) <---> pbxes.org <---> sipgate
pbxes acts as a much more powerful wrapper around sipgate. Sipgate will
connect your SIP calls in and out of landlines, and pbxes gives you more
powerful control over the SIP-only stuff. Even if you don't want to use the
power features of pbxes, I still recommend connecting sipdroid to it instead
of directly to sipgate or another provider, as pbxes has implemented a little
trick that can save you a ton of battery life on your Android phone when
connecting to it with sipdroid. You can read about it here:
<http://code.google.com/p/sipdroid/wiki/NewStandbyTechnique>
The hardest part of setting this up will be filling in the information for
your sipgate account correctly through pbxes' web interface. It's a little
clunky and unintuitive. There's a guide here:
[http://seethisnowreadthis.com/2009/07/11/get-sipdroid-to-
wor...](http://seethisnowreadthis.com/2009/07/11/get-sipdroid-to-work-with-
any-sip-provider-on-your-android-phone/)
End result of all of this: Google Voice-priced calls, higher quality audio
than GSM phones, same latency, same drop call rate, still have E911 service
through normal GSM bands if you need it, programmable caller id, scriptable
phone routing, near-perfect phone integration on Android. Ditch your voice
plan.
~~~
lftl
I've heard a lot of complaints that call quality with VoIP on Nexus One is
awful -- tons of latency, lots of clipping -- it sounds like this isn't your
experience?
I've seriously considered trying out a data-only plan with T-Mobile, plus one
of the pay as you go plans to fill in coverage gaps, but the bad reports of
VoIP on the Nexus One have held me back.
~~~
tumult
I haven't had any problems. Keep in mind I haven't tried any of the other VoIP
clients for Android -- Skype, Nimbuzz, etc. Only sipdroid. But I've used it
over plain EDGE/GPRS on my Android dev phone (G1) and had no quality issues to
speak of, other than the occasional dropped call, which can happen on normal
GSM anyway.
As far as I know, the 'official' Skype client for Android is still 'Skype
Lite' or something, which costs you minutes on your cell phone plan to use,
and might connect over GSM as well (I'm not sure). It wouldn't surprise me if
this craptacular idea tarnishes people's perception of VoIP on Android.
~~~
ruslan
Oh really? Sipdroid only supports G.711mu, which a) never fits into narrow
GPRS/EDGE bandwidth, and b) is incredibly sensitive to network fluctuations,
lag and packet loss. So I believe you are being a little bit unfair about
Sipdroid's voice quality :-). Even Speex and G.729 hardly fit into poor
GPRS/EDGE; trust me as an experienced VoIP developer.
Also, Sipdroid has a bug in its RTP class which introduces huge latency;
somehow they don't even want to fix it, although it has been pointed out to
them many times.
| {
"pile_set_name": "HackerNews"
} |
Clever Hacks Give Google Glass Many Unintended Powers - marklabedz
http://www.npr.org/blogs/alltechconsidered/2013/07/17/202725167/clever-hacks-give-google-glass-many-unintended-powers
======
nikhilsaraf9
What are some of the cool "unintended powers" you would want from Google
Glass?
| {
"pile_set_name": "HackerNews"
} |
Outsourcing login and over-reliance on Facebook - kkozmic
http://foundontheweb.posterous.com/outsourcing-login-and-over-reliance-of-facebo
======
phlux
This is exactly my point in this thread:
<http://news.ycombinator.com/item?id=2295834>
There is no way I will sign up for an FB account to view or comment on content
from third-party systems that are too stupid to have a site that isn't sucking
from the teat of FB to such a degree that they don't know how to allow either
anonymous or registered commenting.
| {
"pile_set_name": "HackerNews"
} |
The impact of syntax colouring on program comprehension [pdf] - edward
http://www.ppig.org/sites/default/files/2015-PPIG-26th-Sarkar.pdf
======
noir_lord
This is something I was curious about a while back so I disabled all syntax
highlighting in my development tools.
Anecdotally, after a couple of days it made very little difference that I
could discern; error highlighting, however, was very useful in comparison.
| {
"pile_set_name": "HackerNews"
} |
Literate CoffeeScript - jedschmidt
http://coffeescript.org/#literate
======
jashkenas
I'm pretty excited to see some of the first programs that folks might write
with this -- I've got one of my own, a little 400 LOC (half of which is
comments) static blogging engine that's currently powering
<http://ashkenas.com>.
Part of the idea is that, because first and foremost you're writing a document
describing what you're doing, and then filling in the implementation in the
gaps ... you end up structuring the program differently than if you started
out with the code at the top level, as usual. For me, it was a fun process
-- start with an outline, expand that into paragraphs below, then indent four
spaces and implement each paragraph...
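To give a flavor of the shape (a made-up fragment, not the actual engine -- it
assumes the `marked` npm package), a section might read:

    ## Rendering posts

    Every post is a Markdown file. To publish one, read the file, run it
    through the Markdown converter, and hand the HTML off to the layout.

        fs     = require 'fs'
        marked = require 'marked'

        # Read a post from disk and return its rendered HTML.
        renderPost = (path) ->
          marked fs.readFileSync(path, 'utf8')

The paragraph is the spec, and the code indented beneath it is the
implementation.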
I'll share the full bit when I make it out of the woods next week, but for
now, here's a sneak peek of a rough draft to give a flavor:
<http://cl.ly/N8Kx>
_Edit_ : If anyone wants a copy of the hybrid Markdown/CoffeeScript syntax
highlighter (for TextMate or Sublime Text), it's available here in the
"Syntaxes" folder. <https://github.com/jashkenas/coffee-script-tmbundle>
10,000 bonus points if you port it over to Pygments and send a pull request to
GitHub ;)
~~~
philbo
Beyond any educational benefit, what do you see as the advantages of
programming in this way?
I and most of the programmers I socialise with tend to see comments as a bad
smell; they're fine every now and then but if there are more than a few of
them, something is wrong.
How do you prevent the comments from becoming lies? By which I mean, how can
you ensure that people update the comments in accordance with changes to the
code that occur over time?
Also what impact do you think this approach might have on the way people write
the code itself; naming variables and arguments, decomposing functions into
smaller units and so on?
~~~
jashkenas
> Beyond any educational benefit, what do you see as the
> advantages of programming in this way?
For me (and I'm just getting started here, I've only been playing around with
this in earnest in the context of the little blog engine, and there are
certainly many different ways of going about it), it makes the process of
exploratory coding feel very different. Instead of thinking up a series of
classes or functions that might solve the problem, and starting to write them
-- I instead start by outlining (literally) what I want the program to do, and
then describe the functionality in paragraphs, and then implement those
paragraphs.
When the paragraphs have code below them that does what they say they do, the
thing works. When I end up coding myself into a corner, I rewrite the
paragraph and change the code. It's really a very different process than
usual. It's also definitely more work than just doing code and omitting
comments -- it takes a bit of elbow grease to prevent the comments from
becoming lies ... but it's the same type of work as preventing bits of your
codebase from becoming dead code.
For me, it seems to make me organize code into smaller functions grouped
together into sections, use fewer classes, and care more about putting things
in the place where you're talking about them, and less about putting similar
things together in one block of code. For example, I started to require things
close to the paragraph where I first mention them, instead of putting all
requires at the top of the file. At least, that's how it feels so far.
~~~
jeremyjh
I'm glad to know I'm not the only person who works this way. My source files
often have a paragraph or two at the top that explains my intentions for that
class/module as well as explains the usage of my domain language. I never see
this in other people's code though.
I agree with people who say that the code itself should read very clearly, but
the fact is that even when writing an article or paper you have to
disambiguate your own writing, clarify your usage of terms and present some
context up front. Otherwise people can take away a very mistaken impression of
your text, and it is certainly no less true for code.
------
arocks
Should the order of the documentation be the same as the order of the code?
For example, if I would like to start by explaining the main function at the
bottom of the code, can this support that?
If it does not it cannot be, strictly speaking, called Literate Programming:
[http://en.wikipedia.org/wiki/Literate_programming#Misconcept...](http://en.wikipedia.org/wiki/Literate_programming#Misconceptions)
~~~
jashkenas
Ah, that old canard ;)
Modern programming languages (and JavaScript especially so) support defining
your functions, creating your classes, and wiring things together, in any
order you see fit. These days, TANGLE'ing and WEAVE'ing together a source file
into a new line ordering, just to make a compiler happy, is unnecessary, and
more trouble than it's worth. Just do:
    ... bulk of the file goes here ...

    main = ->
      run program
~~~
mhd
Well, there's more than one way to write literature, and there's more than one
way to approach literate programs. So while I agree that you can do most of
what you want with a more simplified approach, I don't think that Knuth's
approach was just there to escape from Pascal's declaration syntax.
Out-of-order code can be quite useful if the literate document is the
_narrative_ of the code, how you derive your algorithms and create your
functions. The final untangled code is devoid of this, and presents a more
conventional structure. One example would be some global variables (or
configuration hash, if that statement made you faint a bit). In the end, you
probably would want at least one view of the code where this is collected in
one spot, even if the programming language would theoretically allow you to
declare it bit by bit all over the place.
On the other hand, this style doesn't lend itself that well to a constantly
revised code base, as that would mean reading a narrative all over again.
------
crazygringo
First, it looks great. It seems like such a trivial difference not to have #'s
or /* */ or whatnot... but somehow it does make all the difference. Almost
like the code is meant to be read by people instead of computers (which is
what the priority almost always should be).
And second, the example [1] is a great model of _good_ commenting practice --
explaining the why's, the workings, clarifying special cases. Especially with
languages as concise and powerful as CoffeeScript, having as many lines of
comments as lines of code, is a great balance.
It seems like such a trivial idea, comments to the left, code indented, but
it's one of the best ideas I've seen pop up in a long time -- especially
because it heavily nudges you to treat comments as an integral part of the
file, not just an extra, and to treat the file, from the beginning, as
something for others to read. Bravo!
[1] <http://cl.ly/LxEu>
------
anonymouz
<http://en.wikipedia.org/wiki/Literate_programming> for some background on
literate programming. TeX is written in such a style.
------
gfodor
I'm interested to see where this goes, but one of the problems with literate
programming is that it means that in addition to being a good programmer you
also need to be a good writer. In other words, if you are a good programmer
and a bad writer you are going to produce net bad work since your writing will
confuse people who might otherwise have understood the code on its own, and if
you are a good writer and a bad programmer there is a chance this will make it
harder for others to realize your implementation sucks since it may be dressed
in the most brilliant, clear prose possible. (Of course one can make the
argument that brilliant clear prose is a sign of brilliant clear thinking and
hence code, but I am not so sure.) It also opens up the door for an entirely
new dimension in code reviews. Imagine your resident grammar nazi jumping into
code reviews now to perform edits to paragraphs upon paragraphs of comments.
(This is probably the same person who agonizes over class names, so maybe
you're already used to this :))
I've found that there seems to be a decent correlation with writing skills and
programming skills, but that's far from a fact and I've worked with people in
every spot in the 2x2 skills matrix.
~~~
jashkenas
> This is probably the same person who agonizes over
> class names, so maybe you're already used to this
Bingo. I think that for programming languages where clarity is already a
common virtue (think Ruby, Python, Clojure) -- folks already need to be
writing clearly when they program: making the code clear to read, naming
variables and functions very well, doing logical ordering of sections of code
etc. That's already about halfway towards what you would do if you were
accompanying the code with a bit of essay. I think the two worlds aren't as
far apart as one might think.
------
protez
Don't update now!
I just updated it to check out .litcoffee compilation and it worked out okay.
But suddenly, all my express.js apps stopped working due to an exception from
its connect module. I downgraded coffee to the previous version and everything
was fine again. It seems the package needs fixes. I like most parts of node
except these surprising interconnections.
~~~
nadaviv
Try diffing the compiled source for 1.4 and 1.5.
~~~
kcbanner
rofl
------
Tichy
"you can write it as a Markdown document — a document that also happens to be
executable CoffeeScript code"
Am I missing something or is that line literally the only
explanation/documentation given as to what is literate CoffeeScript and how to
use it? Must admit I have no idea how to use it now.
~~~
talklittle
The next sentence includes links to bits of the compiler in literate
CoffeeScript. <https://gist.github.com/jashkenas/3fc3c1a8b1009c00d9df>
It's writing a Markdown document, where the code blocks, indented 4 spaces,
are executed. And the rest of the document is essentially nicely-formatted
comments in the form of Markdown.
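A minimal, made-up example of a .litcoffee file:

    # Greeter

    Everything here is plain Markdown; only the blocks indented by four
    spaces are treated as code and run:

        greet = (name) -> console.log "Hello, #{name}!"
        greet 'HN'

The prose acts as the comments, and the indented blocks are the program.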
~~~
mosselman
Thanks for explaining.
------
jahewson
Literate programming - why have terse comments when you can have a verbose and
rambling narrative? I actually tried reading the TeX source code, and it was
damn near impossible.
~~~
ajross
I mostly agree. For 95% of code, it's simply a waste of time to try to
"document" it like this. For the handful of situations where elaborate
documentation wants to be stored with code (which basically means "automatic
reference doc generation from API definitions") we have tools like Doxygen
already that work well.
But there does remain that tiny subset of code that is so complicated that it
can only be explained in prose. This includes things like, say, DCT
implementations, tight SIMD assembly, complicated threadsafety architectures,
oddball parser context dependency rules, etc... I can see wanting to read this
stuff in a "literate" environment.
But... is _anything_ you might use Coffeescript for going to contain code like
that? I can't think of any good candidates offhand, beyond (perhaps) the
coffeescript compiler itself...
------
quii
It's a cool idea, but I am struggling to understand why this is better than
say BDD; which not only effectively documents what your code does but also
verifies it does what it says it should do.
~~~
chrisdevereux
That was my thought too. Are there any languages that let you write specs
inline with the functions/classes that they test? _That_ would be cool.
Edit: Having thought about it for a second, this could easily be done in most
languages (with some preprocessing to strip them out for deployment; super
easy in C-based languages). Not a convention I see anyone follow, though.
~~~
phpnode
I'm working on a project that does this with doc comments. Basically you add a
doc block like this:
###
@it should return true
expect(foo()).to.be.true
###
foo = -> true
and the preprocessor extracts the test from the comment and associates it with
the foo declaration.
------
agentultra
It just needs the ability to include source blocks in other source blocks and
a way to tell the "compiler" how to organize the output files...
For that I just use babel in org-mode... but you have to be an emacs person
for that. I'm sure there are other literate systems (like the original:
<http://www-cs-faculty.stanford.edu/~uno/cweb.html>)
------
franze
has anyone figured out what the syntax for
You can now loop over an array backwards,
without having to manually deal with the indexes.
is?
~~~
jashkenas
Sorry, I should add that to the changelog. It's just like looping over a range
downwards (or by arbitrary increments).
for item in list by -1
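So, for example, printing a list back to front:

    for item in list by -1
      console.log item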
~~~
int3
Very nice! I've wanted this for a long time.
------
arianvanp
I'm not sure if I agree with making comments a first-class citizen in a
programming language.
Good code documents itself and should be first-class. Comments should be there
to clarify certain decisions that you've made while writing code.
Handing out citizenship to the comments just clutters the code flow and will
stimulate bad commenting behaviour.
I really don't see any pros for 'executable' markdown at the moment, apart
from being cool.
------
jimjeffers
Is anyone planning on some sort of support for .litcoffee in docco? It'd be
awesome if docco could compile the markdown documents into HTML files and keep
that swell dropdown navigation in the top right hand corner available.
Generating docs as I did in my earlier projects seems to be the only gap.
Otherwise I'm using litcoffee on a current client project and really loving
it!
------
andyjohnson0
I remember getting quite interested in literate programming back in the early
nineties, but I've barely heard anything about it since then.
Has anyone (apart from Knuth) used LP for any real work of significant size?
What was the justification for doing the, presumably substantial, work to add
literate programming to CoffeeScript?
------
zbowling
I'm confused by what "executable" means in this context with Markdown. Is this
some kind of documentation support for CoffeeScript?
Or is the code in the blocks of the Markdown document executable, with
literate mode basically being a way to invoke the "documentation" behaviour
when the file is a .litcoffee document instead of a .coffee document?
~~~
jashkenas
If the file extension is ".litcoffee", it means that you're writing a Markdown
document, where the embedded bits of indented code are the executable part of
the program.
Basically, it just inverts the usual power relationship in a piece of source
code, where the code is primary and aligned along the left edge, and the
comments are set off in little fenced blocks. Now the comments are primary,
and the code is indented in little blocks -- but it's still the same
executable piece of source code.
The neat bit is when you have it all working properly in a text editor, your
prose being highlighted as Markdown, and your code being highlighted as
CoffeeScript.
~~~
mikeknoop
After reading the docs and comments, this was the comment/explanation that
made the light-bulb click.
------
pkorzeniewski
Very interesting idea, great for creating architecture outline, generating
documentation and understanding code.
------
ww520
Interesting development with the literate programming. Certainly a bold move.
Kudos for trying something new!
~~~
sambeau
I had to hand all my functional programming homework in in literate style in
1992 so 'New' is relative :-)
~~~
tanepiper
New to web development
------
arvidkahl
Could you please state the reasons for this change: 'cannot return a value
from a constructor'. Besides breaking working code, this feels more like an
added restriction than an added feature. I'd love to know why this is in 1.5.
Thanks!
~~~
jashkenas
It _is_ a restriction, and I'm a bit torn about it.
Apart from fixing bugs where you'd use a CoffeeScript `class` to extend from a
native object, returning "other" values from a constructor is a bad idea
because it makes your code lie.
widget = new Widget
In CoffeeScript 1.5+, unless you go to great lengths to get around it, that's
always going to return a new Widget -- in JavaScript, that could return a
Dongle, an old and already used Widget, or anything else (that's not a
primitive). If you want a function that maybe returns a new Widget, and maybe
an old one, just use a normal function, not a constructor:
widget = Widget.lookup()
------
transfire
Ruby's had something like this for a while -- a test framework called
[QED](<http://rubyworks.github.com/qed>). Made for testing, but technically it
could be used for anything.
------
ms123
I love that executable markdown style. I created a project some time ago that
does just that. For the record, here it is:
<https://github.com/mikaa123/lilp>
------
tbe
Cool idea to use Markdown's code snippet feature for literate programming :)
I guess you could do this in any language by incorporating something like the
following into your project's makefile (it deletes every line that isn't
tab-indented and strips the leading tab from the rest):

    sed '/^\t/!d; s/^\t//'
------
vectorpush
This is cool. IMO, most comments are pretty superfluous, but I think this
would be awesome for setting up visual groupings of related code sections.
------
nateabele
Oh.
I was really hoping 'literate' meant you could read the _code_.
------
lastbookworm
I can see this being useful for writing tutorials or even a book. So many
ideas for educational material swirling around in my head.
------
namuol
I'm not sure this is a good direction.
We all want better documentation, but the problem with comments is that they
can _lie_.
~~~
libria
Commenting is still not required, AFAICT.
Are you positing that better formatting encourages worse
comments/documentation? I'm of the opinion that their presentation is
independent of their quality.
~~~
namuol
I fear that _emphasis_ of documentation, to this extent, encourages us to read
comments first and foremost, rather than the code.
I realize that the emphasis could be perceived the other way around (on the
code, rather than the comments), but the author's intention seems to be the
other way around.
------
nathell
Which editor is this in the screenshot?
~~~
oal
Sublime Text 2 with the Soda dark theme.
Edit: Link to Soda theme on Github: <https://github.com/buymeasoda/soda-theme>
~~~
mosselman
Do you also happen to know which colour scheme that is?
~~~
jashkenas
Brilliance Black (42), a Halloween edition from several years back.
~~~
mosselman
Thanks.
I can't seem to find a version that looks like the screenshot
though. Maybe you have a link? That would be great!
------
acedip
This is brilliant. Love it.
------
coldtea
Shouldn't they be working on the goddamn Source Map support instead of
literate gimmicks?
~~~
jashkenas
A bit of that is in this release as well, thanks to Jason Walton.
[https://github.com/jashkenas/coffee-
script/blob/master/src/g...](https://github.com/jashkenas/coffee-
script/blob/master/src/grammar.coffee#L43-L56)
Source location information is now preserved through the parse, although it's
not yet being emitted as a source map just yet. If you want to use
CoffeeScript source maps today, feel free to use the Redux compiler, which
does them just fine. Michael also has some other tools to make source maps
even more convenient: <https://github.com/michaelficarra/commonjs-everywhere>
That said, I don't personally care for source maps all that much, placing a
higher priority on generating readable and straightforward JavaScript output.
Things like Literate CoffeeScript rank higher on the priority list.
~~~
coldtea
Thanks for the response (well, given my tone).
But I think this literate thing never caught on for a reason. And I'd say the
reason is marginal returns (over comments and clean code) and too much fuss.
As for the "entitlement", it's because people have been saying source maps
would come to CS for 2 years now, and I've seen nothing related yet (even the
crowd-funded project is not there).
------
camus
So when does coffee-script become independent from javascript and get its own
CS->machine code compiler?
~~~
NoahTheDuke
I would guess when Javascript dies. So, never.
------
wildchild
Cool, imperative code diluted with some poetry looks much better.
------
Mahn
This won't be a popular comment, but I have to admit I never really liked
CoffeeScript. I've tried to like it, and it does seem more concise, but what's
the point? It always seems clearer (to me) what something written in vanilla
javascript is doing. I don't know, I guess I haven't done enough Python.
~~~
mratzloff
It won't be a popular comment because it's not germane to the conversation at
hand.
~~~
Mahn
We have all sorts of discussions here on HN that derail a bit from the topic
of the main article linked, e.g. discussing smart TVs in a thread about webOS;
I don't see how this is negative as long as there is still connection.
~~~
Cushman
This isn't negative so much as irrelevant; CoffeeScript has been around long
enough that we've had most of the discussion around what people like and don't
like about it. If you don't like it, that's fine, but there's not a lot to
talk about.
| {
"pile_set_name": "HackerNews"
} |
I ported a JavaScript app to Dart - zetalabs
http://blog.sethladd.com/2014/05/i-ported-javascript-app-to-dart-heres.html
======
supporting
It's a little bit sad that Google pays people to have their entire job be to
astroturf for Dart with blog posts like this one.
Being a "developer advocate" is one thing, writing official documentation,
answering questions on forums, whatever — but "I ported a JavaScript app to
Dart. Here's what I learned." Seriously? More like "I'm on the Dart Team. I
ported a JavaScript app to Dart because it's my job."
Open source shouldn't need to be juiced with paid posts. If you scroll down to
the "Lessons learned" section at the end, it really strikes to the core of
what being disingenuous is all about.
~~~
spankalee
Your definition of astroturfing doesn't seem to match mine. Seth states
several times on his blog that he works on Dart developer relations at Google.
Blogging about Dart is part of his job, and in no way is he being deceptive
about that fact.
I, on the other hand, am simply an engineer on the Dart team, as I must
disclaim in this context.
~~~
plorkyeran
Where does he mention that other than the sidebar? I completely missed the
fact that he works for Google while reading the article since it's never
mentioned in the body of the article itself, and the article is written as if
he's completely new to Dart and trying it out for fun. If he's not trying to
present that image, then it's a really awful writing style.
~~~
vdaniuk
Do you imply a conflict of interest? If you do, please state it explicitly.
Your criticism as it stands doesn't add to the discussion and one could say
that your writing style is really awful, too.
A sidebar is not an unusual place for an author bio, and it is definitely
enough for me.
~~~
plorkyeran
Okay, I'll state it explicitly: I feel that the article is actively deceptive
about the background of the author, which makes me distrust the content of the
article as well. The sidebar with his background is not visible on the first
page of the content, so I would have had to interrupt reading the article to
read it.
I say that it is a bad writing style because I am assuming good faith. I
assume that the positioning of his bio is merely unfortunate and not
specifically chosen to reduce the number of people who read it, and that the
article is not actually trying to deceive me about the author. I think it is
obvious that coming across as astroturfing to some subset of your readers when
that isn't what you're trying to do is a bad thing.
~~~
magicalist
> _I assume that the positioning of his bio is merely unfortunate and not
> specifically chosen to reduce the number of people who read it, and that the
> article is not actually trying to deceive me about the author_
Oh please. Most blog authors barely manage to fill out an About page, let
alone a sidebar and a big subheading at the top of the page that says who
their employer is. If you're going to claim to be "assuming good faith", how
about actually doing so?
------
tyleregeto
I've been porting a larger application from JS to Dart, and I can repeat a lot
of what Seth says in this article. The experience has been generally very
good. While porting I also discovered a couple bugs that existed in the
original code base for a very long time.
I do keep flip-flopping on whether I'm going to seriously commit to it though.
I really like Dart, my code is safer, cleaner, and more maintainable. But it
just doesn't _feel_ like it has much momentum yet.
~~~
spankalee
I'm not sure what I can say about Dart adoption internal to Google, so I'll be
conservative and say that it's going well in my opinion. I hope we can speak
more about it at some point. Language adoption is a very gradual thing at
first, so I'm not at all surprised that momentum isn't that apparent yet. The
community is very active though (join the email lists or G+ if you're not
already).
Rest assured, we're very committed to Dart.
------
jlongster
I somewhat skimmed the article, but it doesn't mention the fact the JavaScript
is on the verge of getting modules. Unless you really love Dart, you can fix
most of these problems by using something like the ES6 module transpiler:
[https://github.com/square/es6-module-
transpiler](https://github.com/square/es6-module-transpiler). Even better, in
the next few years JavaScript will start adopting these natively, solving many
of the organizational issues laid out in this post.
~~~
sethladd
Thanks for the feedback. I did point out, at the end of the article, that some
of the techniques (e.g. libraries, futures) aren't impossible in JavaScript.
And I'm really happy to hear they might be coming to a future version of
JavaScript (everyone should have modules and promises!). Part of the point of
the article is that Dart has these features now.
~~~
Excavator
Thought you might want to know that Promise¹ has been available in Firefox &
Chrome for quite a while, and the spec for Modules can be found here:
Spec:
[http://wiki.ecmascript.org/doku.php?id=harmony:modules](http://wiki.ecmascript.org/doku.php?id=harmony:modules)
1: [https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Refe...](https://developer.mozilla.org/en-
US/docs/Web/JavaScript/Reference/Global_Objects/Promise)
~~~
sethladd
Yup, thanks! What's the story for modern browsers that don't yet support those
features? Do they both have polyfills?
As I mentioned in the article, libraries and futures aren't exclusive to Dart.
It's just that they are here _now_ for Dart and are compiled to JavaScript for
all modern browsers.
I think it's telling that the original author didn't use Promises or Modules.
Of course he could have, but we should ask, why didn't he?
~~~
jamesknelson
While a lot of browsers are still lacking pretty much any ES6 support, you can
get around this by using Traceur [1] to compile your ES6 javascript down to
ES5, and then using the es6-shim [2] for new ES6 libraries (Promise, etc.)
Given that you'll need to compile dart down to JS for it to run on _any_
browser (including chrome), and given that it seems unlikely Mozilla will ever
include a dart VM, I'd say that starting to learn ES6 now is probably a better
long-term bet than learning dart. Which isn't to say Dart isn't a nice
language...
[1] [https://github.com/google/traceur-
compiler](https://github.com/google/traceur-compiler)
[2]
[https://github.com/paulmillr/es6-shim](https://github.com/paulmillr/es6-shim)
------
todd8
I really like Dart. I'm surprised it's not getting traction faster. It's a
language with few surprises: everything looks familiar and works as you'd
expect, without gotchas. While it doesn't feel cutting edge like Haskell or
retro on steroids like Clojure, it seems just right as a replacement for
Javascript.
~~~
k__
It just doesn't play well with JS, because it was designed as a JS replacement
with its own runtime, which kinda sucks...
~~~
spankalee
Care to be more specific? You can use JavaScript from Dart now, and we're
working on improving interop support. Feedback is welcome.
~~~
k__
Really?
When I last looked into it, you had to do some strange stuff to call JS from
Dart and Dart from JS.
It just didn't feel as natural as in TypeScript or LiveScript.
------
robmcm
One of the early issues raised about Dart was that its compiled nature would
make it hard to debug and was against the open source (easy to hack) nature of
JS on the web.
These days JS is more or less compiled; at least, the file you get in your
browser is very different from what the developer saw.
This seems to improve the case for Dart, and I wouldn't be surprised if we saw
more languages that compile down to JS in the near future.
We already have ES6 down to ES5 [http://addyosmani.com/blog/author-in-
es6-transpile-to-es5-as...](http://addyosmani.com/blog/author-in-
es6-transpile-to-es5-as-a-build-step-a-workflow-for-grunt/)
~~~
grifpete
Why would the compiled nature of Dart make it hard to debug? Presumably you
mean debug by someone other than the original developer?
~~~
robmcm
That and debugging the compiled JS assuming there was a bug in the compiler or
a browser quirk the compiler hadn't worked around.
This is me remembering the arguments, not making them ;)
------
adamkochanowicz
I'm concerned that, however good Dart may be, with zero browsers intending to
support it natively (and I understand they support it non-natively via JS
compilation), its popularity is unlikely to escalate.
Wasn't the idea of Dart to replace JavaScript as the "lingua franca"? And we
still have to run Chromium to support it natively?
~~~
rdtsc
There is nothing preventing Google from releasing Dartium (Chrome with a Dart
VM). In the meantime it compiles to JS, and it is not the first language to do
so.
In fact I think Google should release Google Chrome with the Dart VM and then
accelerate a few of their apps (oh look, much better Google Maps!); before you
know it, Opera will get it as well.
~~~
pjmlp
It is marked as "in development" on the Chrome dashboard.
[http://www.chromestatus.com/features](http://www.chromestatus.com/features)
------
ryanolsonx
I really liked the direct comparisons in this article between the original
javascript and the dart code. It makes me want to learn more about dart and
about how I can use it in my personal (or work) projects.
It's doubtful that JavaScript will be widely replaced as the language of the
web. I can see a day where most web developers don't write JavaScript anymore
and languages that compile to JavaScript (like dart) will be widely adopted.
It's almost like the evolution of programming languages. If you take, for
example, the C programming language, people could write a lot of modern
applications using it, but C++ and .NET have made things easier and more
maintainable. I think a similar thing could be said in the future about dart
(representing C++ or .NET) and JavaScript (representing C).
------
Nilzor
Tl;dr News flash: static typing is a good thing!
------
Nekorosu
The article clearly shows that Dart doesn't stand too far from vanilla JS.
Conceptually it's on the same level. Futures? Really? Functional reactive
programming or communicating sequential processes solve the async problem a
lot better than futures. FRP is available in vanilla JS (take a look at
bacon.js) and CSP is part of ClojureScript, which, by the way, exists because
it's a good tool, not because some megacorp throws its money at pushing it to
the masses.
------
Nemcue
Sure — I guess it's nice that Dart comes with a lot of useful things built in.
Outside of Dart we already have pretty good solutions for Promises (with
Promise A+ compatible libs; Q, RSVP etc) and modules (ES6 modules, Require
etc), which /kind of/ makes those points moot.
Which leaves type checking and autocompletion as the big benefit. Which are
nice, I guess.
~~~
sethladd
I've clarified at the bottom of the post that JavaScript is probably going to
get some of these features in the future. One of the points I was trying to
make was Dart has these features _now_ (and because Dart compiles to JS, it
means I can deploy these features now).
The other question we should ask is, why didn't the original author use those
new shiny JS features in his app? He's a crazy smart developer. My hypothesis:
because the out-of-the-box dev experience doesn't include modules, promises,
etc, there's a higher barrier to using the new shiny JS features because the
developer needs to first A) know about them B) find the right polyfill.
Thoughts?
~~~
Nemcue
As a counterweight: I'm not a crazy smart developer, and I use those features.
In production. Now.
I'm not going to argue against your point though, because I think you are
right. One needs to have some grasp of the JS ecosystem to know what libraries
to use. And as I said, it's nice that Dart has made that choice for you.
But that said, poking around a bit in the Dart documentation I think it's
interesting that "Futures" are seemingly /not/ entirely interoperable with the
de facto standard of Promises (A+) in JS.
~~~
sethladd
I don't know enough of the intimate details of Promises to know if Future ==
Promise. I hope I didn't make it sound like Future == Promise, but I do think
they are quite similar in intention.
Can you expand on why you think Futures aren't entirely interoperable with
Promises? Also, which specific implementation of promises? (what's the link to
the promises that you're talking about?)
------
bla2
I wonder how the size of the dart-generated js compares to the size of the
original js.
~~~
sethladd
Good question, I'll try to add that. Maybe more importantly, what is the
startup time for the two versions?
~~~
Nemcue
Would also be interesting: Perform some operations, compare flame charts from
Chrome developer tools.
~~~
sethladd
Also a good idea! Gah, I need more time.
------
x86_64Ubuntu
Why can't I comment on the blog? Anyway, I was going to say:
>Slowly but surely the JS world moves toward looking like the Flex/AS3 of
years before.
~~~
sethladd
Hm, the blog has G+ comments on it. Apologies if that's not working. You can
leave comments here, too :)
------
dkarapetyan
All that code and one bug? Either the original programmer is a genius or Dart
isn't as great as it seems.
~~~
sethladd
The original developer is really good. Also, the original app is small-ish, so
I really wouldn't expect too many (or any) bugs in the original app.
| {
"pile_set_name": "HackerNews"
} |
Ask HN: How to prevent spamming on JavaScript frontend logging API? - chengyinliu
Hello HN,<p>We are designing an analytics logging API for our JS application. What bothers us is spam prevention. I can't come up with a better way than rate limiting to prevent someone from using a fake client and spamming the API. Is there something I missed here? What are some best practices on this? How do generic services, say Google Analytics, solve this problem?<p>Thank you.
======
pindi
I often have the same concern. What I ended up doing was issuing a token to
each user, and verifying the token on the logging endpoint. That way, if
someone decides to fill the logs with spam, we can easily delete all the
events from that token. From what I've seen, major analytics services seem to
do nothing at all to prevent a client from pretending to be any user, so this
kind of abuse is probably rare enough that it shouldn't be a big concern. Some
basic rate limiting is always a good idea, though.
~~~
nivla
But what is preventing a spammer who has made up his mind to reverse engineer
your analytics code from faking the tokens? If it's only about filtering,
wouldn't it be easier and more effective to do it via IP addresses?
>so this kind of abuse is probably rare enough that it shouldn't be a big
concern
Yes, I would agree that it's pretty rare, since webmasters are more cautious
around fishy-looking sites. However, there were 2-3 instances where I noticed
someone had spammed a referral into my Google Analytics data.
~~~
pindi
The tokens are cryptographically signed with a shared secret between the main
server and analytics server. So an attacker can't forge another user's token.
IP addresses would be a good solution also, especially if you need to track
anonymous users.
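Roughly, the scheme is just an HMAC over the user id. Here's a sketch in
CoffeeScript on Node (the names are made up, and a real implementation should
use a constant-time comparison):

    crypto = require 'crypto'

    # Main server: sign the user id with the shared secret and hand the
    # token to the client (assumes the id itself contains no '.').
    issueToken = (userId, secret) ->
      sig = crypto.createHmac('sha256', secret).update(userId).digest('hex')
      "#{userId}.#{sig}"

    # Logging endpoint: recompute the signature and compare before
    # accepting the event.
    verifyToken = (token, secret) ->
      [userId, sig] = token.split '.'
      expected = crypto.createHmac('sha256', secret).update(userId).digest('hex')
      if sig is expected then userId else null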
~~~
nivla
Ahh I see, that's a good trick, the use of cryptographic signatures didn't
cross my mind.
| {
"pile_set_name": "HackerNews"
} |
Show HN: An Open crypto exchange and a stable coin - ataleeq
I am running a startup to launch:
1. An open source and decentralised coin pegged to CPI and the global economy, to fight devaluations, manipulations, and the effect of news and demand/supply on currency value.<p>2. OPEx, a crypto exchange where experienced traders can share their recipes and beginners can use them by paying a fee or sharing profit.<p>Looking for the HN community to discuss and guide me on this. If you feel passionate and want to play your role, you are more than welcome. You can visit http://www.shield.support and validate the idea for yourself as a trader, developer or investor.<p>You can send your suggestions on Twitter @shieldspprt or email [email protected].<p>tags: bitcoin, cryptocurrency, exchange, CPI, global economy, decentralised, DeFi, startups, YC2020
======
verdverm
Not a YC company, a startupschool participant
[https://www.startupschool.org/companies/1-6uM5NohJbTOg](https://www.startupschool.org/companies/1-6uM5NohJbTOg)
| {
"pile_set_name": "HackerNews"
} |