Ask HN: Is Fulfillment by Amazon worth it just to target Prime members? - p_k
My company already works with a fulfillment company and we're already a seller on Amazon.

I'm considering using Fulfillment by Amazon just to target Prime members for our Amazon listings. Not surprisingly, Prime members spend more.

Has anyone been in the same boat? Were there any significant improvements in your profits over selling as a regular seller on Amazon?
======
robdoherty2
I used to work for an Amazon merchant doing roughly $2 million /year in
sales-- not big, but big enough that this was not a side business.
In our case, FBA (Fulfillment by Amazon) was useful for a few things, but
increased profit was not one of them.
On the contrary, we used FBA to sell 'hot' items quickly, which in turn
boosted customer feedback for our merchant account.
Ultimately though, the increased traffic you might get with your FBA items is
seriously offset by the increased competition you will get from other
merchants and even Amazon itself. And after Amazon takes its cut for doing the
fulfillment, you are left with pennies per sale in profit if you are lucky
enough to get the volume sales.
Another unexpected negative aspect of using FBA is that occasionally Amazon
fulfillment centers claim that there is "something wrong" with your goods or
they are "damaged in shipment" and they refuse to store them or ship them. You
have to eat the cost in these cases.
I think FBA can be useful if you are a merchant with a specific product line
that won't experience direct competition, or if you are a small merchant
trying to build some traction.
On a side note, I think FBA is a brilliant win-win strategy for Amazon: they
get paid for storing stuff whether it sells or not, and they get merchants to
test out product lines at ZERO cost to themselves.
------
27182818284
I was shocked, after signing up for Prime's trial, at how instantly it affected
my buying. Nowadays, I rarely buy items that don't have the Prime logo next to
them. My friends are the same way. There is a big psychological component to
it that sometimes overrides price outright. I can't explain it well. It is
like in addition to knowing shipping will be free and quick, the Prime logo
comforts a little bit. Like you trust the order more than if it is sold by
FlyByNight LLC, regardless of whether or not it really is that company in the
end.
~~~
robdoherty2
I'll add to this by saying that 99/100 purchases I make on Amazon must be
Amazon Prime available-- simply because I've been burned by 3rd party
merchants.
Amazon Prime is certainly best for consumers-- but for merchants it is very
difficult to turn a profit.
------
Lasher
Strongly recommend fulfillment by Amazon if you can make the numbers work. As
a long-time Prime member, the only time I will ever consider non-Prime is when
there are no 'Prime' options available, or there is a massive cost difference,
which is rare other than on very hard-to-find products.
------
srdev
In my case, items fulfilled by Amazon get my first look and a small price
premium, due to convenience in shipping. Just another data point from the
customer side.
------
livestyle
If you're a MFG, go for it... if not, be very careful. I am aware of more than
one occasion when the big A cut out the middleman because they had a hot
seller.
~~~
p_k
We are the original manufacturer of a unique product. Based on the answers
here, I think we'll go ahead with FbA.
Thanks guys!
~~~
whichdan
Just adding my 2c, even before Prime, I would gladly pay a few dollars more
for FbA because I knew the shipping dates would be reliable and it would count
toward the $25 free shipping minimum.
IOS 5 brings the innovations - guelo
http://www.youtube.com/watch?v=Gq-e0getf4M
======
guelo
It's not copying, really. It's just good design. Except when it's Samsung,
then it's stealing.
The internet has made defensive writers of us all - dkarapetyan
https://pchiusano.github.io/2014-10-11/defensive-writing.html
======
kornish
A great quote from the article:
"The defensive writing style also encourages another sort of ugliness, which
is that “avoiding saying something wrong” becomes a primary focus of the
writing, rather than communicating or exploring ideas which the author might
himself be unsure of."
One thing that's great about Gwern Branwen's writing is that Gwern's essays
are accompanied by a "belief tag". That tag gives the ability to say "This
essay is exploratory; I don't have a high degree of confidence that it's
canonically true." Details here: [http://www.gwern.net/About#belief-
tags](http://www.gwern.net/About#belief-tags)
Having the option to dissociate ideas from one's person can give a lot of
freedom to put contrarian ideas out there and spark debate in search of the
Truth without fearing personal attack. Indeed, one of the things that I
consider to make a healthy company culture is that ideas can be criticized
without the people behind the ideas feeling criticized themselves.
~~~
Florin_Andrei
It's one of the surprising side-effects of the Internet that anything
whatsoever you might say in a public forum can potentially offend someone,
somewhere, somehow.
~~~
proksoup
I recently watched Slavoj Zizek speak about how political correctness is a
more complete totalitarianism.
[https://www.youtube.com/watch?v=tndXr-oQxxA](https://www.youtube.com/watch?v=tndXr-oQxxA)
I am optimistic the video has some relevance to this discussion.
~~~
m52go
Mises recently published an excellent article on this topic, titled "PC is
About Control, Not Etiquette"
[https://mises.org/library/pc-about-control-not-etiquette-0](https://mises.org/library/pc-about-control-not-etiquette-0)
~~~
jxramos
"PC is best understood as propaganda" what a bold statement! I like this
critical exploration of the whole concept of PC, pretty fascinating.
------
grellas
I find internet writing to be stimulating in its own way precisely because it
is a sort of dash-off writing that does not need to be as precise or exact as
would be needed if you were seeking to meet professional standards.
To wit, in law, even in formal contexts, lawyers dish off all sorts of slop in
legal briefs, etc. but this really is sub-standard lawyering. To do your job
right, you need to meet standards of excellence in making sure you have sound
analysis, careful factual recitation, and skilled application of law to facts
as you make arguments or seek to achieve some other professional writing goal.
This is true as well in less formal professional settings such as writing
emails/letters to clients. It may not absolutely matter what you say in terms
of precision if a client is not likely to pick up the fine points but it
really does matter in terms of maintaining a consistent pride in your
professional work. Slop is slop and, when people will evaluate you by how well
you are representing a client, it is critical not to be slipshod in your
writing.
When writing on the internet, in contrast, you of course want to avoid putting
out slop there as well but a lot less precision is needed to make your points.
If you make a legal point, it is implied by context that you are making a
statement that may not be accurate down to the finest level of detail, that
you may be simplifying, or generalizing, or simply venting an opinion that the
law may or may not support. Law in itself will vary, even greatly, from
jurisdiction to jurisdiction, and this means that much of what you may be
saying is really setting forth broad principles while avoiding a specific
application to a given case. For these purposes, there is no need to be
defensive, not in the slightest. And I certainly try not to be. Am I ever
wrong? Of course, on occasion, yes - I have had a doozie or two in my time,
perhaps a number of them. But you try to feel secure about such lapses,
knowing that we all err occasionally, and try as a whole to conform to an
overall solid record of being as accurate and insightful as possible. While
you get the occasional harsh attack, for the most part I have found that
people will be charitable knowing that you may have spent no more than 10 or
15 minutes trying to address a sometimes complex topic. Better to give people
the benefit of your insights if it is an area where you can do so than it is
to leave something in the internet conversation that is wrong or off simply
hanging out there. That is a great benefit of the internet. Back in the day,
you had to dig deep to find out what people with expertise in their field were
thinking and access to this was severely limited. The internet has changed all
that. There may be a lot of slop out there but there are also many gems that
are there for the taking. As writers, we should be ready to share, if not
gems, at least our best thoughts on topics where people might find them
helpful.
If I were to stray from my professional topics, then maybe I would be more
defensive as well. But the author of this piece emphasizes how the internet
may hinder academic writing and I see legal writing as at least broadly
similar. So, my experience differs pretty markedly.
Just my two cents.
------
WalterBright
One of the common methods people use to attack posts with ideas they don't
agree with is to take the prose literally. This is much like the "work to
rule" form of protest that unions use, and the "letter of the law" rather than
the spirit of it, and of course all those "zero tolerance" policies.
~~~
jsprogrammer
One approach to resolving this would be to not make statements that are
literally not true and to correct statements that are literally not true to
not be literally not true.
~~~
hluska
That creates bland writing and the process of correcting every little
deviation is so pedantic that it provides a strong incentive to stop writing.
Both of these problems result in a world where we learn less, grow less, and
take fewer risks with our writing.
What is wrong with truly being charitable, assuming the absolute best
intentions and giving writers the license to deviate a little to let them make
a point?
~~~
jsprogrammer
Nothing is wrong with being charitable. Some statements are just wrong, or
there is no assumption of best intentions that can satisfy the claims being
made.
What is wrong with not writing blatantly false statements or calling out wrong
statements for being wrong (or, asking for a correction)?
There is a reason compilers cannot take arbitrary text and produce the program
you are 'actually' thinking of.
~~~
khedoros
"Perfect is the enemy of good", perfection isn't always necessary for
effective communication, and the perceived intent of the writer usually
matters more. If you think the writer is trying to mislead someone, or they
said something that you think will lead to misunderstandings of something
important, then call them out or ask for a correction. I took the post as
being more about "the writer didn't cover all the edge cases" than about "the
writer wrote something intentionally false/wrong/misleading".
~~~
hluska
Very nicely written!! :)
------
OvidStavrica
I am fairly new to Y Combinator's Hacker News. Unfortunately, I've run
headlong into this very issue on this forum. It ruins the experience.
There is value in communicating figuratively and/or with metaphors. When
creatively solving problems, looking for trends or drawing a hypothesis out of
the ether, it is often desirable to avoid specifics.
In my opinion, one's inability to comprehend and respond to any given
statement at multiple levels limits their upward mobility.
~~~
linkregister
I think that HN is one of the rare forums where this article doesn't apply.
Usually the top comment in a particular thread contains the fewest hedges and
most enthusiastically espouses a particular opinion. The comment is not
usually overly bombastic (though it sometimes is). The comment almost always
adopts the majority opinion on HN.
Posting a comment not aligned with the majority opinion carries a risk of at
most -4 points. I assert that downvotes more often correlate with disagreement
than failure to abide by the standards of discourse. Since the penalty is
limited, _that's okay_.
Therefore, if visibility and replies are your goal (who wants to comment
without interesting replies?), then it's best to write boldly and concisely
with a minimum of hedges.
~~~
scrollaway
It's not just the downvotes. They don't help, but it's also the responses.
There is a major psychological effect from getting _attacked_ for stating an
opinion you might hold to heart. As an avid reddit/hn commenter, I feel it
often.
I don't really care if someone disagrees with me on something minor, but if
it's something I truly believe is important and I see people disagreeing with
it left and right and downvoting anything in line with that opinion, it makes
me feel weakness and despair.
Weakness: I am overwhelmed by the people disagreeing with me. I can't answer
everybody. Not because I don't have the arguments but because it's pointless,
won't lead anywhere and will achieve nothing but make me look insane. The
"hivemind" effects makes widely-held opinions even stronger and minority
opinions even weaker. Alone, I am powerless to counter that.
Despair: Let's say someone thinks gays/blacks/women/whatever shouldn't [have
some human right]. You truly believe that's wrong. If someone says to me "I
think women shouldn't have the right to vote" and is impossible to convince, I
feel pretty awful about it. I feel like that person is contributing in making
the world worse, and I live in that world.
Now what if that's not about some minority-held opinion, but about something
far more widespread? What if it's, for example, similarly insensitive and
disgusting comments about islam/muslims? I don't just feel awful being around
that person, I feel crushed by the amount of people who would agree with it.
I'm quite afraid of what happens when such opinion is widely held. I feel like
I'm looking at a lifetime of awful and I feel crushed by it.
My fiancée is a muslim. This is empathy kicking in. Not everything affects me
that way, but I'm far less likely to comment on something I don't hold to
heart.
But yes, I do enjoy HN because the hivemind effect is far more limited. The
hidden downvotes and sorting algorithms are a million times better than
Reddit's. It's not ideal, but it's still an excellent place to have discussion
and a "good enough" place to have debates.
~~~
TeMPOraL
Hey, thanks for opening up!
Yes, there is a psychological risk you take when you state an opinion on a
topic that is important to you. The more important, the more disagreement
hurts.
There isn't much one can do about opinions of others. We all try to keep
things civil here, but it doesn't always work. Not immediately, at least. I
remember the discussions after the last Paris attack; some comments were awful
and literally heartbreaking, and I too was feeling a mix of anger and despair.
> _What if it's, for example, similarly insensitive and disgusting comments
> about islam/muslims? I don't just feel awful being around that person, I
> feel crushed by the amount of people who would agree with it. I'm quite
> afraid of what happens when such opinion is widely held. I feel like I'm
> looking at a lifetime of awful and I feel crushed by it._
I recognize that feeling. Frankly, I prefer hanging on HN so much because it
sometimes seems like one of the last few bastions of sanity on the planet.
Back after Paris, my Facebook feed was _literally_ breaking my heart, and it
was HN that reminded me that not everyone holds harmful beliefs.
Anyway, when you see knee-jerk attacks and hurtful behavior, I recommend
judicious use of the downvote and flag buttons. That's what they're for, and
they seem to work pretty well at keeping the discourse at proper level.
Best wishes to you and your fiancée. Stay strong!
------
resu_nimda
There is another perspective to this, which is to appreciate when others have
put forth a well-considered argument, and to sometimes relish in the challenge
of anticipating the rebuttals and shoring up leaks in your reasoning. Often I
will write something out that is somewhat critical, and then I'll think about
where people might attack it, and it ends up with a more measured position. I
love David Foster Wallace because he was the king of considering all facets of
an argument or idea so thoroughly that you can't tell which side he's on (and
neither could he).
I would say that the people with the problem that this article describes are
the good ones, and they are outnumbered and drowned out in our society by
"loudmouths" who don't obsessively consider their statements and who do
present their own thoughts and beliefs as factual and/or morally correct, so I
would say please do keep qualifying everything and even promoting that type of
"defensive" thinking. (The caveat is: don't cater to trolls and the willfully
uncharitable, and you have to decide for yourself whether someone is trolling
or genuinely trying to make an argument worth responding to [or both], it's
tough these days!)
I often think about qualifications like "I believe that...", "It is my opinion
that...", "I think that...". Some people argue that they're not necessary,
it's implied and obvious, it makes you seem weak, etc., but I disagree. I see
so many conflicts created by people communicating their beliefs as facts, and
it's not at all obvious that they acknowledge the viability of other opinions.
------
kelukelugames
As engineers, we are hardwired to nitpick. Coming up with counter-examples is
the default response. Instead, I try to say a supporting example first. This
helps me learn from other people's point of view.
Edit: Though I admit I don't do the best job on HN. :/
~~~
mhurron
> As engineers, we are hardwired to nitpick.
This is just not true. You are not 'hard wired' to react in such a specific
manner, and being an engineer does not mean ripping things apart.
You may be an engineer and choose to make a joke about something. You may be
an engineer and something may make you go off and learn about something you
didn't know about before. You may be an engineer and react in all sorts of
ways to something, nitpick is only one possible outcome.
~~~
angersock
Your comment here rather illustrates the point, I think. :)
~~~
iamcurious
Interesting. So to convince people to disagree I should voice agreement?
------
jbob2000
Context is really important in communication. When speaking with someone face
to face, you pick up on vocal intonations and body language to give you
context to their speech. As well, knowing someone personally gives context to
what they are saying.
On the internet, you have none of those things, only text. I think we are
hitting a wall with our written languages because they can't adequately
communicate context. Emoticons are a great evolution in this respect, because
I can assign emotional context to some writing with a simple set of characters
:).
~~~
tacos
I liked this exact comment when I first read it back in 1985. At least now I
can upvote it. That's progress.
~~~
theseatoms
How about more voting dimensions?
------
rm_-rf_slash
I get the feeling a reactionary effect has been an increase of traffic on
anonymous image boards like 4chan. There the only value you have is in your
argument; there is no profile to judge.
People can say whatever they want there - and they certainly do - but if their
argument is invalid or poorly constructed it is shot down or ignored, which
serves as a lesson about the strength of a point made in a sterile
environment. Of course there are plenty of trolls and people who argue the
opposite point for the lulz, but that's just a cost of the medium.
There is no perfect form of communication. If there was, we'd all be using it.
~~~
hobs
Even though I have long ago given up imageboards, this is basically what I
came to say. If you think the problem of self censorship due to judgement is
the problem, publish anonymously and go hog wild.
I would much rather we have an anonymous set of opinions (because to be quite
honest, I will never meet most of you, or be able to connect on a personal
level, even if you are awesome) than people looking to appeal to the crowd.
Unfortunately on 4chan et al. this ends up making certain ideas the same as an
established figure, ideas (memes in this community) which propagate whether or
not they are good, just because they are established.
------
kobayashi
For those interested in the Steven Pinker article, "Why Academics Stink at
Writing", and are looking to avoid the paywall, here's the direct link to PDF
from his website:
[http://stevenpinker.com/files/pinker/files/why_academics_sti...](http://stevenpinker.com/files/pinker/files/why_academics_stink_at_writing.pdf?m=1412010988)
------
kelukelugames
I feel compelled to write about race issues because I am a person of color.
The issues directly impact my life. But I'm afraid because readers might be
pissed off enough to get me fired. :/
~~~
puredemo
Excuse my ignorance but do Asians "count" as people of color? I thought that
usually referred to Latinos and African Americans?
I know that for university admissions, PoC _don't_ include Asians; in fact most
admissions standards are stacked against them.
~~~
kelukelugames
Yes, I am a person of color. The dictionary definition of people of color is
any non whites.
Edit: There is a long explanation of why Asians are sometimes excluded. The
short answer is racism against Asians.
~~~
puredemo
It sounds like you're saying it's fine to be racist against whites, but
offensive to do the same thing to your personal "in-group." Little ironic,
lol.
~~~
delecti
Often, when people are referring to racism, they're referring specifically to
Institutional racism. As a general rule (because everything is naturally full
of gray areas) it's considered to be impossible for whites to experience
institutional racism in countries where we're the majority. So while a PoC
could certainly be bigoted against whites, it wouldn't be accurate to say that
they're inflicting institutional racism on them.
~~~
13thLetter
" As a general rule (because everything is naturally full of gray areas) it's
considered to be impossible for whites to experience institutional racism in
countries where we're the majority."
Sorry, what? Considered by whom? Because we should have a word with these
people about how unimaginative they are.
For one thing, there are plenty of ways a member of group X could experience
institutional racism in an X-majority country. Perhaps they live in a
Y-dominated region of the country, or work for a Y-owned company, or group Y
has more political/economic/cultural power in society despite being a
minority, or they work for an organization whose policy is to hire and promote
Y preferentially.
More generally, I don't see the purpose of desperately insisting that
institutional racism _can never_ affect group X. To say that group X can be
harmed certainly doesn't mean that group Y isn't harmed _more_. It's a weird
and pointless linguistic game that harms the purpose of achieving equality of
opportunity by gratuitously driving away allies.
~~~
delecti
I know it's lazy to just link to a Wikipedia article on the subject, but I'm
going to anyway, because I'm too lazy to address all the things in your post
when you can do your own research.
[https://en.wikipedia.org/wiki/Institutional_racism](https://en.wikipedia.org/wiki/Institutional_racism)
The fact is, the occasional inconvenience on whites in America (as a
convenient example) from occasional pockets of bigoted blacks is fundamentally
different from the regular impact on blacks from the institutionalized racist
tendencies in society.
------
russellbeattie
Every YouTube creator has to do this now as well. They "apologize in advance"
or qualify just about everything they say to make sure they don't get
eviscerated in comments in case they make a mistake or were even just slightly
wrong. It's becoming so common, I see it seeping into everyday conversation
in younger (like 13yo) kids. Not that everyone should go around spouting
ignorant opinion, but being terrified of being incorrect isn't good either.
------
danharaj
Different perspective: The cost of being wrong is overstated. The cost of
offending someone is overstated. Being told you're wrong isn't the worst thing
in the world. Being told you've said something offensive won't set you on
fire.
I used to be afraid of being told I was wrong or offensive. I took it
personally. Now I try to shrug it off and take it as legitimate feedback that
doesn't compromise my self-esteem or integrity. Being told you're wrong is
better than being wrong and never told you're wrong. Being told you're
offensive is enormously better than never being told who you're putting off
with your behavior. This holds _even when you take into account the
unjustified negative feedback_.
I try to avoid thinking less of someone for thinking I was wrong, or being
offended by what I've done. If I do that, I'm distorting my social reality: I
am penalizing people for telling me negative things. I am penalizing people
for caring that I'm wrong. I'm penalizing people for telling me that my
behavior hurts them. By protecting my ego, I would be hurting my relationships
with my peers and social groups.
This is not to say I'm not emotional about the responses I receive. I am quite
emotional. If I'm writing in a dry tone, I probably just woke up and don't
have access to all my senses yet. In fact I think writing without emotional
color is a form of defensiveness: It is yet another way for a writer to hedge
and separate themself from their opinion and their reader. I'd rather be close
to my words and my listeners than safe from criticism.
------
noonespecial
The phenomenon of the nitpicky someone-on-the-internet-is-wrong type comment
is so prevalent, I think we need an acronym like tl;dr for it.
How about "willful misinterpretation; doesn't justify response"?
Wm;djr
~~~
zem
or bf;wr (bad faith; won't respond). i prefer the term "bad faith" since it
covers more cases than "wilful misinterpretation"; i've been using it to good
effect on facebook to disengage with malicious arguments, though i didn't
think of acronyming it.
~~~
openfuture
I like the ;wr part (won't respond) better than the ;djr (looks complex).
But we should make sure to decide on a single one* to use, I like a bit of
both: wm;wr cause of symmetry.
*maybe it would be more effective to have multiple and see if any of them catch on but I feel like it'll just cause unnecessary confusion
------
auganov
I've been thinking about that quite a bit lately. My two random thoughts:
1. I don't think it's strictly bad. More pressure to build coherent and
precise arguments is good. It can indeed yield unnecessarily verbose language
but I don't think it has to? Inserting qualifiers etc. everywhere is just the
most naive way of approaching it, like a newbie programmer using too many
nested if statements.
2. We should assume all comments are charitable as well. They are no
different, they're just as likely to be misread.
~~~
ghaff
I do agree with 1. 2 is optimistic.
Depending on the situation, some "weasel words" (may, could, probably, etc.)
or other ways of adding some fuzz to your arguments are appropriate. But
formulating cogent and precise arguments with defensible data and references
is better when possible.
------
CurtMonash
I find this whole line of reasoning to be dubious, unimaginative, and sad.
Few people would call me a drab or unopinionated writer. Yet I write in a way
that I believe protects me from those who would seize on any misstep to
disparage me or even to sue for libel. And I've only slightly relaxed the
safeguards I used in my earliest business writing, when I was a stock analyst
subject to SEC regulation.
The keys to safe and ethical writing are:
1. Don't lie or deliberately mislead.
2. Be clear about the support for your stated opinions -- including doubts or lack of support.
Or, equivalently, adhere as best you can to The Golden Rule of Opinionated
Non-Fiction Writing:
Give readers the tools to make informed decisions as to whether they should or
shouldn't accept what you say.
Beyond that, I'd say -- and this is something else that goes back to my
experience as a stock analyst:
A. Your job isn't to tell readers what to decide.
B. Rather, your job is to make the best contribution you can to your readers' decision processes.
Ultimately, their decisions are their own responsibility -- and if they forget
that fact, then bad on them.
------
zkhalique
It comes back to what Marshall McLuhan said, "The Medium is the message":
[https://en.wikipedia.org/wiki/The_medium_is_the_message](https://en.wikipedia.org/wiki/The_medium_is_the_message)
What I've come to realize is that the expected audience is a crucial aspect of the
communication requirements. On the internet, the audience is quite wide, and
disagreement / outrage usually leads to more responses / activity than does
agreement and acceptance. I should say, in fact, that agreement leads to
further sharing, whereas disagreement leads to comments. Thus, an article that
some people agree with and some disagree with, gets re-shared and exposed to a
lot of controversy. That eventually finds its way into other outlets, whereby
each compelling article has a counter-article somewhere else in addition to
all the substance-free vitriolic comments.
What the internet has really done is stop letting us get away with echo
chambers, by breaking the walls between them. I think that is a good thing.
I should write a blog post on that :)
PS: This is not new. I think Bertrand Russell said: "The whole problem with
the world is that fools and fanatics are always so certain of themselves, but
wiser people so full of doubts." Now, those doubts are reinforced by the
threat of being shown wrong. I think it's quite an art to articulate
compelling arguments out of factually true statements, and even more of an art
to think several steps ahead, choose which statements to focus on in order to
draw certain reactions from an audience, and thus build receptivity to your
next statements. Daniel Dennett said: You should be able to understand your
opponent's arguments so well that you can say it back and they say "I wish I
had put it that way". And of course, to do that, we shouldn't just write, we
should also read, and all be exposed to the other arguments, instead of the
echo chambers that social networks make money off of:
[https://sealedministries.wordpress.com/2014/08/22/3-reasons-...](https://sealedministries.wordpress.com/2014/08/22/3-reasons-you-should-read-those-with-whom-you-disagree/)
------
jonathaneunice
As an analyst, I perfected the art of defensive writing. We had to. Everyone
has a vested interest in everything you say. Every way that situations were
described, every opinion of what approaches or technologies will/won't work,
every product rating, every everything. If not a financial vested interest,
then an affinity interest. In the editorial process, we often talked about
"hardening" or "bulletproofing" the writing so it could withstand the slings
and arrows of relentless criticism.
------
madaxe_again
We have conversations like we're bickering over the interpretation of a
specification, and it's every bit as painful sometimes.
As the author says, wilful misinterpretation seems to be the name of the game,
and we almost all seem to be guilty of it from time to time.
It's easy to be a dick when you're not looking someone in the eyes.
------
elliotec
This is why I don't have comments on my own blog. I love discussion and
appreciate it where it belongs, like hacker news and reddit, but if I'm
writing it is my own outlet and I really don't want other people to see what
other people think about what I write. I prefer people come up with their own
opinions. If they want to discuss, they can message me.
------
bluetomcat
The way I see it, defensive writing is also a kind of self-indulgent,
narcissistic behaviour which seeks gratification from always being "correct"
and never being proven "wrong".
It requires a certain dose of humility to admit that something you've said was
wrong or to accept some parts of the opponent's opinion.
~~~
paulpauper
What if you don't know something for certain? Isn't some discretion needed?
Humility is being open the possibility of being wrong.
~~~
bluetomcat
One could say "correct me if I'm wrong" and continue with his thoughts instead
of precluding the possibility of outlining an interesting point of view
because of a small uncertainty.
------
Mendenhall
I think I have come to the conclusion "Don't feed the trolls". His post
reminds me of everyone who would correct grammar mistakes and try to make your
whole point look null because you missed a comma. I am not against correcting
grammar but when you try to paint an idea/person as ignorant because of that I
think it makes the attacker look foolish.
I engage the ones who have actual points to discuss or critique and just
ignore the trolls. I want to engage with educated people anyway to enrich my
own understanding, not bother defending myself from trolls. I figure an
educated reader will see through the troll anyway.
They hate lack of attention and will starve off or be seen by the masses as
annoying. I think the age of the troll is coming to an end. Society or at
least the more educated are starting to see it for what it is.
------
JeremyMorgan
So absolutely true. 2016 needs to be about taking it back a notch. Let's write
how we want to write and say what we want to say and start ignoring the loudly
offended. I'm serious here. Marginalize them instead of giving them power.
And I don't mean important things like racial or gender sensitivity, I just
mean the fringe little things that people pick apart and whine about.
I decided a while back when I write opinion pieces for my blog, you're getting
my opinion. Whine and complain, flame me in the comments, I don't care. You'll
get over it.
~~~
SolaceQuantum
> And I don't mean important things like racial or gender sensitivity, I just
> mean the fringe little things that people pick apart and whine about.
Isn't the common argument that certain aspects of racial or gender sensitivity
are considered "fringe little things that people pick apart and whine about"?
Like mansplaining, manspreading, Halloween costumes about other races, the
recent World Fantasy Convention move away from HP Lovecraft busts as their
trophies?
EDIT: If I'm actually doing exactly what is being argued against please let me
know. I'm actually just very curious where the "line" is between too-PC and
not-too-PC.
------
6stringmerc
Excellent bit of reflective writing and I'm certainly in agreement. For a
moment I thought this might be the basic gist of an article I'm working up
(re: message boards vs. article/content comments and notions of "community")
but glad to see it's a different angle, and complementary. Guiding over to
Pinker's article is an immense help as well, because I'm really digging the
term "intellectually unscrupulous" as a nice description of some behaviors
I've seen online that are maddening.
------
ghaff
I'm not sure I buy this argument at all. I've certainly heard the argument
enough times that it's less critical to get everything right on the first pass
on the Web because you can always update it. I'm not a particular fan of this
mindset but it's out there. And I'd note it's the offline pubs that are far
more likely to still have people in a fact checker role--however vestigial.
And making errors of fact too many times was more than frowned upon at
traditional newspapers.
Having said that, a lot depends on the person and the context. I'm sure we
could all name columnists/bloggers who are more interested in being
controversial than in being correct; some predate online publishing though. On
the other hand, there are some contexts where you don't want to state bald opinions
that you can't back up. I'm not convinced any of this has to do with Web or
non-Web.
~~~
squeaky-clean
> I'm sure we could all name columnists/bloggers who are more interested in
> being controversial than in being correct
I can't name any, so your comment is wrong, and I don't care to respond to the
rest of it, or even acknowledge that I read it. Of course I don't really mean
that, but I feel this is the kind of pedantry the article is talking about,
and not true factual incorrectness.
The Twitter account linked to only has 36.2K followers, not 39K, so that
paragraph is incorrect and stupid. ...Except the important detail here isn't
the exact number of followers, just that there's a large amount. If this were
a scientific paper on the number of followers various humor accounts gain,
then yeah, I would be all for ripping this apart. But it isn't, so it's
irrelevant.
Writing defensively is more than just being factual. It's being overly factual
where it doesn't really matter, because a vocal minority will likely yell at
you otherwise.
~~~
ghaff
A vocal minority will always yell at you. Fuck 'em and ignore 'em.
Edit: I've written in a relatively public sphere, albeit not at politics
level, for long enough to have a pretty thick skin. I realize not everyone is
in that position.
~~~
colmvp
Saying fuck 'em and ignore 'em is overly simplistic. Some people can do it. Most
can't. Even athletes who earn millions of dollars and have hundreds of
thousands of fans can get riled up by words from a single hater.
At a Social Anxiety meetup, I remember one person who could in great detail
recall a racist incident years back when a random person called the person a
racially derogative word. Sure, he could've just shrugged it off as it was
just a stranger and all. Yet it had a profound effect.
Just as much as words from strangers can inspire us to commit profound change,
so can words do great damage.
------
HelloNurse
Defensive writing is the straightforward consequence of increased exposure to
assholes. While in real life assholes are marginalized, and mainstream
discussions involve only reasonable people, Internet media enable assholes to
masquerade as normal people (they hide behind an ordinary appearance, with
little chance to know they will become irritating), increase their reach (they
can pick on anyone, anywhere and at any time, rather than irritating a limited
circle of acquaintances) and reduce social defenses against them (for
instance, they can easily return with a pseudonym after being banned from a
forum, which in itself is a much milder consequence for antisocial attitudes
than, say, being beaten up or losing their job in real life).
------
decisiveness
I'm unclear on whether the author is yearning to become blase about whether
his ideas are valid, criticizing his own writing, blaming his pandering
statements on imbeciles, or being cleverly ironic.
He laments having to tone down something he thinks might be interpreted
"uncharitably", and in the very next sentence blames it on those who commit
"uncharitable actions".
> At times I’ve been tempted to just turn off comments entirely on my blog,
> and just flat out avoid participating in comment threads on the web
I can relate to internet commenting frustrations, but this conveys real issues
with hearing any dissenting opinion. Turning off comments altogether on your
blog is like fearfully plugging your ears with your fingers to drown out
anything that might be valid criticism.
------
TazeTSchnitzel
It's sort of tangential, but knowing anything I write can be found by Googling
my real name has certainly made me write defensively. It's why my main Twitter
account is not public.
~~~
usrusr
It's why i occasionally send replies to facebook posts via IRC. Technically
quite the opposite of private, but there's a bit of a social convention that
it's OK to be a jerk on IRC and that is not only liberating, but also
surprisingly detrimental to the xkcd:386 pissing contests that are the root of
the defensiveness pchiusano is complaining about.
(disclaimer: IRC is any number of very different places so your experience may
vary - oh, how meta-on-topic! And while we're at it: "on IRC, all talk is
offensive, not defensive, IRC is part of the internet, therefore the whole
post is wrong")
Personally, this whole idea that excessively hedging arguments can be seen as
a bad thing is completely new to me. I do it all the time, from a simple AFAIK
to explicitly stating all the givens that might be attacked, just in case. I
state my opinion, or even just write a little devil's-advocate "please try
to understand how it looks from that perspective" piece, not a properly
deweaselified wikipedia entry. But i do like to think of my usage as an
enabler for posting bold, speculative ideas. In scientific terms (i'm not a
scientist) i would rather be wrong with an outlandish causality hypothesis
than with misinterpreted correlation noise. Maybe the knowledge that i am a
linguistic offender will help me make the use of defensiveness more
deliberate and to the point.
------
intopieces
Since integrating shutup.css [0] in my browsers, my experience on websites
with comments sections has been much more pleasurable. It's very good at
recognizing comments sections and completely removing them.
I wish more sites disabled them and instead opened their blogs up to well-
thought out letters to the editor.
[0][https://stevenf.com/shutupcss/](https://stevenf.com/shutupcss/)
------
strictnein
In regards to hedges and weasel words: I never used more weasel words and
hedges than when I sold computers at Circuit City. I knew what the policies
should be and how we should be able to help customers, but I had no power to
actually make that happen so I used tons of should, may, probably, likely,
etc.
I hated it and it took me a while to unlearn that way of talking.
~~~
Lawtonfogle
I use a lot of what are called weasel words because the reality is that they
are the most accurate words. For example, someone buying a lottery ticket will
probably lose. I have trouble saying they definitely will because I know there
is a slim chance they will win. And if 1 in a 100,000,000+ is enough to make
me avoid saying definitely, imagine what chances such as 1 in a 1,000 or 1 in
a 100 do to my word choice.
------
beat
The internet has made offensive writers of many.
------
trhway
on the other side, that guaranteed exposure to unexpected and harshest
criticism is a great beauty of the Internet (it is like street fight with no
rules vs. strictly regulated fights under rules of some league) - if you're
not afraid (which is easily achieved by for example staying anonymous :) you
may explore many ideas much more boldly, deeply and widely and be able to hear
arguments and criticism of them not available otherwise elsewhere. As a
result, Internet taught me that i may be sometimes [fortunately very
infrequently] not right and/or just plainly mistaken. That was a huge personal
discovery :) , a bit painful, yet it has percolated even into offline life -
whenever i hear some crap at the meeting, etc. i now sometimes give myself a
second or even 2 to entertain the very improbable idea that that crap may be
worth something that i just don't see right at the moment :)
------
anotheryou
Programmers do, because you are exposed to ambiguous, badly written
specifications for features all the time.
------
dustingetz
This is a startup opportunity and can be solved through UX and richer
moderation tools. UX influences user behavior. One small example, a richer
moderation system that captures _-1 because violation of HN guideline #6
"avoid introducing classic flamewar topics unless you have something genuinely
new to say about them"_ could short-circuit entire flamewar threads. I think a
psychologist could probably build the right UX to get top 1% expertise to
generate incredible discussions reliably and funnel the 99% peanut gallery off
to the side where they can happily contribute too. I would have liked to see
reddit build this but they have had years and years to do something
interesting, i guess they are too big to innovate now.
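A minimal sketch of what such a reason-coded downvote might look like; the guideline text is quoted from the comment above, but the Downvote record, the reason table, and the collapse threshold are all hypothetical illustrations, not an existing HN or Reddit API:

    from collections import Counter
    from dataclasses import dataclass

    # Hypothetical reason codes a richer moderation UI could attach to a downvote.
    REASONS = {
        "HN6": "violation of HN guideline #6: avoid introducing classic flamewar "
               "topics unless you have something genuinely new to say about them",
        "UNCIVIL": "personal attack or uncivil tone",
    }

    @dataclass
    class Downvote:
        voter_id: str
        reason: str  # one of the REASONS keys

    def should_collapse(downvotes, reason="HN6", threshold=3):
        # Short-circuit (collapse) a subthread once enough voters independently
        # flag the same guideline violation, instead of relying on raw score alone.
        counts = Counter(d.reason for d in downvotes)
        return counts[reason] >= threshold

    votes = [Downvote("u1", "HN6"), Downvote("u2", "HN6"), Downvote("u3", "HN6")]
    assert should_collapse(votes)  # three independent HN6 flags collapse the thread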
~~~
jonahx
good rules and UI can help, but good UI won't change the way people are, and
for any given set of rules, people motivated by strong emotions will find
creative ways to break the rules to get their needs satisfied.
the only 100% reliable solution is to completely remove their freedom of
expression (eg, with an all-powerful moderator, whose interpretation of the
rules is law).
------
Lanari
I notice this the most in tech people; I think it's related to the fact that
we read a lot of stories about people getting fired for a tweet or a post...
------
kiba
I dunno, man. I write defensively as a default and consequently nitpick
every argument made for a position.
On the other hand, I am not adverse to changing my mind on a flip of a good
argument.
If I get offended or defensive, it means that my beliefs aren't as ironclad as
I thought.
There are some areas I pretty much don't pay attention to, though.
Anything concerning creationism, global warming denialism, pro-smoking, god,
etc.
I won't waste my time entertaining obviously wrong arguments.
~~~
krisdol
The article's message is that non-defensive writing traditionally presumed the
author's openness to (in your case) changing their mind, and that today we
have all become defensive writers because there is now an expectation that
someone will use nitpicks about inconsequential points to discredit the entire
piece.
~~~
itp
> today we have all become defensive writers because there is now an
> expectation that someone will use nitpicks about inconsequential points to
> discredit the entire piece.
Even such inconsequential things as, for example, pointing out the parent's
incorrect choice of "adverse" instead of "averse," which has no bearing on the
argument.
There's almost always a mistake, overgeneralization, or convenient shortcut in
anything a person says. Holding people to an impossible standard just adds
stress all around.
------
amelius
There should be a tool to which you could say: here's an idea, please rewrite
it into defensive form.
------
paulpauper
Another possibility is that the writer doesn't actually know if something is
certain or not.
------
Mz
So, about a week ago, I wrote a piece that I feel has a solid point and I
posted it on Hacker News. It hit 39 karma and had 54 comments. Not impressive
for Hacker News overall, but, hey, decent for my own personal blog's track
record.
The top voted comment on it is incredibly dismissive, brushing aside my
6-ish years of college, more than 5 years working in insurance, etc. over one
detail that wasn't thoroughly researched and was, in fact, framed to admit
that it was kind of hand-wavy. I don't think what I said was entirely wrong
and stupid, but it wasn't 100% accurate.
(HN item in question:
[https://news.ycombinator.com/item?id=10819091](https://news.ycombinator.com/item?id=10819091))
But I think I have two choices here:
I can be completely bummed that one person chose to completely dismiss
everything based on one not perfect remark in the piece and others upvoted
that. Or I can value the upvotes the item got, the amount of meaningful
discussion it did generate, the page views it got, and the constructive
feedback that tells me that if I want to write more on this topic, I need to
do a bit more research and firm up my ability to defend my points or random
asshats are going to say "Well, based on one single detail, nothing she says
can possibly have any merit."
But, hey, I have done similar things myself:
[https://news.ycombinator.com/item?id=10828054](https://news.ycombinator.com/item?id=10828054)
So, maybe people doing that isn't all asshattery. Maybe it is more complicated
than that and when someone tells you "x detail makes me not
read/believe/whatever any of it", then that is constructive feedback that you
can choose to try to learn something from so you can improve your writing --
or not, as you see fit.
I have spent a lot of years trying to figure out how to express myself
effectively online. I think part of the challenge is that a) your audience is
incredibly diverse, so lazy writing habits that are rife with unquestioned
assumptions which may be racist, sexist, etc are going to get called out
online even though your personal circle of friends wouldn't have a problem
with it (because they are also kind of racist, sexist, whatever) and b) you
are talking to people who may know more than you about some piece of it and/or
can google up info, either to fact check or just to rebut because they don't
like you, don't agree with your point, just got dumped, are IUI (Internetting
Under The Influence) or for any damn reason.
So, in addition to readers perhaps needing to try to "find the insight, not
the error," authors can also try to focus more on the metrics that make them
feel positive about having written it rather than focusing on the comments
that make them cringe.
It took some self control at first, but I am finding that my experience of the
Internet is much better now that I focus more on counting the positives and,
as much as possible, ignoring the negatives. It does involve some judgment
calls. You can't just turn the other cheek on everything. There are situations
where you need to correct people or clarify your meaning or defend your
statement. But if you are getting traffic and getting comments, even if some
of them are ugly, then you accomplished something.
I am trying harder to decide what I want to accomplish, keeping my eye on
evidence that I am moving that goal forward, and not expecting some "perfect"
experience (for lack of a better way of saying that, as I have other things to
do and this comment has gone on long enough).
~~~
flubert
Is there a way to use comments as an editorial device? A wiki-like solution,
combined with an MS-Word track-changes feature, where you can make changes,
and leave comments and it all shows up on the same page? Like with the health
care article example, the first 90% of that seems like pure fluff (when it
isn't making false, misleading, or questionable statements), essentially
padding added to a high-school report to meet an arbitrary length requirement.
The real meat and the potentially interesting content is then almost lost in
the shuffle. A professional editor would request a complete rewrite. You
should be telling us more about DPC, which is new, interesting, and
potentially very valuable.
~~~
Mz
I have no idea if there is a way to use comments as an editorial device.
Why do you think 90% seems like fluff? Based on comments on Metafilter where
all I did was talk about DPC, I felt the background info was necessary.
But I do hope to write a more professional version and I am in talks with a
publication that may take it. So I am keenly interested in feedback.
Thanks!
~~~
flubert
Here's a version that I commented on in Libre/Open Office (an *.odt file), but
I assume you could probably open it in a modern version of MS-Office, and have
the comments come through fine:
[http://s000.tinyupload.com/?file_id=62873318985673539567](http://s000.tinyupload.com/?file_id=62873318985673539567)
------
mwfunk
I'm seeing way, way, waaaaaay too many commenters here who apparently came
away from this thinking that "defensive writing", as described by the author,
has anything to do with not offending people. That's totally orthogonal to
what the author is talking about. He's talking about the tendency some people
have to take everything as literally as possible, finding some logical hole in
an argument based on a very literal (and often wrong) interpretation, and then
start stupid inflammatory debates in comment threads about it. As far as I can
tell it has nothing to do with political correctness, although people saying
so really deliciously proves his point.
The root of it is sometimes that an author didn't express something very well.
In general though, it's because some readers are predisposed to find the most
literal and uncharitable interpretation of something they read, which they
then use as an excuse to try to show up the author in the comments. These
types of commenters are the types of people who don't assume good faith on the
part of other people, and when they see something that doesn't make sense to
them their first thought is that the person who wrote it is an idiot, rather
than to step back and take a moment to think about what the author might've
meant. These types of commenters are looking for excuses to publicly disagree
with someone, they're not trying to participate in an exploratory discussion
in which the participants don't automatically assume the worst about each
other.
For example, what if someone made an offhand comment about C being the de
facto standard for systems programming for the last few decades? On certain
forums, it is pretty much guaranteed that someone will chime in about all of
the various kernels, research OSes, etc. that have been and are currently
being written in languages other than C, and proceed to argue with the poster
about this fact. Of course, they're right in a sense- it's true that C
accounts for < 100.000% of systems programming. But it's pointless to argue
about as it's based on a strawman from an overly literal interpretation of the
original statement.
And it pretty much goes downhill from there. Overly Literal Internet Guy #2
then has to chime in and attack Overly Literal Internet Guy #1 over some other
strawman based on an uncharitable interpretation of what #1 wrote.
Point being, the entire discussion gets derailed by a handful of people who
refuse to give each other the benefit of the doubt and just really want to
argue. We could have been talking about why C is such a juggernaut in the
systems programming world, or what other languages might make sense for
systems programming, or any number of other interesting digressions. But
noooooo, we can't have nice things because a few people just want to scream at
the author and each other about willful, overly literal misinterpretations of
what was written. As a result, many writers will warp their style to put
hedges and defensive qualifiers around every single statement to avoid all
discussion getting derailed by Overly-Literal-Internet-Guys-Who-Think-
Everyone-Else-Is-An-Idiot. By the time all of those qualifiers and hedges
go in, it's possible that the point of the original statement is diluted or
lost entirely.
And of course, in what I've just written, I myself am no doubt guilty of
assuming bad faith on the part of other people and am just as guilty of the
behavior I'm complaining about. I get it, I really do. This is a type of
cognitive bias that everyone has, including myself. There are no easy answers
when it comes to understanding human interactions or our own mental processes,
we can only try our best. I only take issue with people who think they are
infallible and don't even try to play devil's advocate with themselves when
thinking about things they care about.
~~~
chipsy
There's a trick one can pull if you decide to engage with literal thinkers
head-on, and that is to give them bait by inserting a "great error" somewhere
in the text, for example:
> ...this method is justified, because we know the earth is flat.
This is one of the core techniques in esoteric writing. What the great error
does is intentionally push the dynamic of the conversation away from
agreeability on the surface towards "picking apart and puzzling". People who
aren't inclined to think about things will miss the great error entirely and
go on without feeling the need to remark(since the essay's surface will
usually be framed towards an audience's status quo), but once the error is
discovered, the rest suddenly falls into doubt, and then literal thinkers are
suddenly confronted with an essay that is far more challenging, because your
error was not just an error, but a trap to force them to think about the other
statements carefully in order to "prove you more wrong". It is their thinking,
and not yours, that becomes warped. To pull this off the essay has to hold a
tension between truth and falsity where some statements hold true even under
inspection, while others don't.
This also makes the resulting comment threads much more lively since they will
consist of smart people falling over themselves to show how wrong and stupid
you are, only to find their minds are changed by the end.
------
leppr
Not me.
(please no downvote)
~~~
vlunkr
I assume this is a joke that people aren't getting?
~~~
leppr
Well, low effort one liners are frowned upon for good reasons so I don't mind
the downvotes but I couldn't resist. I just wish the same treatment was also
applied to non-joke useless comments. _" I too have a baby and I love him"_,
ok great.
Show HN: Everything about your movies from the command line - iCHAIT
https://github.com/iCHAIT/moviemon
======
jpstory
I like the idea! Unfortunately I hit an error at 33% of my scan:
    Traceback (most recent call last):
      File "/usr/local/bin/moviemon", line 9, in <module>
        load_entry_point('moviemon==1.0.11', 'console_scripts', 'moviemon')()
      File "/usr/local/lib/python2.7/dist-packages/moviemon/moviemon.py", line 61, in main
        util(args)
      File "/usr/local/lib/python2.7/dist-packages/moviemon/moviemon.py", line 77, in util
        scan_dir(docopt_args["PATH"], dir_json)
      File "/usr/local/lib/python2.7/dist-packages/moviemon/moviemon.py", line 281, in scan_dir
        data = get_movie_info(name)
      File "/usr/local/lib/python2.7/dist-packages/moviemon/moviemon.py", line 300, in get_movie_info
        return omdb(movie_info['title'], movie_info['year'])
    KeyError: 'title'
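Looks like the filename parser returned a dict without a 'title' key. As a rough idea of the kind of defensive fix I'd expect (this is not the project's actual code; parse_filename is a stand-in for whatever moviemon uses to guess the title from the file name):

    def get_movie_info(name):
        movie_info = parse_filename(name)   # stand-in for moviemon's own parser
        title = movie_info.get('title')     # .get() instead of ['title'] avoids the KeyError
        year = movie_info.get('year')
        if not title:
            print("Skipping %s: could not guess a title from the filename" % name)
            return None
        return omdb(title, year)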
Ask HN: Developer workstation setup to promote startup culture? - dotBen
We’re a new startup, about to bring developers on the payroll and have them work out of our new office space.<p>I’m interested to hear HN’ers views on how best to set up developers’ workstations to promote both the startup mentality we want and also for good productivity. Based on my years as a developer at both startups and BigCo’s (and now I’m the co-founder), my proposal is:<p>Developer is provided a brand new iMac and a second Dell screen connected and set 90 degrees in portrait mode (as seen at Pivotal Labs, and others). Dell screen for writing code and the iMac screen for seeing results in the browser (we’re a web app)<p>We want developers to work almost all of the time in the office at the early stage of the company so collaboration between the team can occur. This is the thought behind issuing iMacs over laptops. We also want to create a culture where the work iMac is for dev and you can bring your own laptop in and hook it to our network for your own stuff (email checking, IM’ing, checking your bank online, etc).<p>I would be interested to hear others’ views on laptop vs fixed computer to encourage work to take place in the office, and also the idea of being strict on having work dev machines ‘clean’ of personal apps and use by encouraging the developer to use their own equipment for personal use during work.<p>Thanks
======
samstokes
You don't really say what specific startup mentality you want to promote, but
it seems to me a good generic startup mentality would include creativity and
flexibility. That suggests the strategy: let each developer pick her own work
rig. Give them a budget and maybe some suggestions, and let them spec and
configure.
If the developers you're bringing on board are any good, they'll have a
favourite OS / IDE / byzantine .vimrc / etc. By letting them use it, you'll
gain in productivity, and - particularly if they've previously worked for
BigCos with silly tools restrictions - you'll gain a lot in morale. In my
experience the thrill of "my employer bought me a shiny new Mac" wears off
after a month, whereas "my employer encourages me to use the tools I'm best
with" provides ongoing job satisfaction.
Regarding the work/personal split, it sounds like what you're worried about is
people wasting time at work, getting distracted, or otherwise being less
productive. IMHO that's a motivation issue, not a hardware issue. Good
developers with interesting problems to work on need to be reminded to eat,
not trained to work harder.
~~~
dotBen
The idea of "let the developer pick his own rig" had been mentioned by a lot
of people - it's a fair point and has me reconsidering.
My two thoughts are:
1) __Software__ Granted, the most important software is probably the IDE and
those tend to be cross-OS (you can get eclipse or VIM to run on any OS) but
you get issues on other software like Word. Suddenly that 5-seat license of
Office for Mac doesn't accommodate my windows 7 developer so now I have to buy
more licenses for him cos he uses Windows. And my Linux developer can't open
the document in OpenOffice because it's using change tracking that his version
doesn't support, etc.
We also can't use stuff like SubEthaEdit or Coda for joint development.
2) Scaling a dev team's needs - if we're designing for scale then we might
have 10 developers by the end of 2010. It becomes a nightmare when half of
your dev team is on mac, thee on windows, one on debian/ubuntu, one on fedora,
etc.
But maybe these are not good enough reasons...
~~~
etherael
1) Standardising on office software tools that are essential to the work of
the dev team is in my humble opinion, a terrible idea. Yes, this means the
microsoft stack, it tends to make people jump through hoops they're not fond
of jumping through and in all instances I can think of it's actually better to
be doing it some other way.
2) Standardising on _anything_ that requires Mac OS means that you're pretty
much stuck with people that are happy with Mac OS, and while admittedly that
is not a small number of capable devs, others really hate it, and with very
good reason.
3) Virtualisation can offset "essential" items in the standard operating
environment, as long as they can run on platforms that can actually be
virtualised, which is why Mac OS is such anathema here.
4) Having virtualised environments that are self contained working images of
everything required to "be productive" takes the guess work out of "Does
employee X have everything necessary" and drastically alleviates any crippling
hardware issues, if all it takes to keep going is a virtualisation image and a
new machine, disaster recovery becomes very trivial.
5) If you're really insistent on the whole "having everyone at the office"
idea, you could even try trading out virtualisation for remote access to
various environments. Once again, this is an Everything But Mac OS solution as
OS X remote access is terribly bad compared to Windows or Linux based
solutions (RDP / NX vs VNC)
6) That said, I think tying your team to their desks is a good way to annoy
some people, long term if you decouple your team from the requirement to
remain physically present you are a much more attractive opportunity than
otherwise. Go distributed whenever possible, this means dev environments that
have full deployments of whatever is necessary to work 100% offline. Git over
subversion, make sure they have copies of databases that are required to
actually run the code they're writing, etc etc etc. Long term goal should be
that they can work from absolutely anywhere any time regardless of network
access.
------
BigCanOfTuna
Give each developer a hardware budget equal to the value of a top of the line
MacBook Pro 17"....but each developer 'chooses' the 'laptop' he wants to buy
(be it OS X, Windows, or Linux).
- the developer gets the hardware he wants to work with
- the developer feels a sense of 'ownership'
- the developer respects you for allowing him to choose
Mandate laptops because your super-awesome-change-the-world startup is going to
inspire your developers to work hours outside of the office, even if you do
mandate an 8-5 office day.
Every developer gets two monitors, no exception.
You can't encourage a person to be an Olympic rower by simply providing a
stool and a stick to paddle with.
------
mrshoe
As much as I love Macs, I think I can be more productive writing code on a
linux box. I'm going to assume that applies to everyone else :-), and say you
should save some money on desktops by getting everyone a linux box with 2 Dell
monitors.
Spend that money (and more) on getting everyone a MacBook Pro as well. As
anonjon points out, if you don't give out laptops you might be _discouraging_
working outside the office more than _encouraging_ working at the office. An
important part of startup culture, imho, is to not require face time at the
office.
Shameless plug: I agree with you that encouraging collaboration is key. Since
lots of teams are working in a distributed fashion these days, we think they
need some other way to collaborate effectively. This is why we're building
<http://shoptalkapp.com> . We've found that it even encourages more
collaboration for those that _are_ in the office.
~~~
dotBen
If you buy everyone a linux box, I'm not clear why buy a MacBook Pro too?
If we were going to do that we could just buy MacBook Pros and be done with
it.
~~~
mrshoe
I'm honestly not sure if one can hook up 2 20" or 24" monitors to a MacBook
Pro these days, but even if you can't it's nice to have a desktop with a mouse
and keyboard and not have to plug/unplug your laptop from everything all the
time. Also, I prefer linux on the desktop (for my work machine, not personal)
and Mac OS on the laptop, but that's just me.
~~~
itay
I have a 30" monitor (with dual DVI) at home, and not being able to use it
with my mini-DP MBP is a tragedy, let alone being able to connect a single
laptop to two monitors...
------
nzmsv
This sounds like a pretty terrible culture you want to create. Why control
what people do with your machines at all? (on separating email from
development) It's a completely arbitrary rule that makes no sense. Hackers
tend to question those :)
If you really want to make developers happy, give them a budget and let them
buy a computer. Several people in the comments said they like Mac laptops. I
personally love my Linux desktop machine with a good keyboard. Which is the
point: we are all different, and have different preferences.
If you don't trust your developers that much, at the very least let them
configure their software the way they like. I'm sure your startup needs the
coders to be productive. Don't prevent that.
------
wglb
I think promoting the startup mentality is not really about the smaller
details about the hardware. It is about giving people challenges and a good
collaborative place to work. The hardware won't do that on its own.
The two screens sound like they would help with the immediate task, but I
personally would not buy Dell for anything. Samsung screens are great.
If you are a startup, I would not buy the macs, but I do buy the motherboards,
cases, memory and disks and assemble them. Takes about an hour. You can put
together a kick-ass development system for $500 not including monitors.
Laptops are about half as cost effective or worse compared to desktops.
~~~
dotBen
"I think promoting the startup mentality is not really about the smaller
details about the hardware. It is about giving people challenges and a good
collaborative place to work. The hardware won't do that on its own."
You are right, and my point was not to say the hardware setup is the be-all
and end-all. But the scope of my question for HN in this instance was hardware
setup.
I actually very much disagree with your idea of building your own computer...
the time it takes to order all the components, receive it, put it together,
tinker with the setup, etc is going to be longer than an hour. And if
something breaks/goes wrong with one of the components then it's a nightmare
trying to get it fixed. With an iMac (for example) it's just a case of getting
AppleCare and then if something goes wrong they come by the office the next
day and repair it or replace it. It doesn't matter what is wrong with it, or
which part is at fault
My team's velocity is important and making sure I can be certain their rigs
are stable and can be quickly fixed/repaired is important. (Not every
developer likes to put computers together either, some may not have even done
it before)
I also disagree about the $500 idea - two screens would probably be $500 alone
and then conceivably the developer might want to have windows on there, which
is an extra $200 now (windows, {shudder}).
~~~
wglb
Well, the hour is measured, not theoretical. I have done it myself on about 14
of the computers here.
The price I pay for the hardware is actually $300 retail quantity one, and the
displays are about $200 each. I said $500 in case you want more beef in the
system, say quad cores or such--I use dual. I load these up with ubuntu and
off we go. I would rather spend larger dollars on the chairs (not a joel
joke).
I order from tiger direct and have it the next day.
EDIT: I understand your point about giving them macs instead of windows. My
situation is not web-based, so we are developing for linux servers.
------
Scott_MacGregor
I like the dual screen setup, but I would say no laptops because backing up a
take-home laptop won’t happen on a set schedule with the rest of the
workstations overnight. Plus if someone takes one of the laptops to Starbucks
and gets a virus online, when the machine comes back on your LAN it can create
some real headaches for everyone else. I know people love laptops but that is
my opinion on them, plus I think they run hotter than a box machine meaning
more frequent disk errors.
I would recommend letting your people check personal e-mail, IM, etc..., from
the work machines. It really won’t hurt anything and it will make for a
happier employee. I would also not want personal laptops on the company LAN
because of the chance of spreading viruses or stealing IP.
As far as software, any e-mail client that downloads a copy of the email to
the employee’s machine as well as leaves a copy on the server would be my
recommendation. If the person wants to use the client for personal mail too,
split the mailbox inbox folders based on [email protected] and [email protected].
I would also add MS Word and MS One-Note which a lot of people seem to like.
Personally I like Windows Live Messenger because it can send and receive free
SMS (personally I use it a lot). Also if you’re not backing up in-house add a
backup client like Mozy.com and you're all set.
You might also want to put a policy in place that the employee cannot self
install any software without your approval. Basically, if someone wants to add
Trillian, or their favorite development workbench, they must ask because the
hardware does not belong to them and you will know what’s on the machine if it
fails and must be restored.
~~~
dlaz
Don't require people to get approval before installing new software. That's a
pain for the developer and the person who needs to do the approval and will
likely just result in your developers not trying out tools that might help
them be more productive.
------
petercooper
Do what you like (of course!) but it would drive me nuts to have a "work
computer" and a "personal computer." I love the iMac, but I love my portable
too, and I can work _and_ do personal stuff on both. Personally, I'd suggest
not enforcing anything too tightly (especially that 90 degree screen thing -
not everyone likes it.. I hated it).. but it's your gig, so you can do
whatever you like, naturally.
------
rudd
Why do you want to keep the use of a work computer and a personal computer
separate? I suppose I can understand wanting to keep personal data on your own
computer, but people aren't going to be happy being forced to drag a laptop to
and from work just to check personal stuff during the day. They won't see it
as you being lenient about using personal stuff on the network, they'll see it
as you being very strict about what touches your precious work computer.
You should let people choose what sort of setup they want (within reason). Let
them get a desktop if they just want to work in the office (which seems to be
what you want anyway), and let them get a laptop if they prefer that. Linux,
Mac, Windows, whatever. Let developers choose their own machine setup and
they'll thank you for it. I can't even say how happy I was that I wasn't
forced into a particular setup at my current job.
------
ablerman
The startup culture around here is typically that of flexibility. We work long
hours, we don't keep track of vacation time and I don't care if you sit at
your desk.
Currently, my favorite spot is the office couch. Maybe next week it'll be the
conference room table.
I encourage people at startups to integrate work with their life. It's not
just a job.
That being said, my preferred rig is a macbook with wireless. It comes to the
table for demos on the projector, it can sit at my desk or come to the couch.
------
petervandijck
Mmmm...
It feels like you're treating your developers somewhat condescendingly. Go with
laptops dude, and extra screens (2 large ones each) at the office. That's what
developers like.
------
kbob
"Welcome to StartupCo. Here, take my old laptop. You can use this until you
get your own system. Your budget is $2,000 and that includes any software
licenses you'll need this quarter. Go online and order your new system and
after that I'll introduce you to Ben who knows all about the code you're
picking up. About 20 minutes? Oh, and have it air freighted. I need my laptop
back."
------
bayareaguy
At the last place I worked at with this sort of environment, I found the thing
which increased my productivity the most turned out to be the several open
wireless access points which were purposely setup _outside_ the company's
internal network.
~~~
dotBen
You mean because your internal network was firewalled/restricted etc?
We're a startup so our network is just our router plugged into the internet :)
------
anonjon
Right now I have a laptop hooked up to a 24" monitor. It is sweet.
The nice thing about having it as a laptop is if you have to go collaborate
you can go to a meeting room and both sit down with your laptops. As opposed
to, I don't know, really...
At my current work I often save a lot of time (and save myself from abject
boredom), by bringing my laptop into meetings that are not really relevant to
me, and hacking until the part of the meeting that I need to pay attention to
is happening.
It seems that you are very insistent on the idea that the iMac will cause work
to take place in the office. Really what it will do is cause work to _not_
take place other places.
Let's say I have an iMac and we really need to get X feature over the goal line
soon. If I want to put in a couple of extra hours at night to do that, I'll
have to stay at the office. I don't particularly like the office (and it's
spooky at night there...), so I'll probably not do that.
If I have a laptop, I'll probably just unplug the laptop at 5:00 and take the
darn thing home. Then I can work on whatever while eating caviar and watching
MythBusters on FULL Cable, because of the billions in stock options I've made
from our AWESOME web app.
I mean, if you want people to be at the office most of the time, I think you
are well within your rights as an employer to say, 'Hey, i expect you to be
here between X-hour and Y-hour'.
Anyway, iMac and Dell sounds very branded. I'd go with solid laptops running
some version of Linux (whatever you are deploying on), and the nicest/biggest
secondary screens that you can get on the cheap.
~~~
dotBen
"It seems that you are very insistent on the idea that the iMac will cause
work to take place in the office. Really what it will do is cause work to not
take place other places."
That's actually part of the point. I would like developers to put in good
hours in the office but I don't want to create a culture of 80hr work weeks.
I'm mindful we're a startup and it's about putting in the hours - but my
experience has been that "putting in the hours" often includes the
procrastination in the office during the day OR it means doing lots and lots
of work and the quality drops off as it's not sustainable.
Ukrainian astronomers: "Asteroid could collide with Earth in 2032" - galaktor
http://www.telegraph.co.uk/science/space/10388090/Asteroid-could-collide-with-Earth-in-2032-say-Ukrainian-astronomers.html
======
galaktor
from the article:
Nasa played down the possibility of impact, with Don Yeoman, manager of the
administration’s Near-Earth Object Program Office, saying: "The current
probability of no impact in 2032 [is] about 99.998 per cent.
"This is a relatively new discovery. With more observations, I fully expect we
will be able to significantly reduce, or rule out entirely, any impact
probability for the foreseeable future."
Is medium.com messing with archive sites? - awjr
An article was deleted earlier today and I tried to view it using google cache http://webcache.googleusercontent.com/search?q=cache:443n-Pc8f-wJ:https://medium.com/%40wob/the-sad-state-of-web-development-1603a861d29f+&cd=1&hl=en&ct=clnk&gl=uk only to find the cached page is redirecting to medium's home page.
======
DrScump
If the author deletes the article by choice (I saw this yesterday, in fact),
maybe it deletes from the archive also?
World's Greatest Extra - DanielRibeiro
http://www.youtube.com/watch?v=IdEBu7ODVk8
======
PankajGhosh
First this seemed unreal, then probable.... and then I was able to find him...
Jesse Heiman
<http://www.imdb.com/name/nm1035503/>
State Dept.: Clinton IT aide's email archive is lost - a3n
http://thehill.com/policy/national-security/279233-state-dept-claims-to-have-no-emails-from-clinton-it-aide
======
luso_brazilian
See also the "Bush White House email controversy": [1]
_> The "gwb43.com" domain name was publicized by Citizens for Responsibility
and Ethics in Washington (CREW), who sent a letter to Oversight and Government
Reform Committee committee chairman Henry A. Waxman requesting an
investigation._
_> Waxman sent a formal warning to the RNC, advising them to retain copies of
all emails sent by White House employees. According to Waxman, "in some
instances, White House officials were using nongovernmental accounts
specifically to avoid creating a record of the communications."_
_> The Republican National Committee claims to have erased the emails,
supposedly making them unavailable for Congressional investigators_
It is hopeless to expect voluntary transparency by the government; there is no
incentive to
1) comply with the rules when transparency is clearly detrimental to your
backroom deals or
2) to punish the very people that appointed you to your high positioned job or
3) to punish the opponents in the other party when they fail to comply while,
after 4 years, the positions can be reversed and you may be in the crosshairs
in a similar investigation.
Nothing was done when Bush did it, nothing will be done now that it is Hillary
doing it, and don't expect anything different regardless of which party manages
to snag the Congress and White House.
If regulatory capture is a significant problem in the private sector imagine
inside the government itself. It is hopeless.
[1]
[https://en.wikipedia.org/wiki/Bush_White_House_email_controv...](https://en.wikipedia.org/wiki/Bush_White_House_email_controversy)
~~~
tanderson92
The Citizens for Responsibility and Ethics in Washington (CREW) filed the
initial suits, suing for information related to Hillary Clinton's e-mails.
Then they were taken over by Clinton acolyte and hatchet-man David Brock,
using his super PAC money. They have since had a, um, strategic restructuring.
Incredibly, this line of inquiry has been dropped by CREW.
~~~
DanielBMarkham
If you're interested in corruption in government, the CREW story is incredibly
interesting. Most of these agencies have internal quality control and feedback
systems, notably the Inspectors General.
In many cases these systems are no longer working like they should. NGOs tried
taking over that role.
Now we're seeing the NGOs being "strategically restructured"
Who's left? The press? They suck up to whomever is in power. If they don't,
they get locked out. SV? All they care about is keeping the money machine
going.
I'm looking around and not seeing much of an answer.
~~~
blowski
Perhaps forums like HN and /r/politics genuinely are the best we have.
~~~
TheRealDunkirk
Seriously? The prevailing biases of both, while not making them completely
useless, prevent them from being a useful feedback mechanism. /r/politics sure
isn't going to hold HRC accountable for an illegal email server. It's _why_
/r/the_donald has become what it is.
~~~
llamataboot
My general feeling about r/the_donald is that it is a perfect example of Poe's
Law [1] driving a feedback cycle of continued irrational extremism. I used to
think I had a pretty good eye/ear for spotting subtle sarcasm and trolling,
but I honestly haven't been able to tell what is a serious post in that
subreddit for months now.
[1]
[https://en.wikipedia.org/wiki/Poe%27s_law](https://en.wikipedia.org/wiki/Poe%27s_law)
~~~
Karunamon
I think you may be reading it the wrong way. I've been in their IRC channel a
few times - there is very little trolling happening in there, lots of genuine
enthusiasm.
..combined with a stated and enforced policy of zero tolerance for public
dissent, mostly as a defense mechanism from the divide-and-conquer concern
trolls who tend to infest political communities. They characterize it as a
"24x7 Trump Rally", and that's exactly what it is.
If you're more interested in the discussion than the echo chamber, there's
/r/asktrumpsupporters. You'll find little of value on the main subreddit
unless you actually support the guy.
------
gthtjtkt
Shouldn't they be in the Datto backups? The ones that even Clinton herself
didn't know about because they were kept by mistake (and against her orders)?
The ones that contain the 30,000+ emails she and her lawyers attempted to
delete because they were "personal"? The ones that _still_ haven't been
released despite numerous FOIA requests and a lawsuit?
I have a feeling we're only getting one side of this (the State Department's)
while the FBI already has these emails. The FBI would have no reason to chime
in and tell the public "It's ok, everyone, we have those emails too." And the
State Department obviously wouldn't have them since she wasn't using their
email system.
I'm 99.99% sure nothing is _missing_ here, it's just not where they expected
it to be. Pagliano already struck a deal with the DOJ and appears to be
cooperating, which probably isn't a good thing for Clinton.
~~~
djrogers
> Pagliano already struck a deal with the DOJ and appears to be cooperating
No, not really -
[http://bigstory.ap.org/article/a2bf597d0dec4780af6f8710822ee...](http://bigstory.ap.org/article/a2bf597d0dec4780af6f8710822ee9d7/ex-
clinton-staffer-again-says-no-testifying-congress)
Unless he changes his mind again or is compelled to testify...
~~~
gthtjtkt
He's cooperating with the FBI and DOJ. Do you really think they'd offer him an
immunity deal that allowed him to plead the 5th?
What you're talking about are separate investigations by Senate committees.
It's understandable that he has no desire to participate in those dog and pony
shows.
~~~
snuxoll
I may be mistaken, but judicial immunity automatically waives your fifth
amendment rights since you can no longer incriminate yourself. I can't really
imagine the point otherwise.
------
chishaku
Can't the FBI just query the NSA's impressive archive?
They don't even need a warrant.
[https://www.aclu.org/doj-report-fbi-activities-under-
section...](https://www.aclu.org/doj-report-fbi-activities-under-section-702)
~~~
spacehome
Maybe they did and the rest is parallel construction?
~~~
daveguy
Parallel destruction?
~~~
fab13n
No, parallel construction. That's what law enforcement does when it has
illegally gathered proofs: knowing what they're looking for, they figure out
another way to "find" it again, but this time through legal and seemingly
lucky paths. The latter proof is expected to stand in front of a judge, whereas
the "real", original one wouldn't. They might also do it to protect a
confidential informant.
Essentially, it's to legal proofs what laundering is to money.
~~~
giardini
Daveguy was making a joke with the "parallel destruction". I think he already
knows what parallel construction is, but your clarification here is well-
placed for those unacquainted with the concept.
------
spoiledtechie
Isn't this the least bit fishy? To say that politicians are held to a
different standard than the average citizen is in full effect here. It's a HUGE
cover up. How is she not in jail yet?
~~~
sebular
You're right, politicians are held to a different standard-- a much more
strict one. Clinton had her own private email server set up. I'm sure that
many people on this forum can back me up on this one, it's quite legal to set
up an SMTP host on your own domain.
The issue is that she was apparently told to use the .gov email address
provided to her by her work, and she did this instead.
People have accused her of doing this for various reasons, but personally I
don't believe there's evidence to show that she did this for any other reason
than wanting to stand out and look special by sending mail from
[email protected], and so that she could use her own BlackBerry.
~~~
morgante
If I told my employer that I refused to use a company email address, they
would absolutely be suspicious.
And that's for a lowly developer job dealing with absolutely no classified
information.
~~~
GVIrish
That's not the right analogy. It would be more like if you were a famous
business person brought in to be CEO, and wanted to keep using your personal
email.
You'd probably still catch some flak but you might have enough power/influence
to actually do it whereas if you're employee #42593 of a big company there's
no way in hell you're gonna get your way.
~~~
morgante
Oh, I agree that a CEO might get away with it. That doesn't make it right.
Similarly, Clinton will probably get away with it. That doesn't excuse her
behavior though. Might does not make right.
~~~
GVIrish
You're absolutely right. I think a lot of people who get to the top in
politics or business have an extremely high level of confidence and
determination, but sometimes that turns into arrogance and a disdain for
rules. Clinton is one of those people that I think has an attitude of 'rules
are for little people'.
Hopefully she'll keep that character flaw in check as president, but I
wouldn't be surprised if she didn't.
------
DanielBMarkham
I have a feeling that this case is going to be studied for years no matter how
it turns out.
In ethics the appearance of impropriety is generally considered enough to be a
breach. I _think_ most independent observers would admit that there's plenty
of bad-looking stuff here, however with the election season in full swing I
could be wrong.
More interesting, I imagine, is the legal maneuverings going on. Assuming that
something went way wrong here, and please enough with the clintonesque _tu
quoque_ arguments, it's fascinating comparing what happened to the witnesses
after the investigation was launched here compared to, say, aaronsw.
In Aaron's case, a school set up a camera and caught him accessing files he
shouldn't. In the Clinton case, scads of sensitive emails were sent all over
the internet from a server that shouldn't be handling them.
The difference in evidence here looks astounding, especially if you know
something about email.
In the first case, a lone prosecutor is able to use the full force and weight
of the U.S. government to harass and threaten. There's a small number of
people involved, limited resources, and lots of press.
In the second case, over a hundred FBI agents are working hard night and day,
one supposes, but evidence goes missing, hard drives which need to be in
evidence are instead sent to secure destruction facilities, and we can't even
seem to be able to locate low-value secondary information. Press consists
of lots of pieces slanted against the government's case.
My money says nothing comes of this. I'd give 50-50 odds that in some fashion
it's tossed to the Congress to consider impeachment, and of course that'll
never happen.
There are a lot more incredible contrasts between these cases. I hesitate to
mention them due to folks getting upset about their political candidate not
looking so good.
~~~
giardini
_"...over a hundred FBI agents are working hard night and day, one
supposes..."_
NBC says there are _12_ FBI agents working on the case:
[http://www.nbcnews.com/news/us-news/fed-source-
about-12-fbi-...](http://www.nbcnews.com/news/us-news/fed-source-about-12-fbi-
agents-working-clinton-email-inquiry-n548026)
titled
"Fed Source: About 12 FBI Agents Working on Clinton Email Inquiry".
Undoubtedly they are a hand-picked group (although whose hand did the picking,
I don't know).
Furthermore I fear that FBI agent Comey is in Clinton's pocket/corner. Comey
gave Hillary Clinton a bye in the Whitewater investigation:
[http://www.westernjournalism.com/wow-something-
from-20-years...](http://www.westernjournalism.com/wow-something-
from-20-years-ago-is-a-major-clue-about-what-fbi-will-do-to-hillary/)
Comey has done well in his career in the years since Whitewater.
~~~
DanielBMarkham
I must be repeating what I heard in March.
"Official: FBI team on Clinton email probe not near 150"
[http://www.politico.com/story/2016/03/how-many-fbi-agents-
hi...](http://www.politico.com/story/2016/03/how-many-fbi-agents-hillary-
clinton-email-221299)
The Washington Post reports "dozens" [https://www.washingtonpost.com/news/the-
fix/wp/2016/03/28/th...](https://www.washingtonpost.com/news/the-
fix/wp/2016/03/28/there-are-147-fbi-agents-involved-in-the-hillary-clinton-
email-investigation/)
------
404error
I still don't understand how someone who is being investigated by the FBI can
run for President of the United States and have this be perceived as a non
issue.
~~~
mikegioia
Welcome to the circus that is the 2016 Presidential Election. She's running
against "The Donald" ffs! The whole thing is just disgusting, top to bottom.
~~~
404error
Ha! Yes, it is disgusting. Can you imagine applying for a job and the employer
finds out that you are being investigated by the FBI. I would imagine that
there would be some questions/concerns about hiring you. You can't just brush
it off and say "Oh that, don't worry about that."
------
Mendenhall
The RNC also said they have no text messages or blackberry messages sent to or
from Clinton during her time in office, lol. The State Department declined to
discuss that; isn't that nice.
"In addition to the emails, the State Department also does not have any text
messages or BlackBerry Messenger messages sent to or from Clinton during her
time in office, the RNC claimed. The State Department declined to discuss that
declaration."
------
pasbesoin
Audit-ability is essential to accountability is essential to effective
government -- in our case, a democracy (well, ok, a republic).
Failure to provide for this should be considered outright failure in one's
job. Failure to serve a fundamental tenet of our form of government.
To the extent it is defined as a crime by our laws, it should be fully
prosecuted.
In the private sector, I've cleaned up far too many ill-defined, ill-
documented messes. Private or public, these things end up amounting to
gross dereliction of duty. I've seen it over and over.
~~~
smileysteve
The counter perspective to this is that Hillary's privately managed server has
better audit-ability (and could be interpreted to higher security standards)
than government IT.
The big "flaw" that I see in this "crisis" is the assumption that government
servers (which have knowingly and publicly been hacked: OPM - or have leaks:
Snowden) are somehow more secure than a given private server. And we can all
probably find both a government and private website that doesn't use TLS/SSL,
SPF, or authenticated SMTP.
~~~
pasbesoin
Thank you. I, for one, don't assume government server configuration and
management is necessarily better.
Another thing that bothers me about Clinton, is that years after Manning's
leaks (which I do not condemn, personally), word was/is that State's document
/ information systems situation is as messed up and deficient as ever.
They had a "big problem", by their own definition and rhetoric. Yet, years
later -- years under her watch -- apparently nothing substantial has been
fixed. (I may be wrong on this, but it's the latest that I recall reading on
the matter.)
------
pdabbadabba
They don't have a PST archive, but they still have at least some of his
emails. FTA:
> “The absence of this email file, however, does not indicate that the
> department has no emails sent or received by him,” she added. “In fact, we
> have previously produced through [the Freedom of Information Act] and to
> Congress emails sent and received by Mr. Pagliano during Secretary Clinton’s
> tenure.”
------
hbrid
Makes me wonder whether NSA is seriously involved. If they were to have
vacuumed up and provided all these deleted emails to the FBI in the course of
this investigation, that will have been the absolutely most benevolent
application of broad surveillance powers I can think of.
~~~
dTal
The NSA quietly and illegally helping an investigation into a presidential
candidate in an election year is pretty much the antithesis of benevolence,
regardless of what you think of Hillary. That kind of activity is basically
worst-case scenario, war-on-democracy level stuff.
------
sschueller
Let's hope the Hacker that claims he got in has a backup and can supply it to
the FBI.
------
bitL
Obstruction of justice anyone? Ooops, wrong caste!
------
diakritikal
No coverups here at all.
~~~
johndevor
Move along, _citizen_.
~~~
jefurii
And pick up that can while you're at it.
------
dccoolgai
If there are emails that were deleted in a .pst file, they are (or at least
were back when I was an MS desktop admin 8ish years ago) entirely recoverable
by remapping the index at the head of the file (i.e. the messages are "soft-
deleted")...should be a piece of cake to recover...
------
abhi3
It increasingly seems that they knew what they were doing since the beginning
when they decided to use the private server rather it being a
mistake/oversight. One can just hope this drama gets over before the
convention.
------
mariusz79
NSA should have a copy...
------
OliverJones
.pst? Microsoft Outlook? Really? Wasn't that software package very exploitable
in the 08-12 timeframe? It seems likely the server was some version of
Exchange.
Why would anybody in her right mind -- not to mention a powerful and visible
government official -- operate her own email server?
In that timeframe there were several SaaS email providers that were very
secure. A couple were even certified as HIPAA Associated Business Entities
(for certain limited use cases). That's a high standard of security.
It's not hard to believe that servers for staff use operated by a department
of the US federal government were ancient, poorly maintained, and vulnerable.
It's almost certain they had unreasonable usage limitations. "Sorry, your
mailbox contains 24 / 25 megabytes so it's almost full." Plus, they probably
thought they were cool because they had a T1.
But why couldn't a powerful person like Secretary Clinton figure out how to
contract with a competent SaaS? I bet the big email provider on Amphitheater
Road would have jumped at this project.
Finally, how come nobody involved in this cluster***k has ever answered the
question "why?"
~~~
smileysteve
> In that timeframe there were several SaaS email providers that were very
> secure. A couple were even certified as HIPAA Associated Business Entities
> (for certain limited use cases). That's a high standard of security.
Thank you for saying this.
In this crisis we're comparing a government IT department that can't keep
audit logs via .pst, imap, or exchange with a non-cloud-hosted private email
server.
Both should be considered inept in 2008 and certainly now. Unfortunately, the
assumption is that the government IT is somehow trustworthy.
------
atemerev
I'd look at the same place where those IRS email backups are. Ask Lois Lerner.
------
jpollock
After the Microsoft anti-trust trial, people started to realize just how much
information there was in an email history log. Later trials just hammered it
home.
In response, everyone added a chapter to their document retention policy to
cover emails. Emails are typically kept for 1 year, with exceptions for
specific named individuals (CEO, Member of Parliament, Senator, Secretary of
State, you get the idea). Secretary of State's retention period seems to be 30
years [3].
Individuals would be able to mark a document to be kept, but that's the
exception rather than the rule.
As soon as a court case where a person is mentioned starts, that person's
documents are typically retained regardless of schedule - destroying docs that
might be material is really frowned upon by the courts.
All of this is typically mediated by a custom document retention server which
costs a lot of money.
The state department has a document retention policy [1], and I think this
section [2] covers email. It lists retention at TEMPORARY:
Transitory Files (including in electronic form)
TEMPORARY: Destroy immediately, or when no longer needed for reference, or
according to a predetermined time period or business rule (e.g., implementing the
auto-delete feature of electronic mail systems).
There might be something specific, but they all seem to be 1-2 years.
With that in mind, let's look at this specific situation. The controversy
started in March 2015. That puts Feb 2013 (the date the RNC is saying marks
the end of the period they want revealed) outside of the document retention
window (Controversy Start - 2 years).
However, once asked you have to _search_ for a record. So the State Department
would have searched through backups, there may be an old tape sitting around
containing a backup of his personal computer. Nope, they couldn't find a copy
of the PST (Microsoft Exchange or Outlook!). They would have found emails that
he was party to where the other person was on a longer retention schedule.
They may still find others if he sent emails to departments that don't have
retention policies.
Since this is a government department, all of this is defined by statute -
it's up to Congress to set the time periods. If they want it changed, they
need to pass a bill and then fund it because document retention is expensive
(think format conversion).
Finally, I understand that finance companies (stock traders) have additional
retention requirements, even covering chat/SMS/etc messages.
There is no cover up here, just proper application of good policy.
[1]
[https://foia.state.gov/Learn/RecordsDisposition.aspx](https://foia.state.gov/Learn/RecordsDisposition.aspx)
[2]
[https://foia.state.gov/_docs/RecordsDisposition/A-03.pdf](https://foia.state.gov/_docs/RecordsDisposition/A-03.pdf)
[3]
[https://foia.state.gov/_docs/RecordsDisposition/A-01.pdf](https://foia.state.gov/_docs/RecordsDisposition/A-01.pdf)
------
jonny-bravo
"Lost"
------
ck2
Oh just give this nonsense up.
WTF are y'all hoping to find? Seriously.
If you have issues with Hillary there are plenty of policy issues you could
attack and at least I'd respect your right to argument.
------
awt
Maybe Lois Lerner has it?
------
Cozumel
Well, that's convenient!
------
a3n
Huh.
~~~
reefoctopus
>The State Department has lost all archived copies of the emails sent to and
from the man believed to have set up and maintained Hillary Clinton’s private
email server during the four years she served as secretary.
Most honest cities - snambi
http://www.rd.com/slideshows/most-honest-cities-lost-wallet-test/
======
function_seven
Any statisticians want to chime in here? How many wallets should be dropped in
each city before you can reasonably rank them? Or, is it statistically
significant that Lisbon was 1/12, while Helsinki was 11/12? In other words,
even with the small sample sizes, are those two values far enough apart to
state the difference with 95% confidence?
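As a rough back-of-the-envelope check (my own numbers plugged into a standard two-sided Fisher's exact test via scipy, not anything from the article):

    from scipy.stats import fisher_exact

    # Helsinki: 11 of 12 wallets returned; Lisbon: 1 of 12
    table = [[11, 1],   # Helsinki: returned, kept
             [ 1, 11]]  # Lisbon:   returned, kept
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    print(p_value)  # roughly 0.0001, comfortably under 0.05

So that particular gap looks significant even at n=12, though picking the two extremes out of 16 cities after the fact invites a multiple-comparisons caveat; the middle of the table is another story.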
On another note, that site sprang 20 separate trackers/beacons/etc. in
Ghostery. I know at least one was Disqus, but 7 were different beacons, and
the slideshow itself didn't work correctly without disabling it.
~~~
olalonde
I guess the rate would also vary widely depending on which part of the city or
time of the day you drop the wallet.
~~~
danielbarla
Also, quite possibly: day of month (contrast a week before payday vs just
after), temporary local events and conditions (high unemployment one year vs
the previous), and if the wallet contained a tourist's details: the level of
2nd language knowledge predominant in the country (e.g. the old lady in one of
the first slides probably didn't speak English, meaning a barrier to contact
or an opportunity for a second individual to pocket the wallet).
Overall though, if you randomised the drops nicely, increased sample size and
repeated them a few times, it might be an interesting study.
------
cfontes
Sorry but this is almost a random test.
What neighborhoods? What days? What hours?
All that can make a difference and quite a big one.
------
nichtich
A wallet is not a good testing device. It's common sense for a lot of people
that if you see a wallet lying around on the ground, you should never pick
that up. If you do, there will be a bunch of thugs showing up from nowhere,
claiming it's their wallet (it is) and there used to be 1000 dollars in that
wallet (there's not). It's one of the oldest tricks.
~~~
cafard
I have never heard of this trick. It must not be popular in Washington, DC.
------
zethraeus
It would be great to see this methodology taken to its logical end. Decent
sample sizes and some transparency in how drop sites were chosen, what the
wallet contents were, etc.
The measurement of this kind of empathy seems intuitively valuable. I wonder
what it would correlate with. Probably a higher average standard of living,
but maybe more homogenous societal values? A smaller population?
------
Killah911
Meh, article created to maximize clicks, not an objective study by any means.
When you have to click for the next city, that should've been a giveaway.
~~~
bowyakka
The site has, according to Ghostery, 66 trackers on it.
That's insane.
------
wobbleblob
Except for London, the honest to dishonest ranking seems to equate to a
prosperous to impoverished ranking.
~~~
scw
I disagree, there doesn't seem to be a clear relationship with affluence.
Here's the list along with GDP PPP per capita at a country level [1]:
Helsinki, Finland 11/12 35771
Mumbai, India 9/12 3843
Budapest, Hungary 8/12 19497
New York, US 8/12 51704
Moscow, Russia 7/12 17518
Amsterdam, NL 7/12 41527
Berlin, DE 6/12 38666
Ljubljana, Slovenia 6/12 27837
London, UK 5/12 36569
Warsaw, PL 5/12 20562
Bucharest, Romania 4/12 12722
Rio, Brazil 4/12 11747
Zurich, Switzerland 4/12 44864
Prague, CZ 3/12 27000
Madrid, Spain 2/12 30058
Lisbon, Portugal 1/12 23047
n.b. How about 'least honest source of pageviews' for Readers Digest on this
article?
1. International Monetary Fund, 2012 data.
~~~
wobbleblob
I think you need to look at the City level, not country level - I know from
experience that one city in that list is a poor dilapidated hell hole in a
wealthy country - but you're right, London isn't the only outlier.
------
beautybasics
As an Indian, I'm both humbled and surprised to see Mumbai at the top of the list.
------
snogglethorpe
Where's the complete list of cities they tried it in?
... or did they only try the 18 cities included in the slide show, almost all
in Europe?
~~~
function_seven
Yeah, just those 18 cities, 12 wallets in each, for a total of 192 wallets.
~~~
gordaco
16 cities, 16*12=192. The first, 18th and 19th slides did not refer to any
city.
------
riffraff
I got to the bottom of the list thinking "damn it, rome is the most
dishonest?", but it wasn't even in the list :)
------
gordaco
As anecdotal and non-rigorous as this is, I fully expected Madrid (or any
Spanish city) to be one of the last :(
~~~
icebraining
As someone from Lisbon, _ouch_. But it doesn't surprise me, there have always
been plenty of pickpockets in the city, particularly in the more touristy
areas. The cops have identified more than a thousand in the last five years.
That said, as far as I know we fare much better on violent crimes, and in
general I've rarely felt unsafe in the city, even walking alone at 3am and
later.
What is the use of URL shorteners? - panchpunt
I want to understand the significance of URL shortening websites and short URLs besides their use in micro-blogging sites and advertising/campaigning.
======
sim0n
When services like Twitter limit your messages to 140 characters, you don't
want to waste 50 characters on a long URL (e.g.
<http://news.ycombinator.com/item?id=2547123>) when instead you could create a
short(er) 10 character URL instead using a service like bit.ly, etc.
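The mechanics are trivial, which is part of why so many of these services exist. A toy sketch of the core idea (in-memory only, domain made up): map an auto-incrementing id to a base-62 code and store the long URL against it.

    import string

    ALPHABET = string.digits + string.ascii_letters  # 62 characters

    def encode(n):
        # turn a row id into a short code, e.g. 125 -> '21'
        if n == 0:
            return ALPHABET[0]
        code = ""
        while n:
            n, rem = divmod(n, 62)
            code = ALPHABET[rem] + code
        return code

    urls = {}  # toy store; a real service uses a database and logs clicks

    def shorten(long_url):
        short = encode(len(urls) + 1)
        urls[short] = long_url
        return "http://sho.rt/" + short  # made-up domain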
------
endergen
The biggest benefit is that you can track how many people clicked on a link.
That way users know which share links were popular. And for URL shortening
sites themselves they get even better analytics where they get in real time
all the popular shared links.
------
lukeqsee
1. Allow companies to scan content and verify it is safe.
2. Allow companies/people to "brand" links (I guess this is advertising, to
an extent).
~~~
panchpunt
You may be right. But I'm not quite getting you. How bit.ly or
<http://bit.ly/dfMsd> will allow companies to scan content or brand links
~~~
Thomaschaaf
Just add the + at the end to see stats: <http://bit.ly/dfMsd+>
------
jesstaa
URL shorteners are useful any time you need to send a link in text form.
Most sites are terrible at making good URLs, and tend to make them huge and
complicated. URL shorteners help with that. That said, URL shorteners are also
fairly bad for the web and should be avoided in all cases where you have
another option.
------
Mz
The use I am familiar with: If you post a very long URL in an email and send
it to a group, it often will not work for the receivers because it is broken
up onto two lines. Lots of people aren't web-savvy enough to copy and paste
both pieces, and for those who are that savvy it's a huge nuisance. So a URL
shortener makes it much more likely that people will go read whatever you
posted by removing those issues. I shared a URL shortening service with
someone who routinely posted articles to a list we both belonged to and she
was thrilled to pieces and has used it consistently ever since.
------
gcb
goatse
Boost your command line productivity with "fasd" - clvv
https://github.com/clvv/fasd
======
johng
This looks supremely useful. Anyone here using it?
Choosing the right stack for your web application - fallenhitokiri
http://www.hopelesscom.de/2012/9/9/my_stack_is_bigger_than_yours_-_ranting_about_web_applications_and_scalability.html
======
tonecluster
I'll be a contrarian here: don't use mega-frameworks (Django, Rails). Instead,
use micro-frameworks. Stay away from ORMs (I'm looking at you, Django). A
skilled engineer can get a product comprised of small packages up and running
in the same amount of time (and sometimes less time) as one built using a
mega-framework.
(Imagine, using a python example, gEvent + zeroMQ + pystache + db-of-your-
choice. You can do almost anything.)
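As a sketch of what I mean (hello-world only; the port and template are invented), gevent's WSGI server plus pystache is already a working "stack"; swap in zeroMQ and your datastore the same way:

    from gevent.pywsgi import WSGIServer
    import pystache

    PAGE = "<h1>Hello {{name}}</h1>"

    def app(environ, start_response):
        # pull a name out of the path, default to "world"
        name = environ.get("PATH_INFO", "/").strip("/") or "world"
        start_response("200 OK", [("Content-Type", "text/html")])
        return [pystache.render(PAGE, {"name": name}).encode("utf-8")]

    if __name__ == "__main__":
        WSGIServer(("127.0.0.1", 8080), app).serve_forever()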
~~~
notJim
I feel like most skilled engineers would end up re-writing a crappy version of
a framework in the process of writing an app--so why not start with one? I
think it depends partly on whether you're really writing an app, or hacking
together an MVP.
The reason I feel this way is that most apps have a great deal of simple CRUD
(SELECT * FROM [table] WHERE id=? and UPDATE [table] SET [...] WHERE id=?),
which is exactly the problem ORMs optimize for. Any skilled engineer is going
to notice they're writing the same boilerplate SQL -> data structure of some
kind (maybe it's a dictionary type structure) code over and over again and
write a framework around that. Congratulations, you now have an ORM, except
that it's one hacked together by a busy engineer, far less tested, and missing
features that would save you writing a lot of boilerplate code.
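To make the boilerplate concrete, here's the kind of per-table code I mean (sqlite3 and the column names are just for illustration); multiply this by every table and every query shape and you've re-derived the case for an ORM:

    import sqlite3

    def get_user(conn, user_id):
        # hand-rolled row -> dict mapping, duplicated for every table
        row = conn.execute(
            "SELECT id, name, email FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        if row is None:
            return None
        return {"id": row[0], "name": row[1], "email": row[2]}

    def set_user_email(conn, user_id, email):
        conn.execute("UPDATE users SET email = ? WHERE id = ?", (email, user_id))
        conn.commit()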
Meanwhile, your engineer realizes that one should separate view logic from
business logic, and maybe have some glue in between, so you've got a request-
handling layer, a view (+view model?) layer, and something that handles
business logic and database access (or maybe those two things are separate,
depending on your engineer.) Congratulations, you now have an MVC stack!
Except again it's one that's far less tested, has no documentation, and has
been hacked together by your engineer, who I should mention is very busy
trying to write all this boilerplate code.
So now you've got all that taken care of, it's time to allow people to login.
Your engineer is now spending their time writing an authentication system.
Your engineer is certainly skilled, but there are a lot of nuances to get
right with this sort of thing, so your authentication system has a lot of
little security flaws and subtle bugs that you will spend the next several
months fixing.
You see where this is going...
~~~
PommeDeTerre
I think you're only looking at one side of the coin. While there may be
benefits to using ORMs and web app frameworks, there are also many significant
drawbacks. A lot of the time, these drawbacks can wipe out many, if not all,
of the gains.
You usually end up losing a lot of flexibility when using ORMs or frameworks.
If you don't do things exactly the way the ORM or framework forces you to,
you'll be in a world of pain. Of course, there will always be situations where
you need to customize things to your particular case. ORMs and frameworks are
notorious for making this far more difficult than it should be. I've seen
teams waste more time twisting a framework or ORM to handle a unique situation
than it may have saved them in the first place.
It's naive to think that ORMs and frameworks necessarily have fewer bugs, or
better quality, or better documentation, or more tests than custom-written
code. I've worked on enough teams that have had a hell of a time due to ORM or
framework bugs. It's worse when you're using a closed-source ORM or framework
that you can't easily fix directly.
Somewhat related to that, for any sizable or complex application, it's just
not possible to completely avoid understanding the inner working of the ORM or
other frameworks that you're using. Eventually you'll need to dig into them,
whether it's to fix a bug, or to figure out how to do something that's poorly
documented, or perhaps to figure out performance problems. This can be very
time-consuming, often exceeding the effort needed to write, debug and test
custom code.
If you do need to make additions or fixes to an externally-developed ORM or
framework, this can very easily cause headaches down the road when it comes to
upgrading to a new version of the framework or ORM. You end up having to
choose between using an old, patched version of the framework that lacks
critical features or bug fixes, or you can upgrade and try to re-apply your
fixes/additions, or maybe even moving to a whole new framework altogether.
Having to deal with issues like these can make custom code look very
compelling.
~~~
notJim
I'd say that the main thrust of my argument could be summarized as "frameworks
are good, you should use them to write apps." My observation is that often
when people say they don't want to use a framework, they end up trying to
write code that does similar things the framework would do, but because
they're not experts in designing frameworks, they do a bad job of it. An
example would be rather than using an ORM, people write some one-off database
service, and then copy-and-paste and find-replace to write another one. This
leads to a great deal of code duplication, and makes it difficult to fix bugs
and fix performance issues (e.g., to implement a read-through cache, you'd
have to go through each copy-pasted variant and implement the cache, rather
than implementing it once in a base class.)
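For the read-through cache point, the base-class version is roughly this (db and cache are stand-ins for whatever you actually use):

    class CachedRepository(object):
        def __init__(self, db, cache):
            self.db = db        # anything with get_by_id(table, obj_id)
            self.cache = cache  # anything dict-like, e.g. a memcached wrapper

        def get(self, table, obj_id):
            key = "%s:%s" % (table, obj_id)
            value = self.cache.get(key)
            if value is None:
                # cache miss: read through to the database and remember the result
                value = self.db.get_by_id(table, obj_id)
                self.cache[key] = value
            return value

Every copy-pasted one-off database service either misses out on this or reimplements it slightly differently each time.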
It's not that these people are writing a better framework that adapts more to
their needs, it's that they read on the internet "don't use frameworks" and
decided they were smarter than all those other startups that use Rails. The
result is generally a pile of half-complete, poorly-conceived ideas centered
around decisions that no longer make a lot of sense.
I say all this having used, seen and maintained many a homegrown framework in
my time as a contract developer. I'm sure they're out there, but I've never
seen a codebase that doesn't use a pre-existing framework that's better than
one that does. I think if you work on a small team, it is very difficult to
take the time to write a well-designed and well-tested framework _in addition
to_ your main responsibility, which is presumably to make features that users
are dying to give you money to use.
You are definitely right that you can paint yourself into a corner with
frameworks. I suppose my theory is that as long as you're using an open-source
framework, you will always be able to code your way out of whatever situation
you end up in. I also believe (but obviously cannot prove) that in nearly all
of these situations, the net effect of using a framework will be a positive in
terms of both your code quality and your productivity. It goes without saying
that if you choose to use a crappy framework, then you're going to have
problems.
~~~
tonecluster
To be really clear -- I didn't write "don't use frameworks".
I will happily say "Don't use megaframeworks; use microframeworks." And I'll
add: "Read the code in these microframeworks, learn what they do and how they
do it, and start to see ways that the different tools available to you can be
assembled."
I also absolutely agree that one should not create one's own (mega)framework in
order to build an application. But luckily I never said that one should ;)
And, I agree with your last paragraph and can add to it: "I suppose my theory
is that as long as you're using an open-source framework or collection of
microframeworks,..."
In my own experience, over time the use of small, clean open-source libraries
(the microframeworks) beats the hell out of the codebases built on the
megaframeworks. Eventually, you have to replace something, and replacing one
component among many is a hell of a lot quicker, easier, and more efficient than
plucking one out of a larger, less simple monolith.
------
programminggeek
Here's a shortcut: pick the one you and your team know well enough to get the
job done as fast and as reliably as possible. Programmer productivity beats
technical sophistication every time.
Also, if you're worried about "scaling", know that almost all web app scaling
issues are related to caching strategy, database queries, database structure,
and using a queue where appropriate. Rails vs Django vs node vs PHP Framework
X vs Java/Scala vs .NET won't change those problems for you.
~~~
ibejoeb
Choosing your instruments is as much of an engineering problem as the problem
itself. Whatever you wind up with, make sure you arrived there based on some
reasonable criteria. You'll have some give-and-take:
* Is my team productive with it?
* Do the components have an acceptable level of integrity and maturity?
* Is it fashionable enough, or at least not too unfashionable, so that it does not become _the_ deciding factor for anyone who might become involved?
------
treenyc
Helma.org
Ask HN: What do you do to enhance/protect your privacy online? - hajderr
E.g.
* incognito always
* tor browser
* switched X app for Y
...
For me it's on a basic level:
incognito mode
block cookies as much as I can
======
Nextgrid
No accounts with any kind of social media or ad-supported service except if
required for my work (I do need LinkedIn unfortunately). They still track me
but at least they have no legal basis for doing so (I haven't accepted their
ToS/privacy policy) and don't have a persistent account to associate my
actions with, so they have to infer who I am through cookies or fingerprinting,
which doesn't have perfect accuracy (even less so with countermeasures).
Private browsing all the time to automatically clear cookies and not share
sessions between tabs (unfortunately there isn't a better way to automatically
clear cookies with Safari, this is the best solution I know of).
AdGuard as a content blocker with the majority of the blocking lists enabled
(not as good as uBlock Origin, but better than nothing).
No Android (obviously).
Preferring built-in apps on mobile (iOS) rather than third-party alternatives
as the majority of them have spyware and malware like analytic SDKs or the
Facebook SDK in them.
Using the GDPR to try and force companies to be more privacy-respectful. There
are certain local businesses I want to use but can't because of malicious SDKs
in their apps so I try to use the GDPR to force them to provide a toggle or
outright remove them and escalate to the regulator if necessary.
~~~
hajderr
Thanks, too bad this topic was severely down rated so I'm not sure this will
get more comments :)
Ways to Learn to Program Online - iProject
http://thenextweb.com/dd/2012/10/21/so-you-want-to-be-a-programmer-huh-heres-25-ways-to-learn-online/
======
jasim
Check out <http://rubymonk.com> as well for learning Ruby interactively. We've
a free Ruby Primer series that covers the basics of Ruby and a paid
subscription with advanced books. The library is constantly updated with new
content.
------
slap_shot
Perhaps the 'Eloquent JavaScript' point should actually refer to
<http://jsbooks.revolunet.com> and mention Eloquent as the first in a series of
great resources.
------
rodolphoarruda
I stopped reading at the "W3 Schools" topic.
~~~
wyclif
Yes, they shouldn't recommend that. Instead, try W3Fools:
<http://w3fools.com/>
Many of the other recommendations are just great, though.
------
keithpeter
I'd like to be able to write end-user code and 'glue' scripts easily. I don't
want to be a 'developer' especially.
I'm piecing it all together from bash and perl tutorials and some R code sites
at present. There is a bit of cut and paste going on... not good.
I need to scratch my own itches to get the motivation. What I perpetrate will
never see a server, I promise.
Any suggestions?
------
shawndrost
BLATANT SELF-PROMOTION! For those of you that are sick of getting stuck and
demotivated when learning online, my school for programmers opened up
applications last week.
<http://catalystclass.com>
~~~
korussian
I live in South Korea and can't get to San Francisco for a 12-week stretch.
But you have a tremendous idea here and I'd love to be a student.
If you could compress the in-class time to 4 weeks and let me do the rest
remotely, and offer a price-point at roughly one-month of my salary or less,
I'd be all over this.
------
stfu
I wish there was a more affordable alternative to bloc.io. The idea is really
great; this mentee-coach relationship should help a lot in sticking with it
and having a rapid learning curve.
------
finin
This makes me think of Peter Norvig's "Teach Yourself Programming in Ten
Years". <http://norvig.com/21-days.html>
~~~
cciesquare
I really like the 10-year plan. It might seem like a long time, but if four of
those years are college that's about half, and if you do a masters program
that cuts it down even more.
Personally, if you do four years as a CS major but program on the side or
actively practice programming, it's more like 5-6 years. That gives you 4 more
years where you go out and get programming work experience in the real world.
In that situation, 10 years isn't much.
------
jdaudier
New kid on the block to learn Ruby, Python, and JavaScript:
<http://www.learnstreet.com>
------
coreypnorris
Forgot teamtreehouse.com
------
dschiptsov
/s/Program/Code/
~~~
klibertp
Maybe, but I feel that learning to code is a prerequisite to learning to
program. Or I should say: one of the possible prerequisites, just the most fun one.
Wikipedia bans agenda-driven editor, but punishes the messenger too - k1m
https://wikipedia.fivefilters.org/banning/
======
bhouston
Wikipedia internal solidarity politics are just messed up if you approach the
inner circle. There is no fixing it. There is a lot of behind the scenes back
scratching, horse trading, and manipulation via back channels between the
members. Just stay away from them and avoid their gaze. It has also been like
this, it isn't new nor does it mean there is impending doom for Wikipedia. It
is just how it is. It is like the black whole at the center of the galaxy, you
want to just stay away from it.
~~~
cm2187
Well, some people care about an organisation's internal politics. That's
basically what the "delete uber" movement is, otherwise why would the clients
care how they treat women internally?
But in the case of Uber, the internal company politics don't really affect my
ride from point A to point B. The editorial policy within Wikipedia however
may have an impact on the content I read. Particular on politically sensitive
topics, it is important to keep a critical mind and be aware of the intentions
of the authors.
~~~
majormajor
Seems like an awfully big jump. You can have a crapload of internal politics
without harassment, sexism, or anything like that. You'll probably also get
some of those in any sufficiently large group of people, but it's possible to
respond to them well _and still be a political minefield_ in terms of
interpersonal relationships and power games.
~~~
ethbro
Avoiding situations like this has swung me around to the profound organizational
health effects of regularly shuffling director+ level personnel (e.g. every
2-4 years).
What you lose in efficiency (as they refamiliarize themselves), you more
than gain in healthier politics, cross-organization cooperation and knowledge,
and an overall following of rule of law / documentation.
~~~
Gibbon1
Reminds me of chatting with a director level manager at a fortune 500 firm. He
was on vacation between assignments. When I asked why he was being reassigned
he said the company moves them around every 2-3 years.
Another example: a friend lives in a small rural town. The county sheriff's
department rotates deputies every two years, because a deputy needs to be
willing to arrest respected members of the community for drunk driving and not
be taking bribes from the local meth dealer.
~~~
jhayward
> _Reminds me of chatting with a director level manager at a fortune 500 firm.
> He was on vacation between assignments. When I asked why he was being
> reassigned he said the company moves them around every 2-3 years._
This is a healthy practice in larger organizations. Not only does it avoid the
concretion of personal politics but it exposes managers to a breadth of
experience, which is useful if they are promoted. It also ensures that the
organization has stretched its adaptation muscles regularly, and is not
brittle in the event of personnel turnover.
~~~
Gibbon1
One of the issues I found at the first company I worked for was that, because
there was little turnover, there wasn't a lot of the horizontal knowledge
transfer you get at more dynamic companies. It meant their processes,
technologies, etc. were stale and getting staler as time went on. Which is why
I left.
The best thing that happened to them was that they let one of my utterly
burned-out coworkers go, followed by the exit of myself and another more senior
engineer. That forced them to hire three outsiders within a short span.
------
sverige
The idea that _anyone_ (including me and you, dear reader) is free from bias,
especially when discussing politics, seems painfully naive after reading
lots of history. And yet, the claim that this or that person or organization
is 'neutral' is very common.
Further, the desire to create a 'neutral' 'encyclopedia' online is presumably
what drives the entire Wikipedia community. I'd wish for a Wikipedia that
consistently allows all viewpoints to be given an opportunity to be presented
on any given topic, but I have no faith that such a thing is possible given
the pervasiveness of the idea of neutrality and the interests at work to
preserve that myth.
~~~
throwaway5752
That is nihilism, plain and simple.
You can recognize bias and try to avoid it, like Wikipedia, by requiring
credible citations and having processes to keep editorial bias from impacting
the end result (which it obviously did, eventually, as we're reading about
here). Like CNN, the WSJ, NYT, Bloomberg, WP, and many other sources of
information (across the political spectrum) do every day.
Or you can go the route of Fox News and take a pronounced editorial bias that
takes clear precedence over reporting facts. I would post video of Jeanine
Pirro or Hannity, but it seems unnecessarily crude.
~~~
SlowRobotAhead
>having processes to avoid editorial bias from impacting the end result
>like CNN, WP, NYT
Um... really? Are you saying those three specifically aren’t putting out very
biased results? You could be left of Marx and know that isn’t true for those
three specifically. Esp CNN!
The only explanation here is a different topic, that you yourself are so
biased that you think CNN is putting out good journalism.
~~~
cm2012
CNN, WP, and NYT basically do just present the facts in their reporting. To
the extent that it's a big deal when they put out a falsehood and people have
gotten fired. Opinion sections are obviously different.
~~~
monocasa
At least during the 2016 primaries, they were basically just mouthpieces of
the Clinton campaign.
~~~
cm2012
That's just not true.[https://fivethirtyeight.com/features/how-donald-trump-
hacked...](https://fivethirtyeight.com/features/how-donald-trump-hacked-the-
media/)
~~~
monocasa
[https://www.salon.com/2016/11/09/the-hillary-clinton-
campaig...](https://www.salon.com/2016/11/09/the-hillary-clinton-campaign-
intentionally-created-donald-trump-with-its-pied-piper-strategy/)
The Clinton campaign specifically asked the news media (including CNN, WP,
NYT) to increase coverage of Trump (as well as Carson and Cruz). This is so
that they'd either have someone easier to beat in the case of these 'pied
piper' candidates, or pull a stronger opponent farther right come the general.
Nate Silver didn't know about this in March of 2016 when he wrote that
article.
~~~
SlowRobotAhead
This is the most inconvenient truth of the entire 2016 Campaign. Hillary
requested Trump to be propped up as someone she "knew" she could beat. If she
had just played fair, she might have actually won against Jeb who the GOP
actually wanted.
Don't like Trump? Thank Hillary Clinton, the DNC, and the media for playing
along.
------
kevin_b_er
Wikipedia is like a microcosm of public politics, a large version of a high
school social system, or maybe a cult.
There's cliques, arcane rules, and always always an agenda. It would make for
a fascinating study on human behavior and how it relates to politics in the
real world. Wikipedia internal politics are a scary beast fraught with peril.
And the actual content suffers horribly.
~~~
convery
Indeed, it's always fun to read about a controversial topic one day and find a
totally different article the next. Then watch the talk page, where some news
sources are unacceptable when they support the wrong narrative but
unsubstantiated blogs can be used to push the right one. It's fascinating
although a little scary.
~~~
cheeko1234
Any examples?
------
wavefunction
I think the most productive move that folks critical of Wikipedia can make is
to create an alternative. It's unfortunate that Jimmy Wales and his clique
comport themselves the way they do, because Wikipedia would be much stronger
if they didn't.
I find it useful for browsing on subjects that interest me but I don't tend to
use it for subjects that are too controversial. At least superficially. And
yet given what I know about the biases of Internet denizens I have to wonder
how much of what I expose myself to is accurate and not the result of some
weird personal biases.
I also consider the citations listed in the footnotes which gives me a bit
more confidence than none, I suppose.
~~~
KaoruAoiShiho
How would any alternative be any less biased?
~~~
chris_wot
I had an idea that you could create a Wikipedia with the same content, but
allow for certain, more opinionated alternative articles. A link to these
would be incorporated into the UI of the site and by default the best, most
neutral one would be shown but you would suspend neutrality on the alternative
articles. All the other Wikipedia policies would still apply.
~~~
JCharante
For example, Wookieepedia shows the canon articles by default, but at the top
you're able to select the Legends version of the article, which uses sources
that were retconned. Just replace Legends with a different name/bias, and add
multiple tabs.
~~~
chris_wot
Precisely my idea!
------
Kim_Bruning
Fivefilters.org is very good at cherry-picking; they are quite ...selective...
with the truth. Take everything they say with a barrel of salt, and keep your
eyes peeled for what they _leave out_.
Philip Cross cannot be said to be an agenda driven editor; people checked all
the evidence (One person went so far as to hand-check and score large samples
of his edits; a hellish job!). The facts just weren't there to support that
assertion.
KalHolmann was ultimately NOT the messenger here, and was also not the person
who brought the arbitration case. In fact, when KalHolmann was explicitly
invited to help sort things out, they couldn't back out fast enough. What
little KalHolmann _did_ do brought more heat than light to the situation.
~~~
k1m
> Philip Cross cannot be said to be an agenda driven editor; people checked
> all the evidence (One person went so far as to hand-check and score large
> samples of his edits; a hellish job!). The facts just weren't there to
> support that assertion.
Philip Cross edits in different areas, e.g. jazz and film. He's highly
prolific. He needn't have an agenda in all areas to be called an agenda-driven
editor in other areas. And the fact that he's made over 100,000 edits means
someone who approaches an examination of his edits by looking at a small
random sample of his total output (as the editor you mention did) is not
likely to come up with anything useful (I think this was pointed out to him).
So to say "the facts just weren't there to support that assertion" is not
really true. We've documented his editing in three articles now:
* [https://wikipedia.fivefilters.org/](https://wikipedia.fivefilters.org/)
* [https://wikipedia.fivefilters.org/agenda.html](https://wikipedia.fivefilters.org/agenda.html)
* [https://wikipedia.fivefilters.org/evidence/](https://wikipedia.fivefilters.org/evidence/)
Readers can make up their own minds.
> KalHolmann was ultimately NOT the messenger here, and was also not the
> person who brought the arbitration case.
Who was the messenger, then? As far as we're aware, he was the Wikipedia
editor who attempted to notify the community to the problems when the story
emerged. Only to be shut down by admin Guy - who, it should be mentioned,
appears to be an acquaintance of yours:
[https://en.wikipedia.org/wiki/User_talk:JzG/Archive_155#meat...](https://en.wikipedia.org/wiki/User_talk:JzG/Archive_155#meatball:ExpandScope_wrt_User:Philip_Cross)
> In fact, when KalHolmann was explicitly invited to help sort things out,
> they couldn't back out fast enough. What little KalHolmann did do brought
> more heat than light to the situation.
He withdrew because of the way he was treated by Wikipedia admins, including
Guy (your friend/acquaintance), and the arbitrators who started censoring his
contributions. Anyone can read his contribution to the case here:
[https://en.wikipedia.org/w/index.php?diff=845422299&oldid=84...](https://en.wikipedia.org/w/index.php?diff=845422299&oldid=845421753&title=Wikipedia%3AArbitration%2FRequests%2FCase%2FBLP_issues_on_British_politics_articles%2FEvidence&type=revision)
and the edit history which shows editing of his contributions (mostly
permanently deleted now by the arbitrators):
[https://en.wikipedia.org/w/index.php?title=Wikipedia:Arbitra...](https://en.wikipedia.org/w/index.php?title=Wikipedia:Arbitration/Requests/Case/BLP_issues_on_British_politics_articles/Evidence&offset=&limit=500&action=history)
> Fivefilters.org is very good at cherry-picking; they are quite
> ...selective... with the truth.
You've not really demonstrated that with your contributions here.
~~~
Kim_Bruning
I am basing my comments on those pages you mentioned. I already communicated to you
about some of the omissions I noticed before. I am explicitly cautioning
readers who read those pages that I noticed a lot of unbalanced omissions.
They can then be on the lookout for that and indeed check and decide for
themselves.
~~~
k1m
I don't think your earlier comments contained much substance, but were more an
attempt to deflect and defend a Wikipedia admin with whom you are perhaps
acquainted. But I will link to those earlier comments below for the interested
reader. We've had three articles about this story discussed on Hacker News,
including this one. Here are your comments on the previous articles, and my
replies:
* A Wikipedia editor's long-running campaign: [https://news.ycombinator.com/item?id=17109849](https://news.ycombinator.com/item?id=17109849)
* Update: The agenda-driven edits of Philip Cross and Wikipedia's response: [https://news.ycombinator.com/item?id=17169786](https://news.ycombinator.com/item?id=17169786)
~~~
Kim_Bruning
I'm familiar with a lot of wikipedia editors. That doesn't mean I agree with
all of them all at once. (Even if that was possible, which it definitely isn't
:-P )
As you observed: after reading your first story, I managed to convince JzG to
at least consider changing tack wrt Kalholmann. Which they did!
Unfortunately Kalholmann was a very bad choice for a champion. In fact, they
decided to post Personally Identifying Information right in full view of the
arbitration committee; even after being advised not to! (this behavior is at
best unethical, arguably illegal, but definitely against the rules!)
In practice, the ironic fact is that JzG turned out to be more effective for
your cause than Kalholmann!
I guess I'm just surprised that you're still sticking with your old story,
even despite the added clarity of 20/20 hindsight.
You appear to be rather prone to cherry-picking. I'm not sure whether this is
by accident or on purpose. (I'm also not sure if it matters; either way it
causes harm!)
[https://en.wikipedia.org/wiki/Cherry_picking](https://en.wikipedia.org/wiki/Cherry_picking)
~~~
k1m
> Unfortunately Kalholmann was a very bad choice for a champion. In fact, they
> decided to post Personally Identifying Information right in full view of the
> arbitration committee; even after being advised not to! (this behavior is at
> best unethical, arguably illegal, but definitely against the rules!)
Care to provide some evidence?
Worth pointing out that Philip Cross never claimed to be using a pseudonym. In
fact he claimed the opposite, right on Wikipedia: stating that he wasn't using
a pseudonym. He linked his Twitter account to his Wikipedia account. None of
this was exposed due to sleuthing/doxxing - it was simply what he himself
stated. So there was understandably a lot of confusion around posting
personally identifiable information (ie. his name) when he himself claims on
Wikipedia that that is what he's called and he's not operating under a
pseudonym.
Also, did you see Guy/JzG's own doxxing effort?
[https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests...](https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/Case/BLP_issues_on_British_politics_articles/Workshop#Intimidation_campaign)
(he outed someone and another admin had to revision delete his outing).
Curious that you don't consider that a big offense, but are happy to accuse
Kal Holmann of something without providing any links.
> In practice, the ironic fact is that JzG turned out to be more effective for
> your cause than Kalholmann!
It's hilarious that you're still attempting to defend JzG. How can we know he
was more effective? We can't see how this would have turned out had JzG not
interfered. So why speculate? But there's plenty he did wrong, and faced no
consequences for, which we are attempting to highlight.
> I guess I'm just surprised that you're still sticking with your old story,
> even despite the added clarity of 20/20 hindsight.
You've done little to bring clarity here.
~~~
Kim_Bruning
Right, so JzG clearly had it in for you. And yet somehow their actions ended
up getting the exact result you were hoping for (as reported by you).
In the mean time Kalholmann was your best friend ever... and somehow every
time they tried something, they got themselves in more trouble and ultimately
got sanctioned by the arbitration committee. (also as reported by you)
Yup, right, clearly your narrative makes perfect sense here. Good job, well
done. ;-)
( Just because someone doesn't immediately prostrate themselves and profess
their undying loyalty to your cause, that doesn't mean they can't effectively
be your friend-du-jour. Vice versa, just because someone does _say_ they
support you doesn't mean they're automatically competent and able to help.)
~~~
k1m
> Right, so JzG clearly had it in for you. And yet somehow their actions ended
> up getting the exact result you were hoping for (as reported by you).
We never said JzG had it in for us. We highlighted problems with his conduct
in relation to the case. And what we consider the double standards of the
arbitration committee when it came to certain parts of their decision.
> In the mean time Kalholmann was your best friend ever... and somehow every
> time they tried something, they got themselves in more trouble and
> ultimately got sanctioned by the arbitration committee. (also as reported by
> you)
Your comment here suggests that you think the arbitration committee can do no
wrong. We're highlighting what we think is unfair treatment in this case (with
supporting evidence). The fact that the story got voted to the front page of
Hacker News shows that many others find it convincing too. The many comments
here also show that people have had their own experiences of unfair treatment
participating in Wikipedia.
> ( Just because someone doesn't immediately prostrate themselves and profess
> their undying loyalty to your cause, that doesn't mean they can't
> effectively be your friend-du-jour. Vice versa, just because someone does
> say they support you doesn't mean they're automatically competent and able
> to help.)
You're stating the obvious here, but we don't need to talk about
hypotheticals. We can examine the conduct of everyone involved. We've tried to
support our position by showing evidence of what we consider unfair
treatment. I haven't been convinced otherwise by your contributions, but maybe
others will be.
------
gadders
Some of the back story to this is an argument between journalists Oliver Kamm
and Neil Clark.
Not picking any particular side, but there seems to have been some real life
disagreement that is playing out on Wikipedia.
[1]
[http://oliverkamm.typepad.com/blog/2006/11/neil_clark.html](http://oliverkamm.typepad.com/blog/2006/11/neil_clark.html)
[2] [https://www.craigmurray.org.uk/archives/2018/05/the-
philip-c...](https://www.craigmurray.org.uk/archives/2018/05/the-philip-cross-
affair/comment-page-1/)
[3] [http://neilclark66.blogspot.com/2016/10/a-sign-of-times-
vici...](http://neilclark66.blogspot.com/2016/10/a-sign-of-times-vicious-
vendettas-of.html)
------
chris_wot
My initial reaction to this was that this was typical. I was similarly
treated. But frankly, I know Guy and something doesn’t seem quite right.
I’m not certain we have heard the full story on this.
One thing I will say: I often regret creating the admin’s noticeboard. It
helped centralise control and the incidents offshoot is a cesspool of
conflict. It often is not managed well at all.
~~~
mirimir
I gotta say that, after reading this,[0] I am very glad that I never started
contributing to Wikipedia.
0)
[https://en.wikipedia.org/wiki/Wikipedia:Administrators%27_no...](https://en.wikipedia.org/wiki/Wikipedia:Administrators%27_noticeboard/Archive299#Philip_Cross)
~~~
lucideer
I don't know Guy, but purely from that linked discussion (Guy on George
Galloway):
> _He is without question a controversial figure, and not in a good way_
That is, without any need for further context, a statement of bias, and if the
user Guy is espousing such views of relevant subjects here they absolutely
shouldn't be involved in decision-making of any kind on this topic. This
follow-up statement is worse again:
> _... and allow PC to definitively clear his name_
Presumption of innocence is one thing, but presumption of a future outcome of
arbitration is quite another.
It's pretty clear where Guy stands on this from the outset, and it's not good.
~~~
lobotryas
Are you saying that people can't be controversial in "not a good way"? I hope
not.
Whether this applies to Galloway is a matter of popular opinion. Look at
Assange. First he was hailed as a hero in the West; now he's looked at as a
Russian agent.
~~~
lucideer
> _Are you saying that people can 't be controversial in "not a good way"? I
> hope not._
Of course not. I'm just saying that judgement of whether it is in "a good way"
or "not a good way" will always be subjective, and representative of a biased
viewpoint. Noone is completely unbiased but espousing such a viewpoint
publically on a thread about arbitration isn't in keeping with
responsibilities of Wikipedia's admins.
> _Whether this applies to Galloway is a matter of popular opinion._
The popularity of the opinion should not be a qualifier for it to make up a
part of a Wikipedia aministrator's discourse on non-biased arbitration.
------
chris_wot
Tweet from Jimmy Wales:
[https://twitter.com/jimmy_wales/status/994264792062885895?s=...](https://twitter.com/jimmy_wales/status/994264792062885895?s=21)
~~~
dmix
Sounds like there's a lot more to this story than it seems at first...
I really hope this is not based in some right vs left wing political drama and
is legitimately about abusive behaviour.
Listening to the BBC show (as I'm not very familiar with this story), it
sounds like it's a lot of one and a little of the other
[https://www.bbc.co.uk/programmes/w3csws6q](https://www.bbc.co.uk/programmes/w3csws6q)
~~~
lobotryas
Certainly feels like someone found a reason to kick a right-leaning admin off
Wikipedia. Just because someone has an agenda and seeks out negative
information about people doesn't mean said information is wrong.
~~~
phkahler
But it does mean they have an agenda.
------
kaendfinger
Cue the people trying to claim that anyone in this entire world is unbiased.
I'm biased, you're biased, everyone's biased. CNN is biased, NYT is biased,
Fox News is biased, NPR is biased. We should all stop pretending that our own
news source is unbiased. It's simply not true.
~~~
manfredo
Sure, but a reputable source actively tries to mitigate bias. Not to mention,
Wikipedia is supposed to be a repository of factual information as opposed to
news publications that often deliberately focus on their own opinions rather
than facts - not that editorials are inherently bad, just that news
publications are distinct from encyclopedias.
------
FlashGit
English wikipedia on various aspects of 'British' history is pretty biased and
prejudiced. There is the typical careful framing to tilt perception positively
and negatively as required, such a desperate bunch.
------
sodosopa
I guess I won't donate this year.
------
tomtimtall
This likely won’t have huge concequences for Wikipedia, but they lost at least
one donor.
------
holstvoogd
ugh, i think it is time to scrap the internet. everything is being corrupted
to the extent there is no point imo
~~~
kypro
I really miss the web from 10 years ago. It felt so liberating back then.
Everything felt so open and genuine. I made so many friends in the early days
of the internet. I don't even know how, forums, MSN messengers, games. You'd
just add people and talk to them for hours.
Now the internet today seems to be mostly ads, corporate corruption,
government censorship, and political outrage. It's a shame. I don't enjoy the
internet half as much anymore. Maybe it's more my age or that I'm used to the
technology now, but it really feels like we lost something innocent and
special about the early days of the internet.
~~~
maze-le
I don't know how old you are, but my bet is: it's an age thing.
Actually I miss the web from 1999-2004 for the exact same reasons you do, and
2008 was already way downhill for me. But when I think about it a bit more
carefully... How old was I at the time? 14-19... An age where almost
everything is exciting and new: music, parties, friends, philosophy,
technology you name it. I can easily apply this sentiment to most categories
of interest during this time period.
I wonder if, 15 years from now, the people around 30 will ramble about how
wonderful it was back then, when it was all the rage to be on Instagram etc.
and how easy it was to connect to people, and how quickly you could organize
events via social media.
~~~
stallmanite
To add more anecdotes to the fire I definitely find myself pining for the
97-2000 days. Slashdot, goofy Geocities pages, good times.
------
_louisr_
Everything looks good here. The biased Wikipedia editor can no longer edit
pages on post-1978 British politics. The whistleblower can no longer speculate
on any editor's off-wiki behaviour.
No more biased edits from the accused, and no more witch-hunting from the
whistleblower.
Relevant quotes from article:
Philip Cross is indefinitely topic banned from edits relating to post-1978
British politics, broadly construed. This restriction may be first appealed
after six months have elapsed, and every six months thereafter
KalHolmann is indefinitely restricted from linking to or speculating about
the off-wiki behavior or identity of other editors. This restriction may be
first appealed after six months have elapsed, and every six months thereafter
Well done Wikipedia.
~~~
lucideer
> _no more witch-hunting from the whistleblower._
I think the issue raised is the definition of Holmann's actions as
whistleblowing or witchhunting.
If it's the latter, that would seem to imply he shouldn't have engaged in that
witchhunt, in which case the ban on Philip Cross would definitely not have
come to pass. Are you saying this should be the case?
If it's the former (whistleblowing), it can't be defined as witchhunting and
you must agree Holmann should not be restricted.
You can only agree with one of the above and remain logically consistent.
~~~
Kim_Bruning
Kalholmann didn't open the arbcom case, and they didn't provide the bulk of
useful evidence. I'd say they were neither whistleblowing nor witchhunting;
just producing more heat than light (sadly).
~~~
k1m
KalHolmann's post about Philip Cross' conflict of interest was posted on 18
May 2018 on the Wikipedia Administrators' noticeboard. It ended with the
following proposal:
> I request that Philip Cross be topic banned from editing George Galloway and
> the other "goons" with whom he is at war—Matthew Gordon Banks, Craig Murray,
> Nafeez Mosaddeq Ahmed, Tim Hayward (academic), Piers Robinson, and Media
> Lens— all of whose Wikipedia pages Cross has frequently edited. KalHolmann
> (talk) 21:04, 18 May 2018 (UTC)
Five minutes after posting that, your acquaintance Wikipedia admin Guy rejected
it, as he'd done the previous attempt (also within minutes).
[https://en.wikipedia.org/w/index.php?title=Wikipedia:Adminis...](https://en.wikipedia.org/w/index.php?title=Wikipedia:Administrators%27_noticeboard/Incidents&direction=next&oldid=841905271#User:Philip_Cross_has_COI)
As it turns out, two months later, the arbitration committee essentially ruled
agreeing with his initial request (and in fact broadening the ban to all of
post-1978 British politics).
So your claim that he was "producing more heat than light" is nonsensical.
~~~
Kim_Bruning
Be that as it may initially; ultimately JzG relented, opened an Arbcom case,
and explicitly invited KalHolmann to put forward their position.
One of KalHolmann's key actions in response was to seek to be removed from the
case entirely!
[https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests...](https://en.wikipedia.org/wiki/Wikipedia:Arbitration/Requests/Case/BLP_issues_on_British_politics_articles/Workshop#Remove_KalHolmann_as_party_to_this_case)
Do you deny that this happened? Would you put forward the position that
KalHolmann changed his mind later? Or (checking the record) do you see that
mostly other people took over and examined the case instead?
~~~
Kim_Bruning
-- Seeing your other comments, I do see some timeline issues in the above
statement which I need to double-check.
(+edit): Ah, here's the first statement by KalHolmann on 26 may, which is a
bit more ambiguous.
[https://en.wikipedia.org/w/index.php?title=Wikipedia:Arbitra...](https://en.wikipedia.org/w/index.php?title=Wikipedia:Arbitration/Requests/Case&diff=prev&oldid=843054921)
It could just be that KalHolmann isn't sure of their footing: It's still not
the most brilliant of openings in building a case against Philip Cross of
course. I was disappointed.
~~~
k1m
I can't speak for Kal, but if you examine his actions from the start (I quoted
his 18 May 2018 post, which precedes the one you linked, here it is again
[https://en.wikipedia.org/w/index.php?title=Wikipedia:Adminis...](https://en.wikipedia.org/w/index.php?title=Wikipedia:Administrators%27_noticeboard/Incidents&direction=next&oldid=841905271#User:Philip_Cross_has_COI))
it appears he simply wanted to notify the community of Philip Cross' conflict
of interest and to suggest a topic ban to prevent Cross from editing pages of
people he had a conflict of interest with. There is nothing to indicate he
wanted to open an arbitration case, nor be party to one. So that all occurred
because of Guy (JzG) who after blocking all of Kal's attempts to start a
discussion about this, finally started one himself in which he misrepresented
the issue (by making the whole dispute out to be about Galloway) and falsely
accused Kal of being a supporter of Galloway, among other baseless
accusations.
In light of that, Kal Holmann's statement which you linked above makes perfect
sense. He was dragged into an arbitration dispute by a Wikipedia admin
(Guy/JzG) who you seem very keen to defend, and then posted a statement
denying the accusations levelled against him. Wikipedia arbitrators end up
punishing Kal but say not a word about Guy/JzG's actions.
~~~
Kim_Bruning
Well, if Philip Cross's alleged conflict of interest were to be examined and
dealt with, ultimately it would need to be looked at by the arbitration
committee. Once you understand that, other things start falling into place.
JzG changed tack because I poked them and pointed them at your story.
JzG went back to admin's noticeboard to gather further input, and people
confirmed that an Arbcom case would indeed have merit.
At that point, the Arbcom case became inevitable.
Now, as an honorable human being, you can't go and start something, and then
when it happens turn around and get cold feet.
I'm a bit disappointed in KalHolmann because they ran away when things Got
Interesting.
I'm also disappointed in you because your stories turned out to actually not
be as well researched as they seemed at first blush.
I now regret my own part in this :-/
~~~
k1m
> Well, if Philip Cross's alleged conflict of interest were to be examined and
> dealt with, ultimately it would need to be looked at by the arbitration
> committee. Once you understand that, other things start falling into place.
Not true. The community is more than able to reach a decision without
arbitrators. As was done when they voted to topic ban him from George
Galloway.
> JzG changed tack because I poked them and pointed them at your story.
That doesn't excuse or explain his shutting down of Kal Holmann's initial
report of conflict of interest - there was plenty in there deserving of
discussion whether one looked at our story or not. And part of Guy/JzG's
'changing tack' was to misrepresent the issue by making it out to be a dispute
mainly between Philip Cross and George Galloway, confusing other editors in
the process. Something that the arbitrators also failed to comment on in their
decision, but appeared to be aware of because they re-titled his arbitration
case from "George Galloway" to "BLP issues on British politics articles" and
later expanded the scope of the ban even more to "post-1978 British politics".
Quite a jump from Guy's preferred focus of just George Galloway.
> Now, as an honorable human being, you can't go and start something, and then
> when it happens turn around and get cold feet.
You're not making sense. Earlier you wrote "KalHolmann...was also not the
person who brought the arbitration case." So why are you now saying he started
it and then got cold feet?
The fact is he didn't start it. Guy/JzG requested it and dragged him into it
with false accusations. I started to participate in the evidence phase of the
arbitration case myself and was alarmed at the way the arbitrators were
treating Kal so I withdrew too. The rest of the evidence we published on our
own site. I won't rehash what I've already said in my other replies to you
here regarding this; they're easy enough to find. You seem determined to
misrepresent this story without providing any useful evidence. Bit rich to be
accusing us of "cherry picking".
~~~
Kim_Bruning
The wikipedia community disagreed with you there. The consensus was in favor
of sending this to the arbitration committee, with the topic ban being a
temporary first fix. And it did go to arbcom, and arbcom did take action.
On the one hand, JzG is obviously not your yes-man; this particular person has
a mind of their own and presented the case from their own perspective. That
said -on balance- their behavior came out in your favor. (as is abundantly
clear now that we have 20/20 hindsight)
Kalholmann was given every opportunity to set/correct the record from their
own perspective, but they just didn't take it. Hence my dissapointment.
You and I are working with the same set of evidence. I guess the main
difference (if even that) is that I'm not just reading the partial account on
your site, but I am also reading directly from the primary source including
the bits that your team have left out.
Obviously our interpretation differs. ;-)
I guess only you yourself can know whether you are deliberately spinning
things; or whether you are genuinely personally convinced by the narrative
that you present.
~~~
k1m
> The wikipedia community disagreed with you there. The consensus was in favor
> of sending this to the arbitration committee, with the topic ban being a
> temporary first fix. And it did go to arbcom, and arbcom did take action.
You miss the point. You wrote "if Philip Cross's alleged conflict of interest
were to be examined and dealt with, ultimately it would need to be looked at
by the arbitration committee." I was pointing out that that is not in fact
true. The Wikipedia community can make decisions without referring everything
to the arbitration committee. I wasn't expressing an opinion about whether
this case should or should not have gone to the committee.
------
dingo_bat
Agenda driven editor sounds exactly like the kind of editors you need for an
encyclopedia. If you have no agenda, won't you just sit at home and watch TV?
------
baud147258
So this website is going to lose its raison d'être? Also, its soft grey on
white background is hard to read.
Edit: It's just in my browser of choice that it appears as grey on white. It
looks much better on Chrome.
~~~
pc86
What is your browser of choice and why is it displaying websites incorrectly?
~~~
baud147258
IE.
I don't think I need to explain why some websites might not display correctly
with it, even if most websites do display correctly. But I haven't looked into
why this one in particular is not working.
~~~
Cthulhu_
I'm no expert, but I guess it might be a font that renders differently on IE
vs Chrome / other browsers - or it's a font not supported and what you're
seeing is a less legible fallback.
~~~
baud147258
Well it's not a font problem, the issue is that the background in IE is white,
whereas it's dark grey on chrome. It's readable, but not very legible.
~~~
k1m
I'm sorry about this! I should've checked on IE. Just checked, and Edge shows
the right background colour but a chunky font which is difficult to read. And
IE 11 shows a white background. We used Typora to write this up and used its
bundled Night theme. I assumed it had been tested in IE, but apparently not.
I'll make sure we fix this soon.
Edit: It's also possible that we tweaked the CSS it generated, which might
have caused this, so I'd rather not blame it on their theme before I test it.
~~~
baud147258
Considering the IE marketshare, I don't think it's much of a problem to not
test on IE (or even Edge). And I'm less bothered by a browser compatibility
issue than by illegible design.
Also, on Edge I see the same white background as on IE, but I'm on Windows 10
1703, in case it's a version-dependent bug.
~~~
k1m
It appears this was related to CSS variable support.
Typora's themes use CSS3 variables which aren't supported in IE and weren't
supported in Edge until more recently. I tried on Edge 42 (EdgeHTML 17) when I
said the background colour displayed correctly. Perhaps you're using an older
version of Edge (the version is displayed at the bottom of the settings
panel).
In any case, I've added CSS to set the background colour without using CSS
variables. It now shows correctly for me in IE 11 too.
Thanks for letting us know.
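For anyone hitting the same thing: the usual workaround is to declare a plain
value before the var() one, since browsers that don't understand custom
properties simply skip declarations they can't parse. A rough sketch of the
pattern (the selectors and colour values are illustrative, not the actual
theme's):

    /* Older IE/Edge ignore the var() lines below and keep these. */
    body {
      background-color: #2b2b2b; /* placeholder for the theme's dark grey */
      color: #e6e6e6;
    }

    /* Browsers with custom property support override the fallbacks. */
    :root {
      --bg-color: #2b2b2b;
      --text-color: #e6e6e6;
    }
    body {
      background-color: var(--bg-color);
      color: var(--text-color);
    }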
New kernel.org - mrnil
https://www.kernel.org/category/site-news.html
======
kristopolous
Here's what I did on the old kernel.org:
1. I typed in www.kernel.org
2. I looked for the new kernel on the page that loaded.
3. I clicked on the link and my download started.
On the new kernel.org:
1. I type in www.kernel.org
2. I look for the new kernel on the page that loads.
3. I click on the link and my download starts.
Conclusion: Good job. You made the site look more modern without disrupting
the most common workflow.
The only 2 suggestions I have are:
1. The entire container for "Latest Stable Kernel" should be the hit point;
yes, all of it, including those words.
2. When I hover over the tabular download links, my eye can get confused about
which row I'm on. Something purely CSS and really subtle would alleviate that;
for instance:
* changing the row background color
* changing the color of the font for the row
* prepending a UL-style dot to the LHS of the row
* making the border-top and border-bottom more distinct
* underlining or italicizing the words
* etc...
any of these things would help and I'm totally agnostic about what is done; it
would be a quick fix that would really make the thing less prone to human
error (a small CSS sketch of the hover idea is below).
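Something like this would probably do it (I haven't looked at the actual
markup, so the selectors are guesses):

    /* Hypothetical selector for the release table on the front page. */
    #releases tr:hover {
      background-color: #f5f5f5;    /* subtle row highlight */
    }
    #releases tr:hover td {
      border-top: 1px solid #999;   /* make the row edges more distinct */
      border-bottom: 1px solid #999;
    }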
------
SnowLprd
As a member of the Pelican dev team, I think I speak for all of us when I say
we were thrilled to see such a high-profile site running on Pelican. The folks
at kernel.org were even kind enough to mention it on their new site:
<https://www.kernel.org/pelican.html>
Those of us in the #pelican IRC channel had a rousing cheer and virtual
clinking of the glasses. :^)
If anyone has questions about Pelican, please feel free to ask here or on our
IRC channel.
_Edit:_ And if you're going to PyCon in Santa Clara, CA this month, hit us up
on Twitter (@getpelican) to get info about our pre-PyCon meetup.
~~~
jevinskie
Any guides on making a pure "portfolio" sort of page, no blog whatsoever? All
of the examples that I have seen have a blog component. I am just looking to
make some basic header/nav/content/footer type templates. Thanks for your work
on Pelican! I found it two days ago while looking for a static site generator
and I was excited to see it mentioned in the news.
~~~
SnowLprd
That's certainly been requested and is coming in the next version, hopefully
to be released sometime this month. Thanks to the hard work done by Bruno, one
of the other dev team members, you will soon be able to override pages by
specifying a source file and a destination URL:
<https://github.com/getpelican/pelican/pull/623>
With this in place, you should be able to populate your site with pages (e.g.,
/content/pages/{index.md,about.md,portfolio.md,contact.md,etc}) without any
chronological blog content.
------
tbassetto
I knew I saw this design somewhere else:
[http://coding.smashingmagazine.com/2009/08/04/designing-a-
ht...](http://coding.smashingmagazine.com/2009/08/04/designing-a-
html-5-layout-from-scratch/)
Granted, it's not "complicated" but it's also the same colors...
_Edit:_ Smashing Mag is mentioned on Pelican blog:
<http://blog.getpelican.com>
~~~
jff
At least they give him credit:
<https://www.kernel.org/theme/css/main.css>
~~~
SnowLprd
Pelican's default theme includes explicit thanks, and a link, in the footer of
_every single page_ -- which is certainly more noticeable than attribution
inside a CSS file.
------
nuclear_eclipse
I appreciate the move from gitweb to cgit; I've encouraged countless others to
make the same transition, and now have a good reference to point at when I do
so.
------
lucb1e
Nice font, kernel.org
<http://i.snag.gy/05FuX.jpg>
I always love websites with custom fonts that don't render correctly. When I
see a website with great typography, most of the time it turns out to be
Georgia or Arial, or sometimes even Times New Roman (which is a good font,
but simply overused). I've yet to see a custom font that reads better than
correctly applied default fonts.
~~~
mich41
Works for me.
Probably you are using some Linux distribution with misconfigured freetype.
~~~
BUGHUNTER
I knew these kind of answers would pop up, I had one sentence predicting this
in my text, but erased it to let the magic happen.
Are you able to realize that it is, of the many options you have, the dumbest
possibility to just burp a simple "it works here". Would you answer this on a
bugreport?
I assume it was an ironic answer; in this case I congratulate you for holding
the mastership of the highest art of subtle trolling, very inspiring.
In fact you are simulating the typical xtreme-dumb answers perfectly, that are
received on "bad-font-whining" regularly - this is exactly the reason why we
need a major bad-font-campaign, because this is not only a technical problem,
but the roots seem to be hidden in a deeper socio-techno-cultural level of our
brains - that "systems are getting too complex"-stuff leads in the end to the
human brain regressing strongly in certain areas to amoeba-state.
~~~
IvarTJ
> Would you answer this on a bug report?
If I was unable to reproduce the bug with the information given, it seems
reasonable to mention that. Also, the effort required by the bug reporter to
mention basic information on the setup they use is much less than for a web
developer to boot up 5 different systems (including iOS and Android) to test
out each major browser in them.
I'll have you know that I have been wary of using non-default fonts because of
these reports.
------
Nux
Seems like a common sense design, no complaints, but it's funny to notice how
the wider our screens become, the narrower the sites get.
~~~
wtallis
That's no surprise to me - monitors are so ridiculously wide these days that
it's impossible to ignore the need to limit the length of a line of text. When
you're working at 1680x1050 you can kinda ignore it, but with 1920 or more
horizontal pixels, text needs to be broken up into columns, and you've also
got enough horizontal space to have two windows side-by-side without feeling
cramped.
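In CSS terms, capping the measure is nearly a one-liner, with columns as an
option on very wide screens (the numbers are just illustrative):

    /* Keep lines at a comfortable reading length on any screen. */
    .article {
      max-width: 38em;   /* roughly 60-75 characters per line */
      margin: 0 auto;
    }

    /* On very wide screens, columns are one way to use the extra space. */
    @media (min-width: 1920px) {
      .article {
        max-width: none;
        column-count: 2;
        column-gap: 3em;
      }
    }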
~~~
lucb1e
That has always surprised me about Wikipedia. So many articles on recommended
line width, height, font, etc., and Wikipedia gets away with just _one_ style
rule (font-family: sans-serif). Never heard any complaints or noticed that it
was bad typography.
Yeah they have more style rules, but disabling them doesn't visibly change
anything so they're the defaults.
------
parfe
I started using pelican recently and I really enjoy static site generation.
It's amazing how little needs to actually execute server side (blog content in
a database?) It's also much easier to focus on writing when I don't have shiny
objects distracting me.
~~~
devicenull
I wish it were a more mainstream thing. It's impossible to compromise a
statically generated website (discounting webserver/OS level vulnerabilities).
It's significantly more secure than Wordpress running tons of plugins.
~~~
entropie
I was never much of a blogger but I liked tumblogs in the old days.
My webspace could not run Ruby, so I made three attempts at writing a
static site generator. For me, I would take that route again. I have my
favourite editor to write stuff, and who the hell needs a database for a blog
or something small?
I still use the last system I wrote, but now I take notes with it and save
images to a local repository.
------
Hello71
Hm... seems like there's no way to download a tarball in cgit, but I'm not
sure if this functionality was enabled in kernel.org's gitweb either.
~~~
LukeShu
I know that cgit supports downloading tarballs of tags at least, but they seem
to have disabled that. I can't say I blame them, they already host the
official release tarballs, and tag tarballs would only contain the same thing.
------
meej
Well done. I happened to visit kernel.org earlier this week for the first time
in many years and I was surprised at the dated appearance.
------
ramidarigaz
Nice. Looks clean, things are well sorted.
------
ronybc
Nice... Good... Cool... Required. Thanks for keeping the structure the same and
putting in no 'extra features'.
------
edsiper2
Finally a change... but... the font looks weird. Looking at the CSS:
<https://www.kernel.org/theme/css/main.css>
that style was made in 2009!!!
Please, somebody fix those fonts.
------
jff
Oh hey now it looks like every other site on the Internet.
Google Analytics Premium vs. Standard - benblodgett
http://www.google.com/analytics/premium/features.html
======
jonah
For a flat annual fee of $150,000.
Contact sales for more information.
Responsive Typography - maguay
http://informationarchitects.net/blog/responsive-typography/
======
nicholassmith
This is no criticism, but for an article on responsive type/design seeing the
page being slightly off in terms of fluidity made me smile. Try resizing the
article down as narrowly as you can, for me it brought up horizontal scroll
bars.
EDIT: I think the 'most people can't...' is off slightly because good vs bad
is purely a personal, instinctive thing. Everyone can feel a typeface that
_they_ like, and what works for them, but there's an entire collective of
people who genuinely think Comic Sans is the perfect font for pretty much
everything. Which for them it is, same as for the lovers of Helvetica and so on.
~~~
batista
> _This is no criticism, but for an article on responsive type/design seeing
> the page being slightly off in terms of fluidity made me smile. Try resizing
> the article down as narrowly as you can, for me it brought up horizontal
> scroll bars._
As it should. Adaptive page design is about adapting to different screens, and
their design works from an iPhone to a 27" Mac.
It's not about never showing scrollbars when you make the screen narrower than
the intended width.
Really, what should they do, show a one-word, long narrow column of text to
avoid showing scrollbars?
~~~
nicholassmith
Good criticism, maybe I was unfair but resizing a browser pane to the width of
a mobile device brought up scrollbars which was unexpected.
~~~
Chris_Newton
Sorry if this is a little off-topic, but I thought I’d mention it in case
anyone’s curious...
The behaviour of CSS3 media queries is a little counter-intuitive (he said,
politely) in this respect. The widths you match against in a media query
explicitly include the width of any scroll bar[1], yet in general you have no
way to control or even determine the width of such a scroll bar. That means if
you write responsive CSS to match a mobile-friendly width of _w_ pixels in a
media query and then style your content to use the entire _w_ pixel width you
might expect to have, you can wind up with a horizontal scroll bar if your
content is more than a screen long and your browser displays a vertical scroll
bar that eats into the available width.
Mobile devices typically work around this by not displaying scroll bars full-
time anyway, but if you resize a typical desktop browser window to the same
size as the screen on a mobile device, you will sometimes get surprising
results. The adjustment can be relevant if you’re designing wide layouts to
take advantage of modern widescreen laptops as well, though at those
resolutions you might be less concerned about leaving a few pixels spare in
the width to allow for a vertical scroll bar that might appear.
[1] See <http://www.w3.org/TR/css3-mediaqueries/#width>
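To make that concrete, here's roughly the failure mode and one common way to
sidestep it (the 320px breakpoint and class names are just examples):

    /* The width matched by the media query includes any scroll bar... */
    @media (max-width: 320px) {
      /* ...so a column fixed at the full breakpoint width can overflow once
         a vertical scroll bar eats ~15px of it, producing the horizontal
         scroll bar described above. */
      .article-risky { width: 320px; }

      /* Safer: size against whatever width is actually left over. */
      .article-safe { width: auto; max-width: 100%; }
    }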
------
mortenjorck
I have an awed respect for iA's unrelenting attention to detail here, but
ultimately, I believe it shouldn't be the responsibility of the type designer
to grade typefaces to different display contexts.
This is a runtime problem. Designers need more control over the antialiasing
algorithms used by the output medium.
~~~
kijin
I don't think it's that simple.
Check out any high-quality font recently released by Adobe, like Garamond
Premier Pro. It comes in different "optical sizes", one for small print, one
for slightly larger print, one for regular body text, one for large display
sizes, etc. This is in addition to light, regular, semibold, and bold weights
for each optical size. Type designers have been doing this since 500 years
ago, and digital types are just beginning to catch up. You can't have one font
that looks exactly the same in all media, all antialiasing algorithms, and all
sizes and resolutions.
Most people don't even notice, but serious type designers care about these
subtle differences. Besides, antialiasing alone won't solve this problem as
long as screens vary greatly in pixels per inch. No matter how good your
antialiasing algorithm is, text at 96ppi is going to look different from the
same text at 326ppi. Just like text printed on coarse paper will look
different from text printed on glossy paper.
~~~
mortenjorck
Optical size variants (the size at which the text appears in your field of
view, and its size relative to other type in a layout) are different from DPI
variants or paper coarseness variants, though.
Paper coarseness variants will generally be designed around the physics of
ink, to trap it in the junctions of small letterforms and prevent ink
bleeding. Optical sizes tend to be more about the relationship between
headline and body, with things like tighter standard kerning, taller
x-heights, and so on. A DPI variant should only concern itself with aliasing,
and aliasing is a render-time issue. You can design a typeface to counter this
from the foundry side, but a more sustainable solution would be to counter it
from the software side.
~~~
kijin
You're right, they're different. I was just pointing out examples where
foundries produce variations of the same typeface to account for differences
between media.
Anyway, as long as such differences exist, I don't think we'll be able to
convince type designers to disregard them. They want their fonts to look
perfect _now_ , they're not going to wait for smartphone manufacturers to
catch up, and they already have the skills and tools to produce subtle
variations of their typefaces.
A truly sustainable solution would be for every reading device to have 300ppi+
screens so that it doesn't really matter how you shade the last subpixel.
Until we have that, leaving optimization to perfectionist type designers seems
to me like a better stopgap measure than trying to standardize on one or
another antialiasing algorithm across vastly different platforms.
Different vendors like Microsoft and Apple use antialiasing algorithms that
are not only technically different but also based on different UI paradigms.
(MS emphasizes crispness whereas Apple emphasizes preserving shapes.) Neither
is obviously superior to the other, and the choice is to some extent a matter
of taste. So it should be the designer's prerogative to decide whether to
follow either paradigm or to disregard both and enforce his own aesthetic
tastes.
------
fusiongyro
I think we may have too many senses for the word "design" floating around
because this remark reads like an oxymoron: "This is not a design deficiency
of the typeface. It was simply not designed to work as body text for big body
text sizes and dense screens."
I like the idea of "grades" because it seems abstracted somewhat from the
device's details, but the fact that Retina displays are different in portrait
and landscape mode is worrisome. I wonder how this concept is going to scale
as new devices with weirder resolutions come out.
~~~
sp332
"This is not a design deficiency of the typeface. It was simply not designed
to work as an end table."
~~~
fusiongyro
I think the two senses of "design" are confusing the issue. If he's absolving
the typographer for not accounting for retina displays, fine, but it sounds to
me like he's absolving the typographer for making a poor decision with an
after-the-fact assertion that it was intentional. I don't think intending for
something to suck is an adequate excuse.
~~~
sp332
Title fonts can be a lot weirder than body fonts. If tweaking it to look good
at smaller sizes made it look crappy at large sizes, that was an acceptable
tradeoff at the time.
~~~
fusiongyro
Modern fonts, at least OpenType ones, can embed different information
for different sizes pretty easily. If this problem could be solved by using a
more modern font format and better quality fonts, that seems like a better
approach.
------
juiceandjuice
Resolution on a laptop screen doesn't mean shit, what you need is effective
pixel density and that's going to vary too much. I don't know of any browser
that communicates "hey I'm a 15" screen and my resolution is AAAAxBBBB and my
user sits CC" away from their screen on average" Of course with an iPhone/iPad
you can do a bit more generalization but not for laptop screens.
This is the premature optimization of design.
~~~
JoelSutherland
If you read the article you'll see that they're only sending their custom font
to OSX. PCs get Georgia. On OSX, resolution is enough to determine pixel
density at the granularity they need because of the limited number of products
Apple has [1].
So they have successfully achieved their goal.
It's not a goal I would set out to achieve, but to each their own.
1\. Notice that both Air models have high(ish) DPI resolutions that are unique
in the history of Apple products.
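For what it's worth, a rough sketch of the resolution half of that approach (the
font name and pixel-ratio cutoff are my own illustration, not iA's actual
stylesheet; the OS detection they describe would normally happen separately, e.g.
server-side):

    /* Serve the heavier grade only to high-density displays */
    @media (-webkit-min-device-pixel-ratio: 2), (min-resolution: 192dpi) {
      body {
        font-family: "Custom Grade Retina", Georgia, serif;
      }
    }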
~~~
juiceandjuice
Except my 15" macbook has the same resolutions as 17" macbook from 2008 and
the 20" Cinema display and 20" iMac.
The current 17" macbook has the same resolution as the 23" Apple Cinema HD
display and 24" iMac.
What happens to my mac mini which is running on my 23" 1920x1080 NEC
Multisync? What about when I hook it up to my TV? Will it think it's the 21.5"
iMac?
What about my dual monitor setup on my Macbook pro (15" 1680x1050 & 24"
1920x1080)?
What about everyone else using some variety of monitor setup with their mac?
Maybe they could try an "Optimized for viewing on Apple hardware (and only Apple
hardware) made after 20XX and at a distance of XX", except maybe for these
models:" disclaimer.
~~~
falling
Dude. Way too much pedantry. They are experimenting. If they get it right on
your screen you get a more precise rendering of the font, if they don’t you
get a slightly suboptimal rendering, but that’s the same result you'd get with
a generic font.
------
kickingvegas
An old saw on the topic of scan conversion in computer graphics, but I'll
reiterate. The computer industry needs to use conventions that render graphics
using physical dimensions. A 12pt typeface should be the same physical size
regardless of the size and dpi of the display.
I refuse to believe the horse is out of the barn on this one.
------
cabirum
That's just wrong. Pick a font more or less suitable for all the devices and
use it. New ipads and iphones will be out in a year, then what? Throw in
another couple of fonts? Any project larger than one-page personal blog is
destined to turn into nightmare.
Also, add some fonts for android devices, why not? Pentile displays should get
their optimized fonts as well.
------
kibwen
I realize how bizarre this must sound, but my first reaction upon visiting
that page was that the font was just _too large_ to read comfortably. It only
takes a quick ^- ^- to remedy, but it's an amusing counterpoint to all the
recent frustrations with sites that insist on using 10pt fonts.
~~~
voyou
Yeah, I've noticed this on a couple of sites too, particularly the default
Octopress theme which (like this site) sets the text size on the body to a bit
over 1em, and I don't understand why anyone would do this; what platforms are
they developing for, where the default font size is too small? Obviously,
better to have the fonts too large than too small, though.
~~~
ricardobeat
You are just used to very small type.
<http://informationarchitects.jp/blog/100e2r/>
~~~
voyou
No, I'm used to 1em being a reasonable size for body text - the text at that
link says as much, but for some reason sets its own body text to 1.3 times
browser default. I don't particularly object to it, I'm just curious as to why
they think the browser default is too small.
------
callmevlad
"... they look too bold on high resolution screens (MacBook Air) and they look
okay on Retina displays (next MacBook Pro)"
I wonder if this is assumption based on rumors and extrapolating iPad 3
display behavior, or these guys actually got their hands on the next-gen
MacBook Pro.
------
chj
Love the website design.
I really appreciate their attention to details. iA writer is a good example.
Simple and flawless, however, I cannot recommend it to hackers since the
functionalities are quite limited. Sometimes I use it for taking quick notes
and that is it.
~~~
slantyyz
>> Simple and flawless, however, I cannot recommend it to hackers since the
functionalities are quite limited. Sometimes I use it for taking quick notes
and that is it.
The "flawless" sounds a bit contradictory seeing how you follow up the
statement with some arguable flaws.
I've always thought iA writer's simplicity was a gimmick. The whole notion of
"distraction free writing" as a quick cure for the user's inability to focus
is a hard promise to deliver on.
~~~
chj
well, I do not count missing features as flaws. what I mean by flawless is
that iA writer does well on what it is supposed to do.
------
nfomon
That "N minutes left" div popup is so annoying that I stopped reading the
article.
------
tferris
Responsive typography is no solution—the client side has to improve and it is
improving already:
Just use -webkit-font-smoothing: antialiased (which iA totally missed in their
article) and fonts look right again in Chrome and Safari.
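For reference, a minimal sketch of that rule in a stylesheet (the selector is just
an illustration):

    /* Ask WebKit for plain grayscale antialiasing instead of subpixel rendering */
    body {
      -webkit-font-smoothing: antialiased;
    }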
------
generateui
A typographer, talking about screen density, and clarity on his blog using a
serif font. Oh, the irony.
~~~
epo
Do you have a point to make, or is this a display of faux-superiority from
someone who actually hasn't got anything to say or any way to back up their
insinuations? And you don't know what irony means either.
------
ricardobeat
Barely related: I loved the old website so much. RIP informationarchitects.jp
;(
When Your Customer is Your Competitor: The Return of Roll Your Own - andrewvc
http://redmonk.com/sogrady/2010/01/12/roll-your-own/
======
grumpycanuck
I know it seems obvious, but determining when to create your own tech instead
of using or extending existing solutions is tough. Even tougher given the
tendency of people to think their situations are unique instead of being "just
like situation X with Y added."
I mean, how many times (to beat a dead horse) have you seen someone say "I
wrote my own framework because <insert contrived scenario that other
frameworks have solved or are easily extended to solve>".
Yes, sometimes your situation is so unique you need to roll your own solution.
But chances are it's your ego talking instead of common sense.
~~~
andrewvc
True, but then there are situations like Rails, where 37 Signals could have
built their apps using existing tech, but said 'there should be a better way'.
I mean, if you told someone "I'm building this project management app, and I'm
going to take an esoteric language no one knows, and write an app framework
from scratch" they'd rightly tell you you're crazy.
Most endeavours that start like that end up as failures, however without these
attempts tech would stagnate.
~~~
fnid
If one _were_ to set out on such a path and accomplish the goal out of hobby
or desire to learn a new language, or for whatever reason, how long would
something like that take? How long did it take 37s to build ROR? How long did
PHP take?
I'm just curious...
Git Repo of Police Brutality During the 2020 George Floyd Protests - Glench
https://github.com/2020PB/police-brutality#hn
======
tedunangst
If you think there's missing context, submit a pull request with a link?
Here in philly, police insisted they tear gassed protestors because they were
in imminent danger. Two weeks later, they failed to find any evidence to
support that, despite many camera angles. Finally issued an apology. Turns out
the other side of the story was actually the same as the protestors' side;
they just didn't want to admit it.
~~~
downandout
_Missing_ context implies that their failure to provide it was an oversight.
There was not even an attempt at providing context here, because doing so
would be unpopular.
~~~
klyrs
With cops wearing body cams, that evidence should be abundant. Its absence
_is_ telling.
Collecting evidence is pretty much job #1 for the police. Without that, they
can't establish guilt of the accused.
When dozens of cops were involved in a beatdown and not a single one can
produce a shred of evidence, we assume that the protesters were innocent.
~~~
BLKNSLVR
> Collecting evidence is pretty much job #1 for the police.
Excellent point. If denial / lack of evidence is the SoP for any Police
Department, what are they even doing? Defense Lawyers would barely have to
work for their pay.
------
throwvid19
The number of comments in here that ask for additional/richer data and are
being flagged is supremely concerning.
I understand that this is a hot issue with very polarized sides, but in what
kinds of circumstances is having more data _bad_ , except to intentionally
support bias?
Please understand that my question is _only_ to understand why, in the context
of accumulating data, is trying to obtain more/better data NOT a good thing?
I do not wish to debate what is already being said in the many comments
already, so if it helps to change the context of the data in question in order
to discuss, that sounds like a good idea to me.
~~~
SauciestGNU
I think context is good and we should be wary of videos that have been edited
to remove critical information.
However, there is such a large corpus of evidence against the police in these
protests, I have to wonder if the people asking for more context are doing so
in good faith, or rather to argue for the innocence of the police. Having been
personally on the receiving end of police violence during these protests, it
bothers me that anyone could look at this wealth of videos and see anything
other than a clear pattern of institutional violence being wielded against
those who are in opposition to just such violence.
~~~
systemvoltage
One thing that comes to my mind (note: I do not support this view) as I try to
put myself into the opposite party's shoes:
\- Listing videos of police brutality during protests without also listing
videos of protestors brutally attacking the police perhaps creates a
dissonance to the counter party?
I think there have been a few incidents where cops were attacked, but they are
few and far between; still, listing those would help clear the accusation of
hypocrisy.
Furthermore, I personally think that we should separate police brutality
videos in normal civic life (before the protests began) to gather evidence of
systemic violence vs. the enraged/emotionally outraged protests that both
sides were not willing to concede. I categorize them as different.
~~~
SauciestGNU
I really doubt there's more than a handful of instances in which protesters
use violence against the police in a context that's not self-defense.
------
ilikehurdles
Well organized and useful resource.
Edit: Oh nice, there are backups of videos as well, and there's even an ipfs
source.
------
Glench
Resubmitted as per this comment from a moderator:
[https://news.ycombinator.com/item?id=23401418](https://news.ycombinator.com/item?id=23401418)
~~~
novia
thanks for getting this out there
------
enriquto
There is a similar site for the spanish violence against the catalan
referendum of 2017:
[https://spanishpolice.github.io/](https://spanishpolice.github.io/)
In that case, unfortunately, the videos are only stored on youtube and
twitter, so you cannot backup them easily just by cloning the repo.
~~~
LyalinDotCom
I really worry about videos being lost, almost feels like someone should do an
archive every so often and put it up as torrents that many people can help
backup. Just a thought
~~~
trickstra
[https://archivebox.io/](https://archivebox.io/)
------
thelock85
These contributors should partner with [https://raheem.ai](https://raheem.ai)
which is trying to make it easier and less intimidating to report instances of
police misconduct.
------
downandout
As a person that makes decisions based on data, rather than emotions, it would
be interesting to see a comparison of the volume of such incidents on both
sides. Is there a similar repo of the looting and violence committed by
protesters? I think that would give a more accurate picture of all of this. We
would then know if the police brutality incidents paled in comparison to
violence/looting by the public, or vice versa.
~~~
triceratops
> Is there a similar repo of the looting and violence committed by protesters?
Does it matter? They're being investigated and prosecuted regardless[1],
because they're criminals. We expect police to behave _better than criminals_.
Society is always going to have some crime, and we will continue to
investigate and prosecute it. But there should be zero tolerance for crimes by
cops. Otherwise we can't trust them, and everyone becomes less safe.
1\. [https://www.theverge.com/2020/6/18/21295301/philadelphia-
pro...](https://www.theverge.com/2020/6/18/21295301/philadelphia-protester-
arson-identified-social-media-etsy-instagram-linkedin)
~~~
downandout
_Does it matter?_
Where crimes, such as burglary and assault, are being committed, police will
generally be called upon to act. A subset of those responses will involve
police violence. One would hope that such police violence only occurs in self
defense, but we know that to not always be the case. So yes, the two things
are mathematically correlated. One drives the other. It would be interesting
to be able to tell what the incidence rate is, rather than just seeing a group
of cherry-picked incidents designed with the sole intent of satisfying the
public's appetite for a specific narrative.
~~~
Uehreka
It’s one thing to say “given there will be crime, and some criminals are
violent, cops may have to be violent sometimes to stop criminals”. That’s a
pretty bloodless analysis.
Let’s get to the fun part: “if cops perceive someone to be a criminal, how
violent will we allow them to be, relative to how violent the perceived
criminal is?”
In the case of the cops assaulting that vigil for Elijah McClain (it was a
peaceful gathering with people listening to a couple violin players) I’d argue
that the cops were perpetrating violence at a level about 100x relative to the
people they were trying to police (even if you consider that folks may have
been “yelling mean things” at them).
Personally, I would consider 1.1x acceptable, maybe 1.5x in some situations.
And so every time I see cops blowing through that threshold and being even 5x
or 10x more violent than the people they’re policing, I perceive that as a
massive injustice, I think those cops should get fired, and I get pretty
angry.
~~~
SpelingBeeChamp
Serious question: do you think that police officers should have to resign
themselves to getting hurt as part of their job? Meaning, that it's actually
an expectation the public has of police officers that they will get hurt if
someone is fighting with them. To me, that seems like the inevitable
consequence of not quickly taking control using decisive force. That's the
issue I think exists with a 1:1.1 response. It doesn't stop the unlawful
behavior. No?
~~~
Uehreka
We can talk about that when they get down to like, 3x to 5x force.
Right now we are seeing video after video of cops tear gassing civilians for
throwing empty water bottles at them, running over civilians using SUVs for
"refusing to disperse" and knocking out old folks who are going "Hey man, what
if this is a bad idea though?".
The ratio right now is so high, the kinds of conversations you may want to
have aren't even relevant.
------
forgotmypw17
Every link in that repo needs to also be archived as a video file into that
repo, to protect against link rot.
~~~
TouchyJoe
how can this amount of video footage be backed up?
~~~
forgotmypw17
Is it really that much?
------
heavyset_go
Is there a torrent going?
------
zaroth
I think at the very least the title should be Git Repo of _Alleged_ Police
Brutality During the 2020 George Floyd Protests.
Like another commenter downthread, I spent some time looking through incident
reports in my own State, and found some of them to either be simply
unsubstantiated claims, or videos of police reasonably appearing to be doing
the very dangerous job of managing a riot.
There are legitimate concerns about the validity and objectivity of many of these
alleged incident reports. By commingling the few truly reprehensible actions
with objectively necessary riot control, the site loses its objectivity and
fairness, and does harm to its intended goal. In Reddit parlance, it becomes a
circle jerk.
I think the fundamental problem with many of these incidents is that the
provocation of rioting has been used, not in all cases but certainly in many,
to incite a police response which is then filmed and condemned as if
unprovoked or unnecessary.
The vast majority of police are heroes. And every community depends on their
police to provide a mission critical service. The ability of police to provide
this service in cities like Minneapolis and NYC has been dramatically
curtailed by a violent political uprising which has damaged billions of
dollars of property and driven homicide rates up nearly 100% year over year.
IMO the fatal flaw in the response to protests against police is to send
police to stand in front of them and take the abuse. You don’t put protesters
and counter-protests next to each other to face off. It seems to me
particularly unfair and indignant to have black police officers on the line
being screamed at by white protesters about Black Lives Matter, to stand
silently while they are verbally denigrated and abused for doing their job,
which can lawfully include using force to disperse a riot. If the target of a
large protest is the police themselves, the National Guard should be called in
to provide crowd control and defend life and property if needed. This has the
double benefit of not providing a standing target for the protesters ire, and
not creating a self-fulfilling prophecy / feedback loop of aggression.
~~~
triceratops
> The vast majority of police are heroes.
Define "hero". As has been noted many, many times across HN these past few
weeks, "police officer" isn't even among the top 15 most dangerous professions
in America. And even then police aren't legally required to do their job[1].
Most police are never called upon to do anything heroic, so we can't say if
they are heroes or not.
This type of blind veneration for police is weird and unhealthy. If you tell a
child, or even an adult, they are great and perfect just for who they are,
they're going to turn out spoiled rotten. Why do you think this is any
different?
Police are regular people. Some good, some bad, but mostly mixed. The systems
they work in allow the bad police to get away with horrific crimes and punish
the good police who try to do anything about it.
1\. [https://www.nytimes.com/2005/06/28/politics/justices-rule-
po...](https://www.nytimes.com/2005/06/28/politics/justices-rule-police-do-
not-have-a-constitutional-duty-to-protect.html)
~~~
zaroth
USA Today says that policing is the 18th most dangerous job in the US.
What makes police injuries and deaths notable from other occupations is that
“The most common cause of workplace fatalities among police officers is direct
violence from other people”.
It is not just a risky job in general, it is a risky job due to people that
would intentionally do them harm, done in service of their community. Everyone
is a regular person, but that is why people generally regard police as heroic.
In fact, police are _expected_ to act heroically when the situation calls for
it. The Parkland officer who didn’t enter the school during the shooting,
caught on tape remaining outside while shots were fired, was rightly condemned
as not fit for duty.
[1] - [https://www.usatoday.com/story/money/2019/01/08/most-
dangero...](https://www.usatoday.com/story/money/2019/01/08/most-dangerous-
jobs-us-where-fatal-injuries-happen-most-often/38832907/)
~~~
triceratops
> The Parkland officer who didn’t enter the school during the shooting...was
> rightly condemned as not fit for duty.
But then got his job and pension back with the full backing of his
union[1][2]. This is what I mean by the system propping up bad cops. If the
"vast majority" of cops are heroes, why are they supporting someone who has
demonstrated a distinct lack of heroism? Isn't he making the rest of them look
bad? Don't they care?
1\.
[https://www.usatoday.com/story/news/factcheck/2020/05/15/fac...](https://www.usatoday.com/story/news/factcheck/2020/05/15/fact-
check-parkland-officer-who-failed-act-shooting-gets-job-back/5194831002/)
2\.
[https://www.miamiherald.com/news/local/community/broward/art...](https://www.miamiherald.com/news/local/community/broward/article242719216.html)
~~~
monocasa
He also got back pay, with back overtime.
~~~
zaroth
Yeah, I never said I was pro-union, but there was certainly widespread
condemnation of that officer’s response.
~~~
triceratops
So basically no real consequences.
I'm sympathetic to people losing their nerve. It's a human reaction and can
happen to anyone. But no way should he be allowed to keep his job after he's
shown how unfit for it he is.
And the fact that the other cops didn't raise a peep over it tells me that
most cops are in fact not heroes. Heroes do the right thing even when it's
hard, you see. That's not an insult. I would like a society where policing
doesn't have to be heroic. It's just a statement of fact.
------
TheGrim-888
Just a reminder that none of this is factual or evidence. It COULD be, but
just because somebody writes something doesn't make it true.
You are not given any of the context of what happened in these situations. If
there's video the video starts when the conflict is already full swing, you're
not seeing anything that led up to the situation.
You can read "Police showed up and fired tear gas and agitated the crowd and
caused violence", but you're not told WHY they fired tear gas. You're not
given the context as to what happened.
Not only are you not given the context, but you're only given one side of the
story. I'm sure if you asked the police what happened, their story would be
completely different. But you're only being allowed to get the story from one
side, and that side is insanely politically motivated to exaggerate, and do
everything they can to make the police look bad, because it strengthens their
political positions.
~~~
isbjorn16
> Not only are you not given the context, but you're only given one side of
> the story. I'm sure if you asked the police what happened, their story would
> be completely different. But you're only being allowed to get the story from
> one side, and that side is insanely politically motivated to exaggerate, and
> do everything they can to make the police look bad, because it strengths
> their political positions.
Is that... is that not exactly, precisely, what we should be expecting from
the police side as well? I think that's the entire point, isn't it? At best,
their credibility has been called into question. At worst, their credibility
has been dragged out into the street and kneeled on until it expired.
~~~
rootusrootus
I recall reading that some localities have already made changes to their body
camera recording rules which rule video footage inadmissible if it has been
edited to remove context.
~~~
SpelingBeeChamp
I don't buy it. It is unlikely that the rules of evidence are being changed. I
could always be wrong, though. Got a source?
~~~
rootusrootus
Sorry, I was a little unclear in that post and probably made it sound more
legal (as in, courtroom) than I intended. What I meant is that some police
departments have announced that they will not allow edited or out-of-context
footage from body cams by police officers facing administrative action. You're
right in that courtroom rules of evidence are not subject to such decrees.
------
mjcohen
Gonna need terabytes.
~~~
ImaCake
I'm not sure if you are serious but even millions of events recorded in text
would not take up too much space. Bioinformatics datasets that run into the
millions of lines still only require a few gigabytes, and they are often
compressed into tarballs which can halve their size.
Moore's law is on the side of the protesters here, this information can spread
_easily_ because it does not require much bandwidth (without video) for anyone
to `git pull remote` this whole repository.
I hope that helps explain why some people have downvoted you without
explanation (for the record, I did not).
------
schaefer
Las Vegas:
\- the police shot a man.
\- a man shot the police.
Neither incident appears in this data set.
------
overfl0w
Now all we need is a similar repo for the looting and memorial vandalism.
~~~
goto11
Wouldn't that be a job for the police? Their job is (lest we forget it) to
prevent crime and enforce the law.
~~~
overfl0w
Sure but they are not in a public git repo and I think the public has the
right to see the full story. The police are the easiest target right now but
those people who take advantage on the riots to loot are kind of left out of
sight.
------
Koshkin
[https://en.wikipedia.org/wiki/I'm_Gonna_Git_You_Sucka](https://en.wikipedia.org/wiki/I'm_Gonna_Git_You_Sucka)
------
mothsonasloth
The road to hell is paved with good intentions.
I will watch this repository, but I think it will be another showman piece
which will join the many other repositories that are forgotten about and left
to go stale on Github.
Why? Well if you look at some of the contributors, they all have new accounts
generated around June, and some have a history of creating fad'ish
repositories, probably to massage their ego online.
This repository has some good content but also a lot of "weak" to "non-
existent" examples.
~~~
cvlasdkv
Ultimately there is enough evidence to make a decision already--the case of
police brutality is nothing as invisible or insidious as something like the
sexual assault or harassment revealed by the #MeToo movement. I am not sure
this repository accomplishes anything, especially as reports do not seem to be
well vetted, but it would be nice if cleaned up to have a list of examples.
------
readme
The problem here is that police are not infallible but everyone wants them to
be. If there is violence in humanity you would be naive to think police would
be exempted from it. That is why I don't like coverage like this: why not also
catalog some of the good things police do? That wouldn't further your point,
so why would you do it?
~~~
seth_tr
The police don't acknowledge that these were mistakes[1][2][3]. They think this
is the system working. They act in bad faith and so please don't run around
grant them good faith in arguments.
We want the systems that enable this brutality changed.
This is like saying "Facebook does some good things so let's ignore the
systematic problems it causes" or "Facebook is staffed by people so we have to
accept an amount of them stalking their exes with internal tools"
[1] [https://www.nytimes.com/2020/06/05/us/buffalo-police-
shove-p...](https://www.nytimes.com/2020/06/05/us/buffalo-police-shove-
protester-unrest.html) "Fifty-seven officers resigned from the department’s
Emergency Response Team in solidarity with the two who were suspended." [2]
[https://www.dailymail.co.uk/news/article-8436929/Atlanta-
cop...](https://www.dailymail.co.uk/news/article-8436929/Atlanta-cops-no-work-
today-city-claims-handle-911-calls.html) [3]
[https://www.cnn.com/2020/06/13/us/why-police-rally-around-
ea...](https://www.cnn.com/2020/06/13/us/why-police-rally-around-each-other-
trnd/index.html)
~~~
readme
Why not go find me three links to instances where the police did acknowledge
their mistakes?
~~~
seth_tr
I searched google for "police acknowledgement mistake" and several variations.
The closest I could find was people calling for the police to acknowledge
their mistakes. I can't find the police doing it themselves.
~~~
readme
thanks for the due diligence
edit:
[https://www.google.com/search?q=%22police%20admit%20to%22](https://www.google.com/search?q=%22police%20admit%20to%22)
170,000 results
~~~
ZeikJT
I think a better link is probably:
[https://www.google.com/search?q=%22police+admit+to%22&tbs=qd...](https://www.google.com/search?q=%22police+admit+to%22&tbs=qdr:m)
Though it doesn't give a number of results explicitly there are still several
pages of results (though they'd need to be deduped)
------
throwawaysea
From what I saw on Twitter, most of the videos alleging police using force
without reason were misleading, and leaving out all the footage preceding
police action. In cases where others uploaded longer videos or different
angles, it was clear that protesters were acting illegally, or ignoring clear
verbal warnings, or refusing lawful orders to disperse, and so on. By leaving
those crucial additional bits out, activists were generating outrage online
where none was deserved. The same applies here, and I encourage people to take
resources like this with a grain of salt.
~~~
spaetzleesser
I don't like this US attitude where people think that once the police have given
an order to do something you have to comply immediately or you can be shot or run
over no matter the situation. I just saw a video where protesters were crowding a
police car and suddenly the car sped up and ran over people because the cops felt
"threatened". That's just not OK.
~~~
noahtallen
This is key. For the sake of argument, let’s assume that unarmed protestors
are being very aggressive towards police, even ignoring orders to disperse or
follow curfew. Perhaps they are even verbally threatening police!
Should this give police authority to use violent force to retaliate? (E.g.
pepper spray, rubber bullets, fists, etc)
In my opinion, no. I realize that legally, police are protected in these
scenarios. Hell, most folks understand that. That’s what people are protesting
for: they want the legal system to change.
As a society, I think we morally accept using force in self defense (for
example, using pepper spray on an assaulter). We accept it even if someone’s
life isn’t actually in danger. Should we accept the same from police? I doubt
it. We should hold the protectors of the law to higher standards than common
citizens, right? I myself might not fully understand the law and will be
acting on my emotional response if I use force in self defense. But police
officers should have the training to understand much better when lives are
actually in danger. They should have techniques to handle situations like this
without violence.
These videos show that whether or not the police were acting out of self
defense or within the boundaries of the law, they are still using serious
force on unarmed citizens. This is not ok whether or not the citizens are in
the right.
The root of the problem is the police force. Even if people are acting in an
aggressive, rebellious way against police, they are doing so because of
decades of improper use of police force. If we want to change the situation,
people must feel safe around police. It’s not just “were police legally
right,” it’s “do people feel safe in their own communities.” Clearly not, and
we must make changes to help people feel safe. To aid with that, police should
be much less powerful, since abuse stems from power.
We should all get behind these changes because police brutality is dangerous
to everyone, and especially dangerous to minority communities.
Part of Ham Radio AMPRnet IPv4 Space Sold to Amazon - benburwell
https://www.ampr.org/amprnet/
======
cereal_console
A response I saw on the 44net mailing list:
On 7/18/19 10:57 PM, Majdi S. Abbas wrote:
What's interesting about this is it was not an ARIN allocation,
and the ARDC folks are not the original registrant. This IANA /8 was
initially delegated to a community, not an organization.
So, to the individuals listed in the blog, that I've excerpted
below, what do you have to say about this?
Brian Kantor
kc claffy
Phil Karn
Paul Vixie
I find it interesting how AMPR can sell these IPs vs leasing them.
The community (including myself) only learned of the sale when 44/8 reverse
DNS lookups returned NXDOMAIN today. There was no consultation with the
community. The decision was made in private by the four listed above.
~~~
zifnab06
It actually predates that even - rfc790 defines it as AMPRNET as part of
ARPANET in 1981.
I spent a bit of time tonight trying to find the old pre-ARIN iana/isif Usenet
group and failed. I'd imagine a list that predates ARIN would have some
interesting history around this. ARIN's history has the net block assigned by
IANA in 1992-07, but their records only show it in existence from 1996.
Additionally, ARDC (the nonprofit that currently owns 44/8) was only
registered as such in 2011.
I'll keep looking when I have time.
[1] [https://tools.ietf.org/html/rfc790](https://tools.ietf.org/html/rfc790)
~~~
cereal_console
Great work on this! NANOG has a good thread going (as one can imagine...).
Good post here:
[https://mailman.nanog.org/pipermail/nanog/2019-July/102131.h...](https://mailman.nanog.org/pipermail/nanog/2019-July/102131.html)
------
kawfey
Amazon, or "A very big company with a significant internet presence" like it
states on the page?
Nothing of value was lost to ham radio, as no ham or ham organization indeed
ever used or would ever use that much address space. I guarantee as this makes
the rounds on QRZ a dozen OM's will speak up in fiery, yet misplaced passion,
but really, this is a much better use of the addresses and gives the amateur
community in general an absolutely massive boost.
~~~
tlrobinson
I’m more concerned about the lack of transparency and where the proceeds from
selling off a public resource are going.
Next.js 7 - lxe
https://nextjs.org/blog/next-7
======
ldthorne
Time and time again, the Next.js (and Zeit) team blow me away. The rate at
which they release new, high-quality versions of their products is astounding,
especially given how few people are working on this full time -- it's
basically just Tim Neutkens, who's insane: he's constantly pushing updates,
reviewing PRs, and finds time to be very active on the support channel.
We use a bunch of Zeit's tools (particularly Next.js and its static export
feature) at Common ([https://www.common.com](https://www.common.com) \-- we're
hiring!), and they're consistently such good and reliable products (doesn't
hurt that they have a sharp focus on developer experience).
Congrats to Tim and to the entire Zeit team!
~~~
subpixel
> it's basically just Tim Neutkens, who's insane
when looking at tools to build a business on, that can be a liability no?
~~~
ldthorne
Fair, but my confidence in it stems from the giant community around Next.js.
Zeit, a company with a good reputation among developers, works on it full
time. It has contributions from other big-name OSS contributors and big
companies (Netflix, Hulu, Nike, Ticketmaster, Jet, Auth0, Marvel, etc.). It's
definitely a liability if it were truly just one individual developing it, but
I suppose it's more that there's a small core of facilitators, with a giant
OSS community around it.
~~~
timneutkens
As said elsewhere, I mostly act as a curator of a very large effort of
hundreds of individuals. The project was actually co-created by several other
people who work on infrastructure at ZEIT (e.g.: Naoyuki Kanezawa led all the
initial releases, designed important parts of the framework, Arunoda
Susiripala contributed the test harness and key build pipeline infrastructure,
and so on).
------
wiradikusuma
Could anyone explain what Next.js is? I use React (using create-react-app),
but I'm under impression that React is a JavaScript framework (like Spring is
Java framework, or RoR is Ruby framework).
So Next.js is a framework for a framework?
~~~
tvalentius
React is more of a library instead of a framework and it's just the view
layer. Next.js is a framework(minimalist) for React.
[https://github.com/zeit/next.js/](https://github.com/zeit/next.js/)
~~~
zerr
And what would be the "maximalist"/rich framework for React? (with lots of
pre-made RAD components I guess).
------
asselinpaul
What's the closest framework for Vue.js and how close is it in terms of
functionality?
Asking because Next.js has been one of the reasons I've become more interested
in the React ecosystem.
~~~
cutety
As others have mentioned, Nuxt.js is what you’re looking for. And, it’s pretty
great to work with. Prior to using it, I just got “done” building out my
significant project with vue, just plain vue. When I started this next one, I
decided to give Nuxt a shot, and it has been like the missing piece I was
looking for in the previous app.
It provides structure to your app and gives you everything you need to build out
a front end out of the box (router, vuex, build system, etc). Deploying it is a
breeze, giving you the option of static HTML builds, a SPA, or full-blown SSR,
with the best part being that you can switch between them seamlessly whenever
your needs change. It provides enough opinion that your project has a consistent
structure, but doesn't get in the way if you need to do something outside of the
norm; extending and configuring it is a breeze. And to top it off, the docs are
great.
Definitely would recommend it to anyone looking to build any sort of non
trivial front end with Vue. I know my next few projects will certainly be
using it.
~~~
thsowers
Thanks for sharing your experiences, could you recommend any resources for
plugging data into Nuxt? This is what I've always been curious about coming
from Meteor
~~~
cutety
I can’t think of any particularly good articles, or things of that nature.
But, you’ll mostly be utilizing Vuex to store, mutate, & fetch data. If you
aren’t already familiar with it, the docs are great [1]. It’s fairly simple to
use, and as your app grows you’ll likely find yourself writing more and more
boilerplate store code, but fortunately, there are a ton of really great
plugins/utilities the community has built for Vuex that help eliminate a lot
of that boilerplate and make it easier to work with an ever-growing store.
The Awesome Vue repo [2] has a pretty comprehensive list, and you can probably
get some ideas of how you’d like to work with it from seeing all the options,
for example Vuex-persistedstate & vuejs-storage allow you to interface with
localstorage, while Vuex-pathify provides a nice, less verbose interface for
actually working with the data in the store, and then there are things like
vuex-api and vuex-rest-api that eliminate a lot of boilerplate code for
interacting with APIs to fetch/update data.
[1] [https://vuex.vuejs.org/](https://vuex.vuejs.org/)
[2] [https://github.com/vuejs/awesome-
vue/blob/master/README.md#v...](https://github.com/vuejs/awesome-
vue/blob/master/README.md#vuex-utilities)
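To make that concrete, here is a minimal sketch of a plain Vuex store with one
piece of state, a mutation, and an action that fetches data (the endpoint and
field names are made up for illustration; Nuxt wires its store up slightly
differently, but the shape is the same):

    // store/index.js -- minimal Vuex store sketch
    import Vue from 'vue'
    import Vuex from 'vuex'

    Vue.use(Vuex)

    export default new Vuex.Store({
      state: {
        posts: []                     // data components will read
      },
      mutations: {
        setPosts (state, posts) {
          state.posts = posts         // the only place state is written
        }
      },
      actions: {
        async fetchPosts ({ commit }) {
          const res = await fetch('/api/posts')   // hypothetical endpoint
          commit('setPosts', await res.json())
        }
      }
    })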
------
erokar
Not clear to me from the release notes if there now is an easy way to proxy to
a server in development (if you for instance have a Django driven API on port
8000). This is easily done with create-react-app by just adding a line in
package.json.
In Next you've had to either set up and configure nginx locally or proxy via an
Express server, both options that are a bit of a hassle to set up for such a
common case.
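For reference, the create-react-app line being referred to is (as far as I
remember) just a top-level field in package.json, e.g. for a Django API on port
8000:

    {
      "proxy": "http://localhost:8000"
    }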
~~~
timneutkens
create-react-app is actually removing this feature in an upcoming release. We
don't have plans to include it as a feature to Next.js as you can already
implement it using a custom server. We do have an example for it:
[https://github.com/zeit/next.js/tree/canary/examples/with-
cu...](https://github.com/zeit/next.js/tree/canary/examples/with-custom-
reverse-proxy)
~~~
thrower123
Why is it being removed?
~~~
pests
To let an external express server handle it. Which they think will be better
than the current in house solution.
------
deltron3030
Which backend solution that isn't a service/CMS, but is more streamlined than
Express or Koa would you recommend for solo devs going with Next.js? Is there
something like a preferred full stack solution involving next.js on the
frontend?
~~~
timneutkens
Next.js is not prescriptive about your backend stack, it gives you a way to
fetch data using `getInitialProps`. At ZEIT most of our APIs are built using
Micro: github.com/zeit/micro, which is a very lightweight way to build APIs.
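For anyone unfamiliar, a micro service is roughly just an exported async
function; a minimal sketch (the response body is a placeholder), typically run
with the `micro` CLI:

    // index.js -- a tiny zeit/micro endpoint
    const { json, send } = require('micro')

    module.exports = async (req, res) => {
      const body = req.method === 'POST' ? await json(req) : null
      send(res, 200, { ok: true, received: body })
    }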
------
sakarisson
I really like the styled-jsx changes. One major issue I had was the fact that
it's been difficult to pass styles onto child components without leaking them
through the entire application. I'm really glad to see that they fixed this
issue. Great job, Next.js team! I'm really blown away by Zeit in general.
------
machiaweliczny
The speed improvement calculations seem weird to me. Shouldn't one build done
in 178s instead of 304s be 70% faster instead of 42% faster? (you can do 1.7
of build using the old amount of time)
~~~
dcherman
What you're saying is that comparing the existing build time to the new one,
the existing one is about 70% slower. You can also phrase it as comparing the
new build time to the existing one, it's about 42% faster.
Both are true statements, however people generally like to discuss how much
improved the new releases are versus how much worse the old ones are.
Fun fact, it's important to understand that relationship in other areas as
well like the stock market; if your holdings lose 50% of their value, they
then need to increase by 100% for you to be back neutral :)
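Spelled out with the numbers from the post (178s new vs. 304s old):

    time saved:        1 - 178/304 ≈ 0.41  -> "about 42% faster" (41% less time per build)
    throughput gained: 304/178 - 1 ≈ 0.71  -> "about 70% more builds in the same time"

Both describe the same improvement, just measured against different baselines.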
------
criveros
so with Next for SSR you use React as a templating engine and just send that
back?
~~~
sakarisson
Sort of. Next expects to find a file structure of scripts in the /pages
directory. On the initial load of say www.nextsite.com/about/location, Next
will by default try to render the file /pages/about/location.js or
/location/index.js.
You can add an asynchronous function called .getInitialProps, which should
return some props that are required for the page. On the initial load, Next
will first run through getInitialProps, then render the actual component with
those props. From then on, assuming you're using 'next/link' for routing,
everything should work like a single page app. All of the pages except for the
initial one are rendered on the client.
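A minimal sketch of such a page (the file path, endpoint, and field names are
hypothetical, and a universal fetch polyfill such as isomorphic-unfetch is
assumed so the same code runs on server and client):

    // pages/about/location.js -- rendered for /about/location
    import React from 'react'
    import fetch from 'isomorphic-unfetch'

    export default class Location extends React.Component {
      // Runs on the server for the initial request, on the client for later navigations
      static async getInitialProps () {
        const res = await fetch('https://example.com/api/location')
        return { location: await res.json() }
      }

      render () {
        return <h1>{this.props.location.name}</h1>
      }
    }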
TLS 1.2 Support Added to Chromium - QUFB
https://chromiumcodereview.appspot.com/14772023
======
ctz
Anyone one know if GCM is coming soon?
~~~
wtbob
It's a little odd to me that so few folks support GCM. It's not that difficult
to implement (I've done so, in my free time), it's patent unencumbered and
it's extremely secure. So why not use it?
Is Android Leaving the Door Open for Microsoft Windows Phone - ariels
http://aseidman.com/2010/07/android-leaving-the-door-open-for-microsoft-windows-phone-7/
======
sabj
I have to disagree with part of what seems to be the thesis here, which is
that the smartphone market is a zero-sum game of some kind, and that Android
is leaving open a door vs. smothering Windows Phone - I think that's a little
bit too binary. I think there is all along a particular niche for MSFT to leap
into, if it can leverage its worldview successfully.
Microsoft operates through a synergistic platform lock in that makes its world
go round. Office feeds Windows and Windows feeds office; throw in developer
tools as well and it all works together. The more dependencies you create, the
stronger the system becomes -- or at least, that's how it used to be,
especially when market power could be brought to bear. Billions to lose is
valuable (Bing!) but unless you start paying people to take your phones and
create a market, it's hard to start from nowhere, no matter what you're
selling.
This said! I think that if Windows Phone offered super ultra mega strong
support for Office products and the like, there would be a compelling business
/ corporate use-case, above and beyond the competent Exchange support you see
competitors offering. Some differentiation and good execution makes the
difference here, because the problems that face Android are 1) not necessarily
going to persist and 2) not necessarily going to stop its steamroller march to
market penetration dominance.
I think we will see a lot of changed expectations going forward with Android
3.0 gingerbread and beyond. The Android team's stated goal is to do away with
the demand for 3rd party UIs and, while there will doubtless still be those
who want to cover it with ugly and add bloatware, I think there will be
compelling reasons to stick with stock, or at least, not to ruin it.
Besides, what's the prize here? Consider Windows. Setting aside differences
between Mac v. PC and Android v. anyone, the fact that Windows can be skinned
and filled with bloatware hasn't stopped the platform's dominance, or the
Microsoft gravy train. Apple has great margins on their hardware, but they're
not on the fast track to displace Windows, are they? Likewise, if everyone is
selling Android phones filled with crap, well... they'll still be selling
Android phones : ) Though I'd be sad to see that state of affairs, and it
definitely would open up more room for competition.
Finally: "After killing Windows Mobile 6 and the KIN Microsoft finally has
their shit together." Really? Let's hope so :) I am rooting for the folks in
Redmond, because I think more innovation is always best here! I am excited to
see what comes out, but seeing what has happened before, I am not going to be
waiting around for a Windows phone -- for starters, because I just got a
lovely new Incredible a few months ago! :D
Ask HN: No Link to Hacker News on Reply/Submit - orky56
This is a question to the YC/HN team as much to the HN community. When submitting a new post or replying to an existing thread, the navigation at the top completely disappears outside of a link to YC. I don't have the option to go to Hacker News directly or any of my profile options. This seems to be a disorienting experience and I am forced to use the Back button.<p>Just curious what the reasoning might be for this design decision.
======
sarcasmatwork
Click on the [Y] at the top next to submit.
After 37 years, Voyager has fired up its trajectory thrusters - lisper
https://arstechnica.com/science/2017/12/after-37-years-voyager-has-fired-up-its-trajectory-thrusters/
======
botskonet
Voyager is probably the _most_ badass awesome thing ever.
Every single aspect of space travel is an engineering/mathematic/scientific
marvel. Not only did we plan, build, and launch these (before I was born), but
we're still communicating with them (until we can't).
I'm reading links people have posted here, trying to understand _how_ we
communicate with these probes. It's fascinating.
~~~
userbinator
What I find more amazing is the fact that a lot of what could be referred to
as "space-age technology" is actually many decades old, and thus was
accomplished with a fraction of the processing power and knowledge we have
today. Voyager was launched in the late 70s, but based on technology of the
50s and 60s. We visited the Moon almost 50 years ago, using that technology.
If you look at old spacecraft hardware, one thing that stands out is its
apparent simplicity and down-to-earth (no pun intended) design --- and I'd
argue that this is at least partially responsible for its extreme reliability.
From that perspective, I feel as though developments in modern technology just
can't compete for impact; we constantly search for new ways of designing
things, wrapping ourselves in endless layers of abstraction and high-level
thought, yet aren't really "getting off the ground" and _accomplishing
something concrete_ , so to speak.
~~~
matthewmacleod
_From that perspective, I feel as though developments in modern technology
just can 't compete for impact; we constantly search for new ways of designing
things, wrapping ourselves in endless layers of abstraction and high-level
thought, yet aren't really "getting off the ground" and accomplishing
something concrete, so to speak._
I’m not really sure this is true, though I understand why it might feel that
way sometimes.
I’m currently travelling at about 180mph on board a high-speed train in Japan.
I flew here on a jet which is something like 20% more efficient than the
equivalent from a few years ago. Using the ubiquitous LTE network, I can make
a real-time HD video call to my family back in the UK, using my palm-sized,
battery-powered computer. I used the same device earlier to do some research
about cities as we passed through them, and also to check the CCTV system at
home. Over the past couple of weeks I’ve used a similar technology stack to
locate my position to meter-level accuracy, to read and translate foreign
language text from images in real time, and to record hours of 4K video.
Modern technology is _astonishingly_ powerful - and in some ways, the examples
I described above are even more impactful to me on a day-to-day basis than
space exploration is. Don’t get me wrong - the latter is still important and
exciting! But it’s sometimes too easy to forget the impact of the somewhat
more mundane technology that’s all around us.
~~~
Teichopsia
You may have missed the last word of his previous sentence. _reliability_.
Which is at least to me, the way I understood it.
Something built back then _and_ still works.
Edit: fix markdown syntax.
~~~
djsumdog
But they're different systems, designed very differently. A high speed train
and a space craft are both designed carefully to maximize their possible life.
A cellphone is, unfortunately, been phased into the economy of consumption and
planned obsolescence.
Consumer electronics from a few decades are not quite cellphones, but not
quite high speed trains or nuclear reactors or space rockets. Many old C64
systems still work or can be restored, and I bet most of our current high end
laptops will continue to work a decade from now (you might need to replace the
battery).
The OP might have been talking about efficiency, and we have gotten a bit
sloppy with that in the consumer world (why does Slack/Atom/Discord need to be
a 100MB+ app bundled with its entire web browser and framework? It's like
we're in the 2000s with 15 copies of the JDK on your system again!), but once
again .. different uses.
A modern SpaceX craft is going to have custom real time operating systems
designed specifically to perform much more complex calculations than we've
done in previous space missions, hopefully increasing reliability and the
amount of sensors we can read, record and transmit data for. The software
engineers might be less space efficient in their code than the previous
generation, but if the hardware is cheaper and we can increase readability at
the expense of memory, why not do it?
In Kim Stanley Robinson's Mars trilogy (highly recommend; best Sci-Fi I've
ever read), humans eventually create AI so complex it can manage space
factories designed to build from asteroids. The most advanced AI ever created
is used to maneuver an asteroid into orbit of Mars while also mining the
interior and constructing the cable that would eventually turn into the space
elevator over the course of a decade.
~~~
nojvek
Will deffo check the book out.
Electronics aside, voyagers nuclear energy supply fascinates me.
I'm still very optimistic that someday we'll figure out safe micro nuclear
reactors.
The energy density of nuclear fuel is just amazing.
~~~
branko_d
Marry such a nuclear reactor to something like VASIMR engine, and suddenly the
entire Solar System opens-up to us.
------
Reason077
For anyone with an interest in Voyager, I highly recommend Emer Reynolds film,
"The Farthest". This documentary is brilliant, beautiful, and funny - and I
guarantee that you will learn something that you didn't know about Voyager!
For those in the UK, it's currently streaming on iPlayer - though it should
ideally be seen in cinemas to be fully appreciated.
[http://www.imdb.com/title/tt6223974/](http://www.imdb.com/title/tt6223974/)
~~~
sbmthakur
Torrent link for where it's not available:
[https://1337x.to/torrent/2379692/The-Farthest-2017-WEB-
DL-x2...](https://1337x.to/torrent/2379692/The-Farthest-2017-WEB-
DL-x264-RARBG/)
~~~
teddyh
.onion link to torrent:
[http://uj3wazyk5u4hnvtk.onion/torrent/18374376/The.Farthest....](http://uj3wazyk5u4hnvtk.onion/torrent/18374376/The.Farthest.2017.DOCU.1080p.WEB-
DL.AAC2.0.H264-PreBS)
Magnet link:
magnet:?xt=urn:btih:cb7b9a2f7f318c9f84da467e3963c0fd8a31eb3d&dn=The.Farthest.2017.DOCU.1080p.WEB-DL.AAC2.0.H264-PreBS&tr=udp%3A%2F%2Ftracker.leechers-paradise.org%3A6969&tr=udp%3A%2F%2Fzer0day.ch%3A1337&tr=udp%3A%2F%2Fopen.demonii.com%3A1337&tr=udp%3A%2F%2Ftracker.coppersurfer.tk%3A6969&tr=udp%3A%2F%2Fexodus.desync.com%3A6969
------
ramshanker
So, they have all the code and manuals needed to control the probe after
37 years. More than that, there are still people able to work from those manuals.
DOCUMENTATION FTW.
~~~
DerfNet
Meanwhile, after two years my Android phone will brick itself.
~~~
Florin_Andrei
> _Meanwhile, after two years my Android phone will brick itself._
I understand the sentiment, but your phone was not made to exit the solar
system.
~~~
djsumdog
But they should be .. or at least last as long as an old C64. You can put
newer Windows on pretty old hardware, and for really old hardware you can
always slap Linux on and it will still be useful. There are still Kernel forks
to support 386 processors!
Cellphones are a mess because we can't even have a nice base hardware
platform. ARM isn't a platform. It's a SoC spec with random shit soldered to
random pins by different vendors with completely non-upstreamable kernels.
Google could just mandate UEFI on OHA phones like Microsoft did with theirs,
but instead we're just getting this /vendor partition in the next release.
I don't think it's unintentional either. It's an aspect of planned
obsolescence. The cellphone industry wants you to upgrade every two years,
when we should not be destroying the planet and creating gear that lasts 10
years. Fewer factories, less pollution, longer life .. but we're in a
consumerist economy hardwired the opposite direction, where any type of profit
shortfall or lack of growth is seen as a problem, not the result of a good
product.
~~~
pbhjpbhj
Agreed, but if a public company makes stuff to last 10 (why not 20) years then
they'll go bust because capitalism requires profit and sustainable ideals are
contrary to profit.
What we need is a privately held cellphone company that will forgo profit in
favour of creating long lasting, repairable, maintainable devices. [I've been
working on this thesis for the transition from capitalism to communism]
Meanwhile I'm wearing a 25-year-old t-shirt, whilst t-shirts bought much more
recently wear out and get holes in them.
------
nebulus
In the Epilogue of "Murmurs of Earth" (1978), Sagan writes:
"It is a difficult computer task to calculate what stars might by chance be
along the Voyager spacecraft trajectories 50,000 or 100,000 years from now.
Mike Helton of the Jet Propulsion Laboratory has attempted to make such a
calculation. He calls attention in particular to an obscure star called AC+79
3888, which is now in the constellation of Ursa Minor -- the Little Bear, or
Little Dipper. It is now seventeen light-years from the Sun. But in 40,000
years it will by chance be within three light-years of the Sun, closer than
Alpha Centauri is to us now. Within that period, Voyager 1 will come within
1.7 light-years of AC+79 3888, and Voyager 2 within 1.1 light-years. Two other
candidate stars are DM+21 652 in the constellation Taurus and AC-24 2833 183
in the constellation Sagittarius. However, neither Voyager 1 nor Voyager 2
will come as close to these stars as to AC+79 3888.
"Our ability to detect planetary systems around other stars is at present
extremely limited, although it is rapidly improving. Some preliminary evidence
suggest that there are one or more planets of about the mass of Jupiter and
Saturn orbiting Barnard's star, and general theoretical considerations suggest
that planets ought to be a frequent component of most such stars.
"If future studies of AC+79 3888 demonstrate that it indeed has a planetary
system, then we might wish to do something to beat the odds set by the
haunting and dreadful emptiness of space -- the near certainty that, left to
themselves, neither Voyager spacecraft would ever plummet into the planet-rich
interior of another solar system. For it might be possible -- after the
Voyager scientific missions are completed -- to make one final firing of the
onboard rocket propulsion system and redirect the spacecraft as closely as
we possibly can so that they will make a true encounter with AC+79 3888. If
such a maneuver can be effected, then some 60,000 years from now one or two
tiny hurtling messengers from the strange and distant planet Earth may
penetrate into the planetary system of AC+79 3888."
We know so much more about exoplanets today than we did in Sagan's time, and
have so much more computing power to bring to bear. Knowing the trajectory
thrusters still work, it would be a fitting tribute to try one last
interstellar bank shot into the corner pocket, and see if we couldn't honor
Sagan's last wishes, and give the Voyagers a destination worthy of their
journey and their cargo.
~~~
kgilpin
60,000 years ago, humans were just leaving Africa. So I would be impressed if
60,000 years from now, anyone remembers that Voyager is still out there!
~~~
MarkMc
I really wish I could see the progress humans make over the next 60,000 years.
Shout out to all my ancestors in 62017!
~~~
Narishma
I think you mean descendants.
~~~
rezwrrd
There was an accident involving a contraceptive and a time machine.
------
ridgeguy
Kudos to Aerojet Rocketdyne on their thrusters lighting up after 37 years.
That's genuinely impressive engineering.
Certain Volkswagen models apparently have an even more amazing MTBF. It's a
real sleeper. [1]
[1]
[https://www.youtube.com/watch?v=Xo2kSu6O8cU](https://www.youtube.com/watch?v=Xo2kSu6O8cU)
But Voyager's got it beat on mileage...
~~~
oh_sigh
Not just 37 years, but 37 years at temperatures near absolute zero.
------
monster_group
Can somebody with knowledge of radio communication explain how we are able to
send a radio signal to a destination that is 21 billion km away? How powerful
does the signal need to be? What kind of technology is used to generate such a
powerful signal?
~~~
InclinedPlane
There are high gain antennas on both ends, and use of error correcting codes,
as well as _extremely_ high grade low noise amplifiers on the Earth side.
The Voyagers have a 3.7m diameter parabolic radio dish, larger than the Hubble
space telescope's mirror even. That alone provides a huge amount of gain on
communications. Additionally, the spacecraft have 10s of watts of power
available for transmitting signals, which is a fair bit considering (while on
the other end the ground stations have up to hundreds of thousands of watts to
transmit). The ground stations in the Deep Space Network (DSN) are tens of
meters across: a small antenna is 34 m, and the biggest ones are 70 m across. That
also provides a _huge_ amount of gain alone. It means that there is more area
to collect signals from the spacecraft and it means that the beam from the
ground station to the spacecraft is much tighter, concentrating the total
transmission power into a smaller cross-sectional area at the distance of the
spacecraft.
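To put rough numbers on those gains and that distance, here is a minimal
back-of-the-envelope sketch in Python. The 3.7 m and 70 m dish sizes and the
~21 billion km distance come from this thread; the ~8.4 GHz X-band downlink
frequency and the 60% dish efficiency are assumed typical values, not mission
specifications.
    import math

    C = 299_792_458.0  # speed of light, m/s

    def dish_gain_dbi(diameter_m, freq_hz, efficiency=0.6):
        # Parabolic dish gain: G = efficiency * (pi * D / wavelength)^2
        wavelength = C / freq_hz
        return 10 * math.log10(efficiency * (math.pi * diameter_m / wavelength) ** 2)

    def path_loss_db(distance_m, freq_hz):
        # Free-space path loss: (4 * pi * d / wavelength)^2, expressed in dB
        wavelength = C / freq_hz
        return 20 * math.log10(4 * math.pi * distance_m / wavelength)

    f = 8.4e9                             # assumed X-band downlink frequency
    print(dish_gain_dbi(3.7, f))          # Voyager's 3.7 m dish: ~48 dBi
    print(dish_gain_dbi(70.0, f))         # DSN 70 m dish: ~74 dBi
    print(path_loss_db(21e9 * 1000, f))   # ~21 billion km: ~317 dB of path loss
With a transmitter of a few tens of watts, those numbers leave a received power
somewhere around 10^-18 W under these assumptions, which is why the cryogenic
amplifiers mentioned below matter so much.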
The spacecraft also uses error correcting codes, which involve transmitting
many more bits than the underlying data, but in such a way that errors due to
noise are not only detectable but correctable.
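As I understand it, the Voyagers' real scheme is a convolutional code (later
concatenated with Reed-Solomon), but a toy repetition code is enough to show the
basic trade: transmit more bits than the data contains so that some can be
flipped by noise and still be recovered. This sketch is purely illustrative, not
the mission's actual code.
    def encode(bits, r=3):
        # Repeat each bit r times (a toy error-correcting code)
        return [b for b in bits for _ in range(r)]

    def decode(received, r=3):
        # Majority-vote each group of r bits; tolerates (r - 1) // 2 flips per group
        return [int(sum(received[i:i + r]) > r // 2)
                for i in range(0, len(received), r)]

    message = [1, 0, 1, 1]
    sent = encode(message)
    sent[1] ^= 1            # simulate noise flipping a bit
    sent[7] ^= 1            # ...and another one, in a different group
    assert decode(sent) == message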
On top of all of that you have the state of the art low noise amplifiers in
the DSN antennae. A typical low noise amplifier is a carefully built
electronics assembly made by experts. The DSN amplifiers? They use 99.95%
purity ruby rods chilled to 4 degrees above absolute zero to form microwave
MASER based amplifiers.
There's a neat little video (series) here on the DSN and contacting the
Voyagers:
[https://www.youtube.com/watch?v=FzRP1qdwPKw](https://www.youtube.com/watch?v=FzRP1qdwPKw)
~~~
femto
There's also a neat book for those interested in the engineering details of
their receivers:
"Low-Noise Systems in the Deep Space Network" Edited by Macgregor S. Reid
[https://descanso.jpl.nasa.gov/monograph/series10/Reid_DESCAN...](https://descanso.jpl.nasa.gov/monograph/series10/Reid_DESCANSO_sml-110804.pdf)
It's published by the JPL as part of the "Deep Space Communications and
Navigation Series". The rest of the books in the series, listed in the book's
front matter, have some fascinating titles.
------
sohkamyung
Remarkable. I'm vividly reminded of that scene in the movie, Apollo 13, where
this is said: _[Gene Kranz:] I want you guys to find every engineer who
designed every switch, every circuit, every transistor and every light bulb
that's up there. Then I want you to talk to the guy in the assembly line who
actually built the thing. Find out how to squeeze every amp out of both of
these goddamn machines._
NASA has a remarkable group of engineers who know how to get every last erg of
energy out of that machine.
~~~
botskonet
I think Apollo 13 did a lot to show what talented engineering teams can do.
They also didn't shy away from real terminology and rarely stopped to explain
it. No other movie has done that in my experience.
------
jacquesm
Very impressive. Every day the envelope of what mankind has touched is
still expanding because of this craft. It's a tiny mark on a vast universe, but
for some unspecified reason it makes me very happy to know that it is out
there and still ticking. I am not looking forward to the day that it
eventually shuts down, but even as an inert man-made mass that far out it
will be an amazing accomplishment.
~~~
lvspiff
I am not looking forward to the day when it somehow returns, starts blowing up
star ships, ultimately taking over one of them, and tries to seduce the crew
~~~
hota_mazi
That's actually the plot of the first Star Trek movie (without the seduction
part).
[Spoiler below, hopefully not a huge deal for an almost forty-year-old movie]
Voyager returns under the name V'Ger.
~~~
jcims
Back in the early 90's I wondered why so many Unix boxes had 'vger' in their
name. Not until this year, when I watched the original Star Trek with my
youngest, did I realize what it was all about.
Sad thing is I saw the movie in the theater when it was first released...just
didn't put the two together.
~~~
noir_lord
The first two Linux machines I ever had to admin back in the 90's were called
Picard and Kirk (Picard was the primary, Kirk the secondary (no guessing which
series I preferred)).
The NT machine was called Locutus.
~~~
ajmurmann
The "Locutus" is so perfect!
------
emptybits
The round-trip request/response time for a command is 39 hours (!) ... I guess
the team has normalized to that latency, but when asking a system to fire, using
code paths, hardware, and propellant that haven't been used
for 37 years ... I can only imagine the tension and celebration.
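For anyone wanting to sanity-check the 39 hours: it's just the two-way light
time over the thread's ~21 billion km, which a quick calculation confirms (the
distance here is the rounded figure from the article, nothing more precise).
    distance_km = 21e9               # ~21 billion km, per the article
    c_km_s = 299_792.458             # speed of light in km/s
    one_way_h = distance_km / c_km_s / 3600
    print(one_way_h, 2 * one_way_h)  # ~19.5 h one way, ~38.9 h round trip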
Related question ... can an amateur easily listen to the response transmissions
that come back from probes like this?
~~~
ohazi
I had a chance to visit the Deep Space Network facility at Goldstone several
years ago (definitely worth a visit if you can arrange one). The antennas they
use vary in size from 26 to 70 meters in diameter, and even then they need to
use lots of process gain because the signal is still several dB below the
noise floor. Amateur tracking would probably be incredibly difficult.
~~~
PoachedSausage
You will definitely want to be doing cryogenic cooling of your receiver front-
end to reduce the noise figure[0].
[0]
[https://en.wikipedia.org/wiki/Noise_figure](https://en.wikipedia.org/wiki/Noise_figure)
------
soheil
In the spirit of what this community's name suggests it's about, how hard
would it be for someone to hack into the spacecraft by sending it a signal to
turn on its thruster too soon given the technology is 40 years old?
Has anything like this been tried before with other spacecraft?
~~~
ewams
Know the big dish from James Bond? Used legit:
[http://observer.com/2014/08/civilians-in-abandoned-
mcdonalds...](http://observer.com/2014/08/civilians-in-abandoned-mcdonalds-
seize-control-of-wandering-space-satellite/)
~~~
rurban
Nope, Arecibo is not used for this. The Chinese FAST would be perfect for this
also.
~~~
ewams
You are correct this wasn't the article I was looking for, didn't read it to
verify. I can't find it now but 1-2 years ago some people used that dish to
reactivate a "lost" satellite that had not been used for several years.
------
muxator
Wild engineer's dream: where can I work on something of the same coolness
level? High scientific content, total quality, no compromises whatsoever.
~~~
ISL
I'll burst the quality bubble now:
There were absolutely compromises, and the engineers who shipped it knew
myriad ways in which it could be better.
To answer the question:
Find something you think is equally cool, learn what you can about the field,
be open to the possibility of a pay cut, and start knocking on doors.
Sometimes doors open. Work really hard when they do.
~~~
InclinedPlane
Two months before the Voyager probes were shipped for launch it was found out
that Jupiter's magnetic fields and radiation environment were a lot stronger
than anticipated, in excess of what the spacecraft were designed to tolerate.
They ended up wrapping many of the cables on the spacecraft with aluminum foil
bought from a local supermarket as a protective measure.
------
outside1234
I love that every time I read one of these articles it is extending the life
of the mission by 2-3 years. :)
My favorite use of tax dollars ever.
~~~
elliotec
What do you mean? Any context here?
------
mark-r
When you're far enough from the sun to avoid thermal cycles, in the vacuum of
space, what forces will cause things to deteriorate? If I wanted something to
last forever, deep space would be the perfect place for it.
~~~
ISL
- It is a tough radiation environment. Materials are embrittled and altered
over time.
- Many materials age on their own; plasticizers outgas from plastics. You're
probably familiar with how once-pliable but now old/aged plastics, even kept
sealed in a box, can become crunchy and brittle.
- Chemistry doesn't stop in space. The thrusters are likely to involve
chemically-reactive materials. Any little bit of corrosion, stress-corrosion
cracking, etcetera could cause anomalous performance.
It is wonderful when things still work.
~~~
jacquesm
And then there are vacuum welding and interstellar dust.
~~~
comstock
Interesting, I didn’t realize cold welding was a problem in space. I assume it
can be solved by applying a coating to surfaces that might come in contact
with each other.
------
manmal
How does Voyager know where Earth is, such that it can position the dish
correctly? Does it use inertia to keep track?
Also, at such an enormous distance, I’d expect very minor dish positioning
errors to result in the loss of the link. It's awesome that they had the
skills to build something like that in the 80s.
~~~
jacquesm
It doesn't need to point its dish at Earth; it can simply aim for the Sun. It
is so far away that the two are as good as in the same position relative to it.
And the Sun is conveniently lit up, making aiming a lot easier.
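A rough check on how close those two targets really are, assuming the
~21 billion km distance quoted elsewhere in this thread (about 140 AU):
    import math

    distance_au = 21e9 / 149.6e6                      # ~140 AU from the Sun
    sep_deg = math.degrees(math.atan(1.0 / distance_au))
    print(sep_deg)                                    # ~0.4 degrees at most
So from Voyager's point of view, Earth never strays more than about 0.4 degrees
from the Sun, and the two nearly overlap inside a 3.7 m dish's main lobe.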
~~~
manmal
Did they have digital light detectors in the 70s? I mean, it's trivial to aim
something at the sun now, but they didn't have image recognition then. A
primitive heat sensor, perhaps?
~~~
jacquesm
Yes, CdS cells are much older than that, and Ge diodes are light-sensitive
too.
------
acheron
Primary source here:
[https://www.jpl.nasa.gov/news/news.php?release=2017-310](https://www.jpl.nasa.gov/news/news.php?release=2017-310)
------
jhallenworld
Voyager's antenna is aimed at the sun (because at its distance, earth and sun
have close angular separation). So this means the command signal has to
overcome microwave noise from the sun.
So this is the same problem as SETI. I know for SETI they reduce stellar noise
by assuming that aliens are transmitting in very narrow bandwidths.
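The narrow-bandwidth trick works because thermal noise power grows linearly
with bandwidth (N = kTB); a quick illustration, with the system noise
temperature picked only as a placeholder value:
    k_boltzmann = 1.380649e-23      # J/K
    t_sys = 30.0                    # assumed receiver system temperature, K

    for bandwidth_hz in (1e6, 1e3, 1.0):
        noise_w = k_boltzmann * t_sys * bandwidth_hz
        print(bandwidth_hz, noise_w)
    # every 1000x cut in bandwidth lowers the noise floor by 30 dB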
------
tempodox
Fascinating. Now, that's what I call Engineering. And +1 to the author for
using the metric system :)
------
oskarth
Lisping at JPL (Jet Propulsion Lab) is an interesting read:
[http://www.flownet.com/gat/jpl-lisp.html](http://www.flownet.com/gat/jpl-
lisp.html)
It is written by lisper on HN, who submitted this article :)
------
byron_fast
The speed of light sucks. What patience.
~~~
b3lvedere
It's way too slow :)
[https://www.youtube.com/watch?v=1AAU_btBN7s](https://www.youtube.com/watch?v=1AAU_btBN7s)
------
Retric
19 hours and 35 minutes * speed of light is a long way.
It's not pointed in the right direction, but it's interesting to think of this
as just over 1/2000th the distance to the nearest star.
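The parent's arithmetic checks out; a quick sketch, taking Proxima Centauri at
about 4.24 light-years:
    c_km_s = 299_792.458
    voyager_km = (19 + 35 / 60) * 3600 * c_km_s   # 19 h 35 min of light travel
    proxima_km = 4.24 * 9.4607e12                 # ~4.24 light-years in km
    print(proxima_km / voyager_km)                # ~1900, i.e. just over 1/2000th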
------
Cshelton
Would it be possible to use these thrusters, or even the main thruster, to
speed Voyager up or slingshot it around something right before it loses power to
communicate back to us?
It'd be cool if, right before we lose contact with it for good, we got it to go
as fast as possible so that it will reach who knows where someday... just
slightly faster.
~~~
ceejayoz
> slingshot it around something
It's in interstellar space. Barring a really lucky encounter with another
interstellar body, it'll be another 40,000 years before it's close to
anything.
~~~
Cshelton
I mean, we are finding Pluto sized objects right outside our solar system all
the time now.
~~~
eesmith
It doesn't seem like it's all the time. Pluto is the largest, Eris is close.
Haumea, (225088) 2007 OR10 and Makemake are the only others at least 1/2 the
diameter of Pluto.
We are finding more trans-Neptunian objects. There's about 2,500 of them. But
as Adams said, "Space is big. Really big. You just won't believe how vastly,
hugely, mind-bogglingly big it is."
Voyager can only be angled a smidgen from its current path. There's nothing
it can reach to do a slingshot.
If Voyager could change its path by 0.1 degree (which it can't), and if all of
the trans-Neptunian objects were equally distributed around the Sun (which
they aren't - and Voyager is going out of the plane of the ecliptic), then
that's still only a 0.2% chance of having something in its path.
Even if there were, Pluto gave New Horizons about 5-6 m/s boost. That's
effectively nothing compared to Voyager's 17 km/s speed.
[https://www.youtube.com/watch?v=Hm6ga-g9ACU](https://www.youtube.com/watch?v=Hm6ga-g9ACU)
via [https://space.stackexchange.com/questions/10087/did-the-
plut...](https://space.stackexchange.com/questions/10087/did-the-pluto-flyby-
give-new-horizons-any-significant-gravitational-boost) .
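To make the 0.2% figure above concrete, here is the same estimate spelled out.
The 0.1 degree deflection cone and the ~2,500 known objects are the parent
comment's assumptions, and spreading them uniformly over the sky is the
generous simplification it already flags.
    import math

    cone_half_angle_deg = 0.1       # assumed maximum deflection
    n_objects = 2500                # known trans-Neptunian objects (rough count)

    theta = math.radians(cone_half_angle_deg)
    cone_sr = 2 * math.pi * (1 - math.cos(theta))   # solid angle of the reachable cone
    fraction_of_sky = cone_sr / (4 * math.pi)
    print(fraction_of_sky * n_objects)              # ~0.002, i.e. about 0.2%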
------
signa11
just as a sense of scale, this distance is not even a light-day away! i am just
_amazed_ that we can still "hear" it.
anyone have more info on the power with which the signal arrives here at
earth? and how do they make sure that ambient/thermal noise doesn't corrupt it?
thank you!
~~~
calgoo
You can normally see them communicating here:
[https://eyes.nasa.gov/dsn/dsn.html](https://eyes.nasa.gov/dsn/dsn.html)
------
keithflower
There's a wealth of fascinating information including book series on deep
space communication at the JPL DESCANSO Deep Space Communications and
Navigation Center of Excellence:
[https://descanso.jpl.nasa.gov](https://descanso.jpl.nasa.gov)
------
lordnacho
What does it fire out when firing the thrusters? How come there's propellant
left?
~~~
iaw
All the references I found on Wikipedia say it uses hydrazine thrusters.
As to why there's propellant left: it's a sealed system and NASA has
intelligent engineering practices when it comes to resource planning.
~~~
garmaine
No seal is perfect, especially without standard pressure surrounding it.
~~~
mrguyorama
The fact that there is propellant left implies that the seals were good enough
------
cmsonger
I'm too old now, but I so wish I could have been a young engineer on such a
project. The title is literally "37 years later." (And that's not counted from
launch.)
Amazing.
------
make3
Does that make it humanity's farthest controlled object? Literally
humanity's furthest influence.
------
joering2
> This week, the scientists and engineers on the Voyager team did something
> very special.
I would say that if you still manage to run a sophisticated system that is 37
years old, which is even more challenging when it's out in space, everything you
do is special.
------
verybadvoodoo
Everybody wept when they read this, right?
------
faragon
Voyager mission is so epic. A miracle.
InfluxCloud: Managed InfluxDB Clusters and Grafana on AWS - pauldix
https://influxdata.com/blog/announcing-influxcloud-fully-managed-influxdb-clusters-on-aws/
======
ramanathanrv
I was expecting this for a while now, but the pricing part isn't very clear.
Is it $149 per instance or for the solution as a whole? It would be better if
the article elaborated more on this.
~~~
sickeythecat
Check out:
[https://cloud.influxdata.com/users/sign_up](https://cloud.influxdata.com/users/sign_up)
you can adjust the configuration to suit your needs and the pricing moves up
or down accordingly.
Why do you use Foursquare (or Gowalla)? - nicholasreed
What is the appeal of Foursquare/Gowalla? Is the value in the competition, or exploring new places? What makes you pull it out of your pocket to check-in?
======
jasonlbaptiste
I deleted both off of my iPhone. Random people I've never met started adding me
on gowalla. Foursquare was cool, but the same people kept checking in
everywhere they went (gamestop, starbucks, pollotropical, etc.). The game
aspect was fun. I can say, there were 100x more people in the bay area using
it which showed the possible potential. The most fun I've had with it?
Screwing around with a group of 8 people and checking into cars, rival houses,
and causing chaos.
I do think the concept of checkins and such is something VERY powerful for
businesses. I think there's more use in "geo-physical social bookmarking" ie-
I can favorite and show people what I like, while getting deals from
businesses. The whole mobile social networking thing is just a feature to me
and I'm waiting for facebook to do it. What delicious did for internet
bookmarks, is what I see foursquare or some service like it doing in the
future for physical objects.
~~~
nicholasreed
Thanks J. I absolutely agree that the check-in concept can be very beneficial
to businesses, both for understanding customers and attracting new ones.
------
hackworth
i use gowalla (prettier) mostly just to kill time while i wait in line or for
the bill to arrive. and to stalk my friends. not much use beyond that, to me.
San Francisco median one bedroom apartment rent hits a new peak of $3,690 - vector_spaces
https://www.cnet.com/news/san-franciscos-outrageous-rent-hits-a-new-peak-highest-in-the-us/
======
dthal
Note that Zumper is _not_ a reliable source for median rent data.
CityObservatory wrote an article about their data problems a few years ago
[1]. Zumper's data is, of course, based on apartments that are for rent and
doesn't include currently occupied units. That alone skews high, especially
when there is a lot of higher-rent new construction hitting the market. Also,
it looks like Zumper's data skews towards higher-end neighborhoods.
For a broader look at the rental market, including occupied units and rent-
controlled units, you could just consult ACS data. That says that median rent
for all occupied 1-bedrooms in San Francisco was $1912 in 2017 [2].
[1] [http://cityobservatory.org/journalists-should-be-wary-of-
med...](http://cityobservatory.org/journalists-should-be-wary-of-median-rent-
reports/)
[2]
[https://censusreporter.org/data/table/?table=B25031&geo_ids=...](https://censusreporter.org/data/table/?table=B25031&geo_ids=16000US0667000&primary_geo_id=16000US0667000)
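A tiny, made-up illustration of the selection effect being described: when only
currently listed units enter the sample (and listings skew toward new, high-end
stock), the listings "median" can sit far above the median over all units. The
numbers below are invented for the example, not real SF rents.
    import statistics

    occupied = [1400, 1600, 1800, 1900, 2000, 2100, 2300, 2600]   # existing leases
    listed   = [3200, 3500, 3700, 3900]                           # currently advertised

    print(statistics.median(occupied + listed))   # median across all units: 2200
    print(statistics.median(listed))              # listings-only median: 3600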
~~~
fountainofage
I guess it depends on what question you're asking. You seem to be pointing out
that you think the real question is: "what is the median rent for anyone
living in San Francisco?" whereas Zumper seems to be more answering the
question: "if someone moved to San Francisco today, and has zero connections,
what rent would they pay?" Both are valid questions, but I think it's good to
set the context for if you're having a conversation about "general median
rent" vs "newcomer median rent."
~~~
ehsankia
Even then, it's not as clear. First off, a median carries the implication that
these are all "valid" places, whereas in reality, most apartments that are
still up for grabs probably aren't rented for a reason, so including
them in a median is misleading.
Next up, the data itself needs to be "fresh" for this to work. It could be
that cheaper apartments appear on the market frequently, but are rented right
away. So as someone new in the city, you could find these cheap places if you
looked for a bit, but a single survey will probably not catch these.
A better approach would be to look at the median of all _new_ rented
apartments throughout the year. That will give you an idea of what someone who
comes to the city will realistically pay.
------
askafriend
On one hand, it’s a very high price. On the other hand, no one is entitled to
a one bedroom unit in the heart of one of the human epicenters in the world.
Many people here have roommates or alternate living arrangements.
There are several reasons why the prices have gotten so high (lack of supply,
politics, etc).
But don’t expect to move your Texas ranch or Midwestern single-family-house
lifestyle to SF with no compromises.
But in return for your compromises, you will get quite a bit in return.
Is it worth it? That’s a very personal decision. For many, it’s not. For many,
it is.
~~~
rayiner
> On one hand, it’s a very high price. On the other hand, no one is entitled
> to a one bedroom in the heart of one of the human epicenters in the world.
> Many people here have roommates or alternate living arrangements.
New York is the “human epicenter” in the US. San Francisco is in the same tier
as Houston or DC:
[https://www.lboro.ac.uk/gawc/world2018t.html](https://www.lboro.ac.uk/gawc/world2018t.html).
Paying New York/London prices to live in a city that is not New York or London
is just sad.
~~~
genidoi
SF maximizes technological intelligence, Wall Street maximizes financial
intelligence. Plenty of overlap, but the top talent stays where the rest of
the top talent is.
~~~
rayiner
Having technological intelligence doesn’t make San Francisco a “human
epicenter” (whatever that is). I wouldn’t even call San Francisco a
technological epicenter. Even within computer technology, much of the most
important technological intelligence in the US is outside San Francisco: in
Portland (Intel), Seattle (Microsoft), San Diego (Qualcomm), and maybe San
Jose (Cisco). Put differently, what cities would be most devastating (in terms
of impact on US technological capabilities) to lose early in WWIII? I’d put
Boston or Seattle ahead of San Francisco.
~~~
Cyclone_
Agreed, SF has a lot of tech companies, but a lot of them don't do things that
are very valuable for society; many are social networking/advertising.
~~~
rayiner
I think consumer tech is valuable in the sense that our peace-time economy is
based on consumption, and San Francisco is great at helping us consume. But
when I hear the word epicenter I’m thinking “what’d put us back in the Stone
Age if we didn’t have it?”
------
NTDF9
I just visited New York on an extended trip.
Most of NYC is better connected, more vibrant, more concerts, more sports,
more universities, more housing, more international people, more business
activity.
Yet, NYC is affordable with a "functioning" commute system. People on all
spectrum of income can live there.
The SF bay area has none of all that and yet is a super expensive wasteland of
suburbia. The city of SF is better, but not even comparable to major cities of
the world.
The only thing SF has going on for it is weather, outdoors (for which you need
a car and a parking spot) and tech companies.
For any young padawans considering SF, it has its attractions but you won't
know what a bubble it is until you get out. It's not worth spending your one
life in the bubble.
~~~
sonnyblarney
SF is not the Bay Area, the data often gets confused.
And SF is oddly very, very different from San Jose or Palo Alto - they're
literally different climates, like 10 degrees differential on most days!
SF proper is a very small city, less than 1M people!
~~~
NTDF9
That is true. That's why I wrote:
>> The SF bay area has none of all that and yet is a super expensive wasteland
of suburbia. The city of SF is better, but not even comparable to major cities
of the world.
Most SV jobs are in the bay area. Even concerts, sports, and shows in SF are in
the bay area, not in SF proper.
SF proper is smaller than Jersey City or Brooklyn, with fewer walkable
stores/restaurants and fewer community activities, and anyone not in a
rent-controlled apartment cannot live there without stress.
------
yingw787
I think high rent prices are bad for a number of reasons:
Less flexibility in starting a company: You can't choose to bootstrap unless
you do it as a side hustle at a megacorp, and it's hard to not compete with a
megacorp because it does everything. You have to take VC funding and follow VC
rules to some degree, and it may not be a direction you wish to follow. This
is a feedback loop.
Oligopsony employment conditions: If you wish to build savings and not spend
all your money on rent, you can really only work for the largest and best
funded companies, and even then you need to negotiate at a disadvantage, since
they have far more information on the employment markets (because they're big)
and they each know the other big players (because they're few) and can bound
salaries through price leadership. This is also a feedback loop.
I'm concerned we will see fewer "real" companies like Intel/Apple, and more
Snapchat clones and e-scooter startups. Less meaningful bets, less drive, and
more complacency -- borne out of the desire to cannibalize social trust (e.g.
if you work hard, you earn shelter) and the societal fabric (e.g. all people
deserve shelter as a basic human right) for privatized financial gain.
~~~
whatshisface
> _all people deserve shelter as a basic human right_
I hear this a lot and it still makes absolutely no sense. Where will they get
the shelter from? What if the builder doesn't want to build a house for that
person? Is the plan to force the labor out of them? Every positive right that
I have ever seen comes along with an implicit demand that somebody else become
a slave.
~~~
Gibbon1
What will you do when it happens to you? That you have no place to live?
Because it will.
~~~
GhostVII
What makes you think that will happen to them? Most people don't end up
homeless at some point in their life.
~~~
Gibbon1
More people than you think and it gets more common over time.
------
ksec
Excuse my ignorance. SF doesn't look anything like Hong Kong, possibly the
most crowded place on earth, and it is not close to Tokyo or Seoul either. It
has plenty of space; for many coming from Asia, most of SF looks like
countryside, so to speak.
Why can't there be more housing development on the edge of SF, or taller
buildings with many smaller apartments, etc.? Underground parking lots?
Edit: Ok, is it because it is in an earthquake zone?
~~~
olalonde
From what I've heard it's very difficult to develop new housing in San
Francisco due to various regulations. Apparently, you even need to apply for
permits to change your windows. Having lived in Hong Kong and Shenzhen, San
Francisco does feel like country side.
~~~
Gibbon1
> Apparently, you even need to apply for permits to change your windows.
Needing a permit to install windows is a thing everywhere. Because if you want
a really good way to ruin a house, installing windows wrong is a great way to
do it.
~~~
olalonde
Everywhere in the world or everywhere in California? If the former, I find it
amazing that house owners ruining their house by replacing windows is a
universal problem and that requiring permits solves it.
~~~
Gibbon1
Pretty much every developed part of the world requires permits for such
things. Why permits? Because building codes are as old as civilization.
[https://www.lexology.com/library/detail.aspx?g=5181e80b-f307...](https://www.lexology.com/library/detail.aspx?g=5181e80b-f307-42e6-a357-c2d081b678ff)
------
towaway1138
When headhunters contact me about SF (or even Bay Area) jobs, I just scoff and
shut them down. If you're fresh out of school, no family, and willing to live
in a hovel for some interesting life experience, sure. But if you're an adult,
it's no way to live. And putting kids through it is frankly unethical.
~~~
stale2002
Well it depends.
I'll agree with you that if you work at a rando tech startup, the costs of
living are going to be quite high, and make it probably not worth it.
_But_ , the whole equation changes if you are working for a big 5 tech
company, or _hot_ unicorn company.
These companies will pay you double what the rando startups pay. Who really
cares if you are paying 40k a year on rent, if your total compensation breaks
250K.
And getting 250k+ as an engineer, with a couple years experience, is only
really possible in the Bay area.
~~~
moftz
But the cost of living is so high there that the inflated salaries are pretty
much required to bring anyone out there. A company wants to have competitive
hiring so they offer a little bit more than competition so what was a $200k
job is now $220k. Prices increase to what these inflated salaried workers can
afford and the cycle continues. It will be incredible to watch when SF finally
starts to allow for more high density housing. Housing prices will fall and
the salary bubble will pop. A lot of companies will start cutting back on
starting salaries and as a result, experienced workers will start either
taking paycuts to switch jobs or be first in line for layoffs because they
cost the company too much.
~~~
nostrademons
This gets causality backwards. Employers don't give a shit about whether
employees can afford housing - if they did, we wouldn't have such a big
homelessness problem, and our teachers/firefighters/policemen wouldn't live
over the mountains in Pleasanton. Employers pay enough to prevent employees
from going to a competitor. The reason FANG salaries now hit $300-400K is
because Facebook refused to play ball with the wage-fixing cartel that
Schmidt/Jobs/et al had worked out, then the DoJ cracked down, and so now
that's what you need to pay to keep your software engineers from working for
someone else. Then you have a lot of tech workers with extremely fat salaries
and not enough housing for everyone who wants to live here, so they bid up
rents & housing prices until they're unaffordable for all the middle-income
folks who used to live here.
If more high-density housing was built, then rents would fall, middle-income
people could live here again, and tech workers could pocket their fat salaries
as savings (or live extremely lavish lifestyles). They would still be paid a
lot, because there'd always be another tech company around the corner willing
to bid up wages to their current level. For the salary "bubble" to pop, a lot
of these companies would have to go out of business at once. This was exactly
what happened in 2001 and 2009, and it had the predicted results: unemployment
spiked, wages fell, rents fell, and people moved out of the Bay Area back to
wherever they came from.
~~~
ascar
If that were actually the case you would have these salaries all around
the world, but you don't, because it's mainly a few big, highly profitable
companies (FAANG) that can afford these salaries, plus a big VC- and
stock-market-funded bubble around them. And even FAANG pays much lower salaries
in cheaper regions, because high-tech employers do have to give a shit about
housing. You can't compare it to teachers.
I think it's a dangerous attitude to think the salaries in CA are the actual
worth of a software developer's work. It's just what FAANG can afford to pay.
For many companies, devs at that price point are not economically sustainable.
~~~
nostrademons
It absolutely is what FAANG (and newer companies like
AirBnB/Uber/Lyft/Stripe/DropBox etc) can afford to pay. And what FAANG can
afford to pay is not going to change if more housing gets built in the Bay
Area. And yes, for many companies devs at that price point are not
economically sustainable, but that's why FAANG are sucking all the oxygen out
of the startup developers market and maintaining their oligopsony.
(Also, while there's some variance for cost-of-living in FAANG salaries in
different regions, it's not _that_ much. Google developers make less in
Beijing than they do in Mountain View, but their salary still looks much more
like a Google salary than a Beijing salary.)
------
sonnyblarney
It's an existential issue.
"Peter Thiel: The vast majority of the capital I give companies is just going
to landlords" [1]
[1] [https://finance.yahoo.com/news/peter-thiel-vast-majority-
cap...](https://finance.yahoo.com/news/peter-thiel-vast-majority-capital-give-
companies-just-going-landlords-134709786.html)
~~~
adrianN
He could try giving capital to companies in other locations. Rent in Berlin is
less than a third of what you pay in SF.
~~~
gumby
True, but having lived in Berlin (Prenzlauer Berg) and the Bay Area (Palo Alto) I
have to tell you the scenes, and the intensity, are very, very different. SV in
particular (more so than SF) is _very_ tech oriented, while Berlin is more
e-commerce and food delivery.
Thiel in particular is German and has made some particularly scathing comments
about the German startup scene (to the German press).
------
pcurve
"The federal Department of Housing and Urban Development said last year that a
family of four earning up to $118,400 qualified as "low income" in the city. "
For comparison, in NYC, it's $83,450.
~~~
cm2012
Median household income in NYC is 50k, which is an interesting contrast.
------
sys_64738
I always wonder how those who are not in the high paying jobs can afford to
live there. I'm talking about police, fire, EMS, coffeehouse workers, etc.
~~~
jdavis703
They don’t. For example in nearby Atherton the cops have made makeshift
sleeping quarters in the police station because they cannot afford to live in
the city they work. Many of the homeless in tents or RVs have jobs in the
city, the sole reason they’re homeless is because there are no homes
affordable on a low-wage job.
~~~
puranjay
Why aren't these people out in the streets protesting this? It seems so wrong
~~~
rightbyte
It takes a while before you realize it's wrong and unsustainable. And then
it's too late, because you have been a waitress or whatever for 3 years, and
when you move out there are plenty of people willing to take your place before
they realize it, and so on. Big city hype.
And there are some real natives with rent-controlled housing.
I lived in Palo Alto for half a year before I even thought about how the
grocery store clerks could afford to live there (which they don't). You also
notice the lack of children after a while.
------
refurb
The data is suspect. Anecdotally, and from browsing the Craigslist
classifieds, the rental market seems to have softened somewhat since the peak a
year or two ago.
Specifically looking at the Mission, there was a time that a sub-$3000 one
bedroom was very rare. Now you can find plenty in the $2500-$3000 range and
even some decent ones below $2500.
There is also more inventory.
------
ScottAS
Where I live in Atlantic Canada you can rent the same for $550 USD, or just
buy a brand new modern home for $180k USD. People here work remotely for
companies like Google. The government will also help fund your startup’s
product development through various programs (hundreds of thousands of
dollars).
Always wanted to move to San Fran but starting to feel pretty grateful about
being here.
~~~
shdh
How large is the remote ecosystem there?
------
seanhandley
I have a 4-bedroom house in the country and the _mortgage_ for it is 5 times
smaller.
Absolute madness.
------
superkuh
And in addition to that, because it's San Francisco you don't really own your
own home or apartment. You just have a set of things you are allowed to do with
it at the whim of the city (i.e., no Airbnb).
------
cm2012
For those who live in SF - what's the rent on a one bedroom 1 hour from
most jobs in SF? I know in NYC you can reliably get that for around 1500.
~~~
mavelikara
Walnut Creek is about 40 minutes away from SF on BART, so door-to-door that'd be
about an hour commute. Safe, picturesque, good schools and a vibrant downtown.
The rent there would be about $1800 for a 1BDR.
EDIT: As noted by the sibling post, the reason it is cheap is that Walnut
Creek does not have very many tech jobs. Also, the peculiar thing about the
Bay Area is that every 15 years or so the tech industry oscillates its
epicenter between San Francisco and the South Bay. From Walnut Creek you can
reach San Francisco in an hour, but if you have clients/partners/HQ etc. in
the South Bay it is one bad commute.
~~~
seppin
> Safe, picturesque, good schools and vibrant downtown
It's completely dead and devoid of culture.
------
scarejunba
You just have to pay attention. Last year this time people said it was $3400,
but I rented for $2300 that May.
~~~
akhilcacharya
It mildly annoys me that people in SF are getting cheaper rents than I am near
Boston.
Are these luxury places?
~~~
scarejunba
They are not. It's just a standard 1 br. SF is a bit peculiar. There are some
things I should call out which are different in most rentals in comparison to
luxury rentals:
* No Washer/Dryer in the unit. You'll usually share.
* No parking spot.
* No-one will be around to pick up parcels for you.
* You'll have to put the rubbish bins out for the bin man.
* The stove and fridge and heating won't be using the latest stuff. You're not expecting to have a Nest or a Viking but you won't get a nice stove either, it'll be very basic.
Honestly, it's a downgrade in standard of living for me but not to the point
that it annoys me. Those things were just luxuries. Really the washer/dryer
being in my unit is the one thing I really miss.
------
arisAlexis
Barcelona, Amsterdam. Same/better quality of life
------
gumby
I don't want to install the app just to find out: hasn't this been true in Palo
Alto for some time? My ex pays more than that and her place is hardly deluxe.
~~~
akhilcacharya
Palo Alto is particularly expensive because of how luxe the area is (from what
I understand).
~~~
gumby
I first moved to Palo Alto in 1984 and can assure you that luxe it is not.
Locate with IP Geolocation - lamerske
https://getipdata.com/
======
evlinston
Excellent experience. Quite precise geolocation and good support
How I Learned to Write Electrical Engineering Specifications - e-_pusher
https://iskender.ee/EE-Specs/
======
a12k_
“ became clear to me that learning how to write a specification was going to
be my hardest learn-on-the-job experience yet. Why? Because I was actually
having an unlearn-on-the-job experience.”
—> the concept of unlearning in order to progress is really interesting; it seems
underrated as a mechanism for how someone becomes a better engineer.
“Back during school, and when I first started working, I used to think that
writing specs was a useless exercise. What even was the point? Just design
your system in your head, talk it through with others if you need to, and go
build it. How could you even know all about your system before you start
building it anyway? Any spec you could write would just be incomplete and not
good. Writing specs was an unnecessary, bureaucratic task best to be avoided.
Time spent on doing “paperwork” would be better used by actually building your
product.
The sentiment above is still common in the technology industry, which I think
is a shame. After three years in the industry, I have changed my mind, and now
believe that a having good specs is critical for making your project a
success.”
—> I like this take, it’s a good example of “tradition is a lot smarter than
you think”
.. more takes of where tradition in engineering is actually underrated would
be really interesting
------
sly_emergence
“ Imagine now that your specification gets magically air-dropped into the
Huaqiangbei. Only your spec - not your product. If a month later, successful
clones of your product emerge in the Huaqiangbei, then you have succeeded in
writing a good spec.”
Cool litmus test.
At the end of the day a spec is a communication tool to tell other humans how
to pull off the original vision. It’s more efficient than words and talking.
It’s a medium.
Show HN: Crowd Puzzle - Realtime puzzles with friends - x0ner
Hey guys,<p>This is my first true release of something I feel is pretty cool to HN. I started building this site with the idea that pieces could be purchased through micro payments. Winners of solving the puzzle would get some sort of prize and could build up credit on the site. My dreams of this were shattered by the ambiguous gambling laws of the U.S. With that said, I have no clue what to do with the site, but I haven't seen the concept anywhere else before.<p>You can register using Facebook or the traditional login and you will be given 50 pieces and 50 guesses. You can use those to reveal tiles on the puzzle (updates realtime) or solve the puzzle when you think you know what it is. There are currently 3 puzzles loaded into the system and they will just cycle through each other if there is a winner. I understand the puzzle subjects do not have a definitive answer, but I was planning on using artwork or people as the material as there is only one answer.<p>Like I said, I am not sure what to do with the site, but someone else might. Even though I have done a lot of testing, I still consider the site to be in beta. If you find any bugs, I would love to hear. Feel free to shoot me an email, spread the site to friends, or comment on what you think. I am collecting metrics on the puzzle solving to help me better understand how a user plays, so hopefully some good will come out of showing.<p>http://crowdpuzzle.9bplus.com/
======
us
Clickable: <http://crowdpuzzle.9bplus.com/>
How DreamHost Builds Its Cloud: Selecting Microprocessors - randoramax
https://www.dreamhost.com/blog/2016/09/26/dreamhost-builds-cloud-selecting-microprocessors/
======
itomato
I expected more from you, DreamHost. This blog reads like a first-time DIYPC
build retrospective.
> The most common instruction set you will find in a data center is
> x86-64...Almost all consumer laptop, desktop, and enterprise servers are
> made using processors based on x86-64. Because of this, we decided to use a
> processor based on x86-64.
> To pick the processor we wanted to use, we needed to take into consideration
> a few other factors. But… more on that in another post coming soon!
Tell HN: You know who needs content? Patients in hospitals - caseyalbert
I just met an elderly woman in the hospital today. She has been bedridden for a month, in serious but stable condition. In speaking with her, the thing she longs for the most, besides company, is something to read. Books, magazines, anything positive, uplifting and informative.<p>What I'm suggesting to all entrepreneurs is you could be doing a service in your community, by handing out magazines and books to the many patients who enjoy reading. They have the time to read it.<p>It would also be an opportunity to bring awareness to your own brand. If it's content you produce, maybe you could package that up and distribute that with a stack of other magazines and books (as long as it's not too esoteric).<p>The main reason you would want to do this is to bring relief and a smile to the faces of those who are sick and injured. They are often bored and lonely. Yet, depending on what you love to do and what your passion is, you might be able to share that as well.<p>Maybe someone here can work with this idea.
======
latch
Honest questions (and from these questions you can probably tell that I
haven't spent much time in Hospitals):
1 - Are you saying mere access to content is the problem? Do Hospitals have
limited reading material? Do libraries not provide outreach services to
patients? Do friends/family not bring content?
2 - Is there a need and/or market to create/aggregate content specific for
this audience?
3 - Anything more specific about the content besides being uplifting? Seems
like a broad range of content is needed, which is proving a challenge to my
brain.
4 - Is there a technology angle to this at all? Hospitals aren't well
connected, old people (who account for a lot of people in Hospitals I'd
assume) aren't very tech savvy. I mean, given your audience here, is there
something we can do at a technical level to help?
~~~
Andrenid
As someone who has spent a lot of the last year in and out of hospital:
The problem for me was that all content distribution in the hospitals... the
TV, the magazine trolleys, the "library" (another trolley)... were all
controlled by small private companies who have made deals with the hospital.
The content is old, seriously lacking in variation (you can have any magazine
you want, as long as it's a women's gossip/variety mag), and it's always the
same stuff (a few basic cable TV channels that have been recorded, and set on
loop).
I had ENDLESS ideas while laying in the beds about technology-based ideas that
would make patients a LOT happier, from tablets filled with RSS-fed content
from major media companies (+ blogs), to a free WiFi service (if they bring
their own laptop/tablet/phone) that is sponsored by marketing, to a
subscription-based WiFi connection that gives them proxied web access and
where part of the profits go back to the hospital as a donation.
The truth of the situation is, the hospitals won't allow it. They're scared of
technology because of interference with equipment, and they're scared of any
new content providers in the form of books/magazines/etc because they'll upset
the decades-old partnerships with the current partners, and they're pretty
much just unable to even comprehend or process any "new ideas" due to the size
of the bureaucracy involved at your average hospital.
Again, this is just my view, from my local hospitals. It's different all over
the world, and maybe elsewhere there are hospitals that are more open to
experimenting with new ways to entertain patients.
~~~
axod
FWIW in the UK all hospital beds have their own TV/Internet/Phone device that
is attached to the wall and swings over the bed to where you want it. It costs
money to use, but you can at least watch what you want, browse the web, send
emails etc etc.
My time 'inside' was made much easier having that.
We're lucky in the UK though.
Ask HN: Does anybody have experience using Ubuntu 14.04 on a Macbook Pro? - agnivade
I have looked around in the internet and there are a couple of blogs showing how to do it. But there seems to be a lot of issues around it too. I dont want to buy a MBP and then regret because Ubuntu does not work on it.<p>How has your experience been ? If there are any good blog posts which has step by step instructions, that would be perfect.
======
merb
I have a Late 2013 model and currently the support is really good as of now;
a year ago it was really rough. The instructions are straightforward, and the
Arch Linux Wiki contains all the steps to resolve every issue.
agnivade
Thanks. Are you running 14.04 ?
Twitter Is Pushing Celebrities and Publishers to Stop Using Meerkat - simas
http://techcrunch.com/2015/04/12/this-stream-aint-big-enough-for-the-both-of-us/
======
ultimape
Lost me at 'sources say'.
------
tomglindmeier
Meerkat will have no chance against Periscope with the support of Twitter. In
two years we will hardly remember what Meerkat was. Sad but true.
~~~
zabramow
Not convinced either will last, but Twitter is effectively mashing on a pimple
and making it worse. Prediction: this will help Meerkat's brand + hurt
Periscope's.
How Did Things Ever Get This Good? (2009) - luu
http://prog21.dadgum.com/51.html
======
apta
Productivity is not the only argument. What happened to performance and
correctness for example?
~~~
Spearchucker
Both ( _especially_ the second) are subjective. In the context of an arbitrary
objective even BASIC may be a good choice.
Project Rider – A C# IDE - ingve
http://blog.jetbrains.com/dotnet/2016/01/13/project-rider-a-csharp-ide/
======
danielcarvalho
JetBrains is deadly. IntelliJ IDEA is simply the best tool I've ever used for
HTML, CSS, JavaScript, PHP, and PostgreSQL. Incredibly intuitive and it does
it all. It's what Photoshop is for designers.
This immediately gets me stoked for doing my Unity C# development with this.
* Edit. Initially thought this was an entirely new IDE, and not just a plugin. Less excited, but certainly worth a look.
~~~
gelasio
I've tried all of JetBrains tools over and over throughout the years since I
keep hearing about how much some people like their stuff - but every single
time I am disappointed at how over-engineered everything they do seems to be.
Furthermore, their cross platform tools like IntelliJ IDEA always wreak of
badly emulated native components that don't look or behave the way that they
should on any OS.
So for me - Visual Studio 2015 is the best tool I've ever used for HTML/CSS/JS
(and Node.js and C#). It simply outshines everything else I've tried. I guess
if I was forced to use an OS X or Linux desktop, I'd resign myself to using
JetBrains tools because they probably are the best thing you can find outside
of Windows...but as someone who prefers Windows and who wants native Windows
apps that behave idiomatically instead of just fulfilling the lowest common
denominator - VS can't be beat IMO.
~~~
danielcarvalho
I've had literally the opposite experience. Everything I seemed to need,
IntelliJ IDEA magically had. I just ended up uninstalling things like pgAdmin,
for example, since IntelliJ IDEA is just far superior at doing the same thing,
and it's just right there where you code. Convenient. Usually it started out
as, "I wonder if it can..." and then quickly find the feature that just does
what I want.
Even as a Windows user myself, I can't quite articulate what has always
bothered me about Visual Studio in general over the years. It seems to have
its own language, terminology, and way of doing things that you have to buy
into. And I really don't like that. Let me pick a folder, have that be my
project, and edit text really clever like. That's what I want.
~~~
dsp1234
_like pgAdmin, for example, since IntelliJ IDEA is just far superior at doing
the same thing, and it 's just right there where you code._
You don't mention the two things together, but VS has the server explorer
which you can use to hook in to a database and do administrative tasks (both
design and data viewing)
~~~
gelasio
Also pgAdmin is about as feature-ful as Notepad is so it's not hard to do
something a little better.
Besides server explorer, VS also has SSDT (SQL Server Data Tools) obviously
only for SQL Server, but I've never seen anything even in the same league as
SSDT for any open source database.
------
sjclemmy
This will have to be very good to be better than Visual Studio on Windows.
I've used Visual Studio 2013 a fair bit in the last 12 months and it's a
fantastic IDE. Microsoft are also developing their cross-platform offering
Visual Studio Code, which is a nice text editor and I imagine it's just going
to get better.
I am using another JetBrains product, PHPStorm, on the Mac at the moment for
some Front End Dev, which is another IntelliJ based product. This has recently
become quite sluggish and is starting to annoy me, after all, who wants the
tool to get in the way of the work. I hope this gets sorted soon.
That said, I think JetBrains do a great job focusing on develop tools and I've
used their dotPeek product to untangle the mess left by the absence of release
control in a recent contract.
~~~
louthy
> This will have to be very good to be better than Visual Studio on Windows.
Really? I feel MS has really dropped the ball with VS. VS2013 was OK, but not
great. A lot of random lock-ups in really annoying situations where you can't
even fathom why (like opening a text file). VS 2015 is actually awful - it
wasn't until Update 1 that typing this (below) in C# would pop-up an on screen
exception dialog:
class Foo : Bar
{
public Foo()
:
Pressing enter after the colon would throw the exception consistently. How on
earth was that missed in testing? This is such a common C# pattern for
invoking the base or this constructor.
I often find myself staring at the screen, saying in my head 'What the hell
are you doing now!'. The new DNX project system is unusably slow (compiling)
and the tooling for it is non-existent. And for me it still crashes all of the
time. Other devs on my team are having similar problems.
TFS and TFS explorer is a total joke. The 'offline mode' is slower than online
mode, and also causes crashes. If I lose my VPN connection then I have to
shutdown VS and restart before TFS will reconnect again (this has been an
issue for every version of VS I've used). Luckily we're moving away from that
travesty of a system to git very soon (the VS git integration is actually
pretty good).
I used to feel that VS was a really solid piece of tooling, and I agreed with
your sentiment that I quoted. I don't any more. I will check this out, and if
it works then I'll happily drop VS, two years ago I couldn't ever imagine
myself saying that.
Competition can only be a good thing here.
~~~
arethuza
"TFS and TFS explorer is a total joke."
I've been rather pleasantly surprised at how much support MS are giving git
and github - they offer git hosting on Visual Studio Online and the Azure
documentation is in github.
Edit: Added pleasantly! :-)
------
jsingleton
I'm unreasonably excited by this news. ReSharper is already an amazing tool
and I'm sure JetBrains can do so much more with complete control over the
whole IDE than with a plugin.
I've not been very impressed with Visual Studio 2015 so far. I particularly
don't like being forced to sign in to what should be an offline piece of
software. Also, be careful installing it on an existing system as it will mess
up your settings. I'm sticking with VS 12/13 and ReSharper for now but if
Rider looks good then I could be tempted to switch.
Android Studio is great and switching from Eclipse to IntelliJ was a good move
by Google. Being able to do cross-platform development with .NET on a first
class IDE is an alluring proposition. I doubt VS will ever be ported to Mac or
Linux despite CoreCLR as it relies heavily on WPF which is tied to the Windows
graphics system.
I assume Project Rider is just a code name. I wonder what the final product
will be called. IntelliSharp?
~~~
jbigelow76
>I particularly don't like being forced to sign in to what should be an offline piece of software.
With JetBrains moving to a subscription model for their software won't you end
up with the same problem?
~~~
jsingleton
I hope this isn't the case. The last time I purchased ReSharper it was a one
off payment. IIRC you got a year of upgrades included but if you wanted the
newer versions after that then you needed to keep paying a subscription
renewal fee. However, it wouldn't stop working after that time. You still kept
what you paid for.
I just had a quick glance at the ReSharper page [0] and it appears that they
still use the same model. They also offer monthly subscriptions but say that
"12 months of uninterrupted subscription payments qualify you for receiving a
perpetual fallback license.". BTW the cookie law notice on that page is one of
the best I've seen :).
I've used pretty much every version of Visual Studio and 2015 is the first to
have issues with licensing. I originally signed in with my MSDN account to
activate it (fair enough) and then signed out. Now it tells me that my license
has "gone stale" and I can't use it until I sign in again. There is an option
to enter a product key but MSDN tells me:
A product key is not offered with this edition of Visual Studio.
To unlock the product, you must sign in using the login associated
with your active Visual Studio subscription. By signing in, your
IDE settings will sync across devices, and you can connect to
online developer services.
My old versions of VS are still licensed with a key and work fine. It makes me
worry whether I will still be able to use VS 2015 when my subscription
expires. Maybe it's worth exploring the free community versions. I believe
that they no longer have the limitations on plugins that they used to have. I
don't mind paying for things but the time and cognitive overhead dwarfs the
financial cost.
[0]
[https://www.jetbrains.com/resharper/buy](https://www.jetbrains.com/resharper/buy)
~~~
chadgeidel
Huh, I thought you could use a product key with all versions. Learn something
every day I guess. I personally sign in, but I've seen the option.
------
erostrate
JetBrains products are very powerful, well designed, customizable, and overall
awesome in my experience (ReSharper and PyCharm). Looking forward to this! And
really hoping they will have a free edition, although I doubt it.
~~~
realharo
CLion isn't all that great. It still has major issues analyzing a lot of C++
code (false positive red squiggly lines everywhere), almost 1.5 years after
the initial announcement.
Although C# should be a much easier language to deal with, so this one might
turn out better.
~~~
taspeotis
ReSharper C++ (why they didn't call it RePlusPlus I'll never know) is also
sorely lacking in basic functionality. Compared to opening a C# project with
R#, opening a C++ project with R#++ - where none of the refactoring keyboard
shortcuts work - makes Visual Studio just feel dead.
(I love my ReSharper Ultimate license, and R#++ helps to fill in a lot of
Visual Studio's blanks as far as C++ goes, but really it needs to spend
another six months incubating.)
~~~
kmch
You should try Visual Assist
([http://wholetomato.com/](http://wholetomato.com/)) in case you haven't heard
of it.
------
admnor
Finally, a proper C# IDE on not-Windows. I've been working with ASP.NET 5 on
Linux for the last year and Sublime, Atom, VSCode, etc. are all painful.
Sublime is surely dead, and anything based on Electron kills my 2-core CPU and
my battery. I was in the talk here at NDC London where they demo'd Rider, and
I can confirm that it is very fast; the presenter showed the Mac process view
and it was barely hitting the CPU, even with a JVM and a Mono VM running. And when
it's all running on .NET Core it should be even faster.
Great days.
~~~
mariusmg
MonoDevelop is cross platform and decent.
"And when it's all running on .NET Core it should be even faster." The IDE is
written in Java and obviously runs on the JVM.
~~~
gorohoroh
Frontend is in Java (and Kotlin), indeed, as it's a part of the IntelliJ
platform. However, the backend that actually provides IDE features for C# is
written in .NET. The backend is actually the same ReSharper logic that runs in
Visual Studio but in headless mode.
~~~
winterbe
Wondering how much code in Intellij Ultimate is already written in Kotlin.
The share of Kotlin code in IntelliJ Community Edition is still kinda low:
[https://github.com/JetBrains/intellij-
community/search?l=kot...](https://github.com/JetBrains/intellij-
community/search?l=kotlin)
------
gravypod
Ever since I started reading the C# language docs I wanted to switch to it
from Java. Open sourcing some of the components of the language only pushed
that drive farther.
I'm glad to see that now a switch to C# is viable for me. I need a good IDE,
and nothing really comes close to the 'gold' standard, JetBrains products.
Bravo! Can't wait to use it.
~~~
FatalBaboon
Visual Studio is one of the best IDE out there, and it is likely to remain
better than Rider for at least a while.
It surely "comes close" ;)
~~~
alkonaut
VS is nice and powerful, but ironically it needs Resharper to match a
JetBrains IDE like IntelliJ for autocompletion/tooltips/navigation, which is
_severely_ lacking in Visual Studio. That's one objection. The other problem
with Visual Studio is that it's a beast. It takes multiple gigs of drive space
(which of course suddenly became a problem when everyone's disks shrunk from
1TB to 128GB overnight a couple of years ago, at the same time VS grew to
around 15GB...). This is being worked on, but it will be a few more versions
until the installer is good, and it will never be as small as an IDE that
isn't carrying years of Windows legacy stuff. In short: VS is one of the best
IDEs out there, but it would be a lot better (at C#) if it contained
ReSharper's features and weighed in at 1/10 of its size. And that is exactly
what this looks like.
~~~
robwilliams
I've never used Resharper but Visual Studio's Intellisense is incredibly
powerful. What's lacking in its autocomplete support?
~~~
to3m
I'm not the same guy, but what I've found:
\- No substring completion (e.g., like emacs's ido-mode), no fuzzy completion
(e.g., like various JS-based editors), no acronym completion (e.g., like
emacs) - a rough sketch of these matching modes follows this list
\- Navigation bar searches are shit. You have to select namespace, then class,
then symbol, and there seems to be no keyboard shortcut for any of this stuff.
It should present a list of entities in the file, display fully-qualified name
of each, and let one search that list using the search mechanisms
\- Class view searches aren't great. They solve many of the above two problems
(the full list of entity names is searched, and it finds by substring), but
the results don't update in real time
\- Very hard to find files. Say you have a massive project with loads of
solution folders and you need to find that file that's got the name
"ProductScreen" in it - well, good luck! There's something I've seen some
people do with the toolbar-based Find in Files widget, but that only searches
by prefix, which is useless, because so many projects have a mandatory prefix
on their file names
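To make the completion point concrete, here is a minimal sketch of what
substring and acronym (camel-hump) matching mean. This is my own toy
illustration, not ReSharper's or Visual Studio's actual matcher, and the class
and method names are invented for the example:
    // Toy matchers only; real IDE engines also rank and combine results.
    using System;
    using System.Linq;
    static class CompletionMatch
    {
        // Substring match: a query of "duct" finds "ProductScreen".
        public static bool Substring(string candidate, string query) =>
            candidate.IndexOf(query, StringComparison.OrdinalIgnoreCase) >= 0;
        // Acronym (camel-hump) match: a query of "ps" finds "ProductScreen"
        // by matching against the candidate's capital letters.
        public static bool Acronym(string candidate, string query)
        {
            var humps = new string(candidate.Where(char.IsUpper).ToArray());
            return humps.StartsWith(query, StringComparison.OrdinalIgnoreCase);
        }
    }
Fuzzy matching is a looser variant of the same idea that also tolerates gaps
between the matched characters.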
When I was a regular Visual Studio user - less Windows work of late means I've
mostly been using emacs - I used to use Visual Assist
([http://www.wholetomato.com/](http://www.wholetomato.com/)), an addin that
improves the above functionality a bit. Visual Assist's code completion is a
bit intrusive, but it does the acronym completion thing; for navigation, its
navigation bar replacement lets you search fully-qualified names by
substrings, its class view-style functionality updates the symbol list
(searched by substring) in real time, and you have something similar for
finding files as well.
All of this stuff is great for finding your way quickly around an unfamiliar
project - i.e., any project with more than 15 programmers, even after you've
been working on it for 2 years. And even when you know exactly what you want,
at least you don't have to keep typing in that stupid project-specific prefix
everywhere.
(I'm happy for the Visual Assist people that MS hasn't just copied their
functionality exactly and totally put them out of business in one go, but it
does make me a bit mistrustful of the Visual Studio UI team's judgement.)
~~~
henrikschroder
ctrl-comma, ctrl-comma, CTRL-COMMA!
It does all the searching you want for filenames and symbols, substring
searches, matching, you name it. It's fantastic, and they should highlight it
way, way more.
As for your file finding problem, I don't know why you didn't start typing in
the search field on top of the solution explorer. It filters everything in the
solution explorer, and also does substring matching.
~~~
to3m
Thanks, interesting. These both look helpful. Wonder if I've missed anything
else important.
------
dawkins
People praise IntelliJ a lot, but my experience with Android Studio and its
slowness always reminds me that it's written in Java.
I have been developing with MonoDevelop for the last 4 years, since I switched to
mono/ubuntu, and I don't see anything here that it doesn't have already: cross-
platform, excellent refactoring support, multiple targets. Maybe it's because
of my particular use case, but I even prefer it to Visual Studio.
~~~
yoz-y
I have had a similar experience with Android Studio. I found it very buggy and
confusing since it constantly switches contexts and (seemingly) randomly re-
shuffles the UI (building, debugging, running). I also found most of the
pictograms thoroughly unhelpful. Nevertheless IntelliJ has a very good
reputation so I suppose it is not the platform's fault.
~~~
V-2
Pictogram toolbars are an idiotic invention, but you can get rid of them
easily. Android Studio is highly customizable.
It's not perfectly stable, but out of my experience (a bit dated now I admit)
neither was Visual Studio. Eg. its infamous XAML designer is still haunting me
in my dreams.
As far as bread and butter of an IDE goes, meaning navigation, refactoring
etc. JetBrains' software, along with its derivatives (Android Studio is a
fork), is superior to VS imho.
Then again it's a bit of apples vs. oranges here, VS is more heavy-weight and
powerful, with its SQL integration and many many other features.
------
cromantin
Finally. It would be great if Unity3D added an official plugin for this IDE. For now
Consulo[1] is a competent alternative (built by some cool dude using IntelliJ
Community Edition).
1: [https://github.com/consulo](https://github.com/consulo)
~~~
rplnt
I believe Unity3D is working on some sort of common way for any ide/debugger
to be plugged in. With that and the deals they have with MS I don't see them
adding support for another IDE.
~~~
edgarjcfn
I do all my Unity3D development on a Mac with VSCode[1] and it's really
powerful. Includes step-debugging out-of-the box, and it's free.
[1][https://code.visualstudio.com/](https://code.visualstudio.com/)
------
chvid
A fully capable, cross-platform IDE for C# built in Java / running on the JDK.
And since it is by JetBrains it probably will be the best C#-IDE available.
Ironic but great news.
:-)
~~~
drewnoakes
The front end runs on the JVM. It interops with a C# executable in the
backend, hoisted from ReSharper.
------
SuddsMcDuff
I've been a .NET dev for over 10 years now, finally there's an alternative to
the huge, monolithic, incredibly bloated Visual Studio, which I've grown to
hate with a passion.
There have been contenders over the years, monodevelop, sharpdevelop, most
recently the OmniSharp plugin with atom.io. This is the first time I can
realistically see myself being free from VisualStudio in a professional
context.
~~~
alyx
Can you be more specific about what you hate?
I've been a .NET dev for over 10 years as well, and for all its faults, VS is
one of the best dev environments I've ever worked in.
~~~
SuddsMcDuff
It's the incredible bloat of the thing. It's weighed down so much by features
that I do not want or need. I don't want SQL Server express & all the
associated bloat that comes with that. I don't want IIS express and all the
associated bloat that comes with that. I don't want freaking Entity Framework
and its associated GUI designers. I don't want bloody "Blend for Visual
Studio", I thought that crap was dead. I don't want TFS explorer - TFS is crap
and everyone knows it, including Microsoft. I don't need a WPF designer UI, I
don't need a WinForms designer UI. I certainly don't need a WebForms designer
UI (shudder). All I want is a text editor that lets me write & debug code.
~~~
interdrift
Although I don't share your issues with all the 'bloating' stuff (as others
mentioned, most of them can be avoided), I have a growing concern about the
performance of VS while working with a bigger codebase (120+ projects). It
gets really really slow and full of loading screens which literally block you
from doing anything. In fact you can sit there and wait 5+ minutes sometimes
just to have the IDE let you do anything. Not to mention the infinite amount
of times I've had Visual Studio close while I'm writing and it leaves my code
in an unknown state. Oh yeah, not to forget the debugger exception which
causes your debugger simply to detach even after you've spent around 5 minutes
to get to the precise location of the program. Whoa, turns out I have quite a
bit of hatred stored in me too! :D (VS2015)
~~~
s73v3r
I would have to ask why you have 120+ projects. That's likely your problem;
you're doing far too much within one solution.
~~~
lenkite
120 is not that high. If you have fine-granular modules for re-use across
multiple product lines, you need lots of projects. Yes, one can create several
solutions for each line, but many times one needs to load all the projects,
especially for feature teams that touch a little bit everywhere.
------
Guillaume86
Nice to finally have some alternatives in the same "heavy weight" C# IDE
category.
Very curious about some aspects:
How do they deal with Nuget, do they have a powershell (Pash) console
integrated?
Do they plan to integrate Roslyn ("code fixes" for example, there would be
some overlapping in functionality with their code but Roslyn will probably
become the "standard" for sharing these things)?
~~~
admnor
No, they don't plan to integrate Roslyn and they tend to get a bit cross when
people ask. They've been working on their C# code intelligence engine for 10
years in ReSharper, and it does stuff way beyond what Roslyn provides at
present. I love Roslyn, and I'm using it for code analysis stuff, but it's not
a contender for R# yet.
------
m_fayer
As a Xamarin developer, I'm really hoping that this plays well with Xamarin
and thus liberates me from having to deal with the buggy and invasive Windows
10. I'd much prefer to be on Mac but once you're used to Visual Studio,
Xamarin Studio is really a bitter pill to swallow.
Can't wait!
~~~
txdv
What is bad about Xamarin Studio?
~~~
JonathonW
It's gotten better (especially recently), but Xamarin Studio tends to be
pretty buggy-- I've seen basic project management tasks (like adding/moving
files) crash the IDE; I've had syntax highlighting just stop working for files
that compile fine; UI editors are slow, don't match the feature sets of the
equivalent native editors, and occasionally break backwards compatibility with
themselves (seriously; had the iOS UI editor wipe out some of its own
generated code and break my build a couple weeks ago). Again, it's nowhere
near as bad as it was about a year ago, but it's still the weak point of
Xamarin's toolset (which is, otherwise, pretty great, at least for what we're
doing with it).
------
merb
Funny, I just wrote something about the C# IDE issue yesterday. And today they
announced the IDE.
God I'm happy. I'm using IntelliJ with Scala, Java, Python and sometimes
golang. I'm waiting for IntelliJ Rust and for the C# IDE. Promising languages
need promising tooling. This will help C# to spread around.
~~~
jinst8gmi
+1 for Rust. The Scala support was the big selling point for me with IntelliJ.
------
xiaoma
Does anyone here do C# work on a Mac outside of Unity? If so, what kind of
project is it?
~~~
pionar
Mobile work with Xamarin here. Using Xamarin Studio. Still, it's nowhere close
to the Windows experience with VS and the Xamarin tools there.
~~~
teebot
I can definitely relate to your experience. I've been a C# developer on
VS/Windows then moved to Xamarin/iOS dev on a Mac and it was ok but not
terrific.
------
sphildreth
I wonder how this compares to "Visual Studio Code" which is also cross-
platform and MIT licensed:
[https://code.visualstudio.com/](https://code.visualstudio.com/)
~~~
edgarjcfn
I wonder how it is that people don't know about this fantastic IDE and are
talking about Project Rider as if it's the first non-Windows C# IDE.
~~~
ybx
Because Visual Studio Code is more of a code/text editor than it is an IDE. It
doesn't compare to regular VS with R#.
------
jinushaun
This is exciting news. Ironically, it's almost unfathomable to imagine writing
C# code in Visual Studio without having ReSharper installed. Glad to see
ReSharper finally available cross platform!
~~~
UnoriginalGuy
What? Resharper is barely useful in Visual Studio 2015, and the memory
leak/lag/slowness/issues are tiresome. If I hadn't spent so much on Resharper
I'd uninstall it.
~~~
topbanana
Works well for me
~~~
UnoriginalGuy
I have 16GB of RAM on my work machine and if I leave two instances of Visual
Studio running the machine saturates after half a day.
Plus even they admit to the slowness[0]. But regardless I bet a certain core
demographic of Resharper users are never going to admit to any problems
(slowness/lag/slow startup/RAM usage/memory leaks).
[0]
[https://confluence.jetbrains.com/pages/viewpage.action?pageI...](https://confluence.jetbrains.com/pages/viewpage.action?pageId=37228482)
~~~
pbz
If this new IDE (which uses the same Resharper codebase) ends up being a lot
faster than VS then we'll know where the bottleneck is.
~~~
Someone1234
Well an IDE with less features _is_ going to be faster than VS, that isn't a
question. Just see Sublime Text, insanely fast until you start adding
extensions and it gradually gets slower and slower.
But Resharper is an absolute hog. For as slow as Visual Studio can be, it
quadruples it in all categories. Just look in this thread, you can see dozens
of people saying the same thing: Resharper has massive performance problems.
~~~
alextgordon
> Well an IDE with less features is going to be faster than VS, that isn't a
> question. Just see Sublime Text, insanely fast until you start adding
> extensions and it gradually gets slower and slower.
In principle, only a small proportion of features need to be actively
consuming cycles. Most require no resources until invoked by the user.
Only a small proportion of _those_ features are actively involved with drawing
and so need to block the UI thread sometimes.
And only a small proportion of _those_ features perform some essential drawing
operation and so need to block the UI thread every frame.
So while a badly designed IDE with fewer features will outperform a badly
designed IDE with many features, both will be trounced by a well-designed IDE
with many features. It's the design that matters.
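As a rough illustration of that design point (my own toy example, not code
from any actual IDE), a feature can cost nothing until invoked, do its heavy
work off the UI thread, and touch the UI thread only for the final drawing
step:
    using System;
    using System.Threading.Tasks;
    sealed class SquigglyFeature
    {
        // Lazy: nothing is allocated and no analysis runs until first use.
        private readonly Lazy<Analyzer> _analyzer =
            new Lazy<Analyzer>(() => new Analyzer());
        public async Task InvokeAsync(string documentText,
                                      Action<string[]> drawOnUiThread)
        {
            // Heavy work runs on a worker thread; the UI stays responsive.
            var issues = await Task.Run(() =>
                _analyzer.Value.FindIssues(documentText));
            // Only this small drawing step needs the UI thread; the caller
            // supplies the marshalling (e.g. a dispatcher).
            drawOnUiThread(issues);
        }
    }
    sealed class Analyzer
    {
        public string[] FindIssues(string text) =>
            text.Contains("TODO")
                ? new[] { "TODO left in code" }
                : Array.Empty<string>();
    }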
------
novaleaf
I love the features of Resharper, and every release I hope they solve the
"large project" performance problems, but time after time Resharper crashes my
IDE or makes IntelliSense unresponsive.
Maybe it's due to the visual studio plugin environment? I suppose trying Rider
out will show....
~~~
alkonaut
I have been using R# on a large (10k types, 100 projs) project for many years
without feeling any performance problems unless the files are thousands of
lines. I don't use the "full solution analysis" though. Unresponsiveness is
common for example when reloading projs after fetching new code from version
control, but I never attributed that to R#
Should add that this is a vanilla C# sln, there is no C++, no web projs etc.
Just C#. That could matter.
------
justncase80
Just because it seems some people aren't aware, Code from Microsoft is also a
pretty cool C# IDE. It's arguably way better than Visual Studio in a lot of
ways:
[https://code.visualstudio.com/](https://code.visualstudio.com/)
------
insulanian
Yay! I've been waiting for this for years. And if they would make an F# IDE,
that would be heaven.
------
jmartinpetersen
I guess this makes sense as a strategic move. Microsoft have beefed up VS 2015
with "Roslyn technology", which makes Resharper more of a commodity (albeit in
a class of its own). This move levels the field a bit by opening up other
flanks.
------
jinst8gmi
Is there any point to Visual Studio Code (not visual studio proper) once this
goes live?
~~~
willu
VS Code is free and open source, so I'd say yes.
------
ioab
This is really awesome news. I've been looking to write C# on Mac and none of
the existing solutions is satisfying enough. Hopefully "Rider" will be the
best companion for .NET on platforms other than Windows.
------
SonicSoul
great timing.
I feel like regular windows VS has been bogged down with so many features +
ReSharper that it's getting to be a challenge to keep it responsive. I've
already disabled a lot of nice features and extensions to be able to type
quickly. It's the worst feeling when IntelliSense freezes typing in C# for a
sub-second, or re-writes correctly written JavaScript, or keeps hanging when doing
any sort of web code. I am hoping this will be a hybrid of a lightweight editor
such as VS Code with the heavier refactoring features of ReSharper, and therefore
powerful yet very responsive. Can't wait to try it out.
------
fithisux
It is a pity they do not have a Dlang IDE, and deprecate C++ options.
------
bad_user
Is there an EAP available?
~~~
nalllar
> We’re aiming to open a private EAP in the coming weeks, towards the end of
> February. We’ll announce the signup form here on the blog, as well as on
> Twitter.
------
V-2
And thus SharpDevelop was killed (if it was ever much alive...)
------
diezge
Any chance of a Rust IDE in the future? _crosses fingers_
------
0xFFC
A full Go IDE would have been better in my experience.
~~~
thegenius2000
You should also consider vim (w. the vim-go plugin) if you're on Unix.
It's well-written, and along with several other vim plugins and a decent
configuration, it's miles ahead of any IDE (except emacs, not starting an
editor war here).
Read here, [http://farazdagi.com/blog/2015/vim-as-golang-
ide/](http://farazdagi.com/blog/2015/vim-as-golang-ide/)
~~~
jen20
I have found that vim-go is effectively unusable if you are editing files over
a few hundred lines long or on large projects. The time between :w and
regaining control of the editor is just too long. Much of this is probably due
to the execution time of the compiler for syntactic checking, and has increased
significantly in Go 1.5.
Colleagues tell me that neovim alleviates some of this, but during the time
before that was stable I switched to emacs (specifically Spacemacs) with evil
mode instead.
------
alkonaut
Are there any clauses in the Roslyn license that prevent third parties from
using it to write code that directly competes with Microsoft's own products
such as Visual Studio?
~~~
fgtx
they have their own compiler. See:
[https://blog.jetbrains.com/dotnet/2014/04/10/resharper-
and-r...](https://blog.jetbrains.com/dotnet/2014/04/10/resharper-and-roslyn-
qa/)
> Will ReSharper take advantage of Roslyn? The short answer to this
> tremendously popular question is, no, ReSharper will not use Roslyn. There
> are at least two major reasons behind this.
The first reason is the effort it would take, in terms of rewriting, testing
and stabilizing. We’ve been developing and evolving ReSharper for 10 years,
and we have a very successful platform for implementing our inspections and
refactorings. In many ways, Roslyn is very similar to the model we already
have for ReSharper: we build abstract syntax trees of the code and create a
semantic model for type resolution which we use to implement the many
inspections and refactorings. Replacing that much code would take an enormous
amount of time, and risk destabilizing currently working code. We’d rather
concentrate on the functionality we want to add or optimize, rather than spend
the next release cycle reimplementing what we’ve already got working.
The second reason is architectural. Many things that ReSharper does cannot be
supported with Roslyn, as they’re too dependent on concepts in our own code
model. Examples of these features include Solution-Wide Error Analysis, code
inspections requiring fast lookup of inheritors, and code inspections that
require having the “big picture” such as finding unused public classes. In
cases where Roslyn does provide suitable core APIs, they don’t provide the
benefit of having years of optimization behind them: say, finding all derived
types of a given type in Roslyn implies enumerating through all classes and
checking whether each of them is derived. On the ReSharper side, this
functionality belongs to the core and is highly optimized.
The code model underlying ReSharper features is conceptually different from
Roslyn’s code model. This is highlighted by drastically different approaches
to processing and updating syntax trees. In contrast to ReSharper, Roslyn
syntax trees are immutable, meaning that a new tree is built for every change.
Another core difference is that Roslyn covers exactly two languages, C# and
VB.NET, whereas ReSharper architecture is multilingual, supporting cross-
language references and non-trivial language mixtures such as Razor. Moreover,
ReSharper provides an internal feature framework that streamlines consistent
feature coverage for each new supported language. This is something that
Roslyn doesn’t have by definition.
~~~
mariusmg
They have a parser, not an entire compiler.
------
tobz
How long is it before JetBrains comes out with something more like Eclipse or
Atom?
You have IntelliJ, PyCharm, RubyMine, CLion, and now this. That's excluding
plugins for other IDEs. All of their IDEs basically look and feel the same.
It seems like a ripe opportunity to make a single, extensible IDE that isn't
quite so ugly as Eclipse but isn't quite so barebones as Atom that then just
has different chunks of support for specific languages, potentially with
individual licensing, etc.
Maybe that doesn't further their business goals but it sure would be nice to
get the IntelliJ treatment whether I'm writing Java, Ruby, Python, C#, C++,
etc.
~~~
winterbe
It already exists and is called the Intellij Platform. It's open source and
the base for all Jetbrains IDEs, see:
* [http://www.jetbrains.org/pages/viewpage.action?pageId=983889](http://www.jetbrains.org/pages/viewpage.action?pageId=983889)
* [https://github.com/JetBrains/intellij-community/tree/master/...](https://github.com/JetBrains/intellij-community/tree/master/platform)
~~~
noir_lord
Since 15 it also works damn near flawlessly for me on Ultimate, they also seem
to be faster on getting the plugins up to date with the individual IDEs, which
is a major win for me.
Current project I'm using PHP and Python for different parts and it is
absolutely flawless, also if anyone hasn't used them Facets and Aspects are
incredible!
The Perils of Peak Attention - benbreen
https://newrepublic.com/article/137107/perils-peak-attention
======
sotojuan
I've been experimenting with using my phone as little as possible. When I'm
out, I put it in my backpack or pocket on silent. At the very most, I'll use
it for music. I try really hard to not pull it out to check HN or others when
I'm bored. I also now check Twitter at most twice a week, for less than twenty
minutes.
It's too early to be confident in the results, but I do feel more "peaceful"
and able to focus, if that makes sense. I'm trying to extend this to my
computer and not get distracted by opening up new tabs every time I get a
break from thinking or coding.
For those interested, the inspiration for this experiment was Cal Newport's
book "Deep Work".
------
apatters
The chief concern I have about being "submerged under the waves" of the 'net,
as this article puts it, is that we will lose the lone creative genius.
Hard to have a lot of truly original thoughts when you spend every waking
moment chatting with someone, instagramming everything you experience etc.
Design by committee is frequently inferior to having one expert with a strong
vision sit down in relative isolation and create. But it's getting harder and
harder to be isolated. We're raising an entire generation that doesn't even
understand the virtue of that concept.
------
zerognowl
In a world where your Rushkoffs[1] go unread, we are spiraling into some sort
of local maxima where social media is realized for what it is, and the
problems associated with social media are epidemic.
The crux of the issue lies in the fact that nobody knows what social media is,
or indeed cyber. "Cyber" as it stands now is some far off place, in a William
Gibson fantasy, but in fact operates in the world seemingly unnoticed by the
smartphone-equipped masses.
As I said, it's not long until people realize they've been played and their
eyeball hours and data exhausts are being sold to the highest bidder for hard
cash. It makes me wonder why smartphones even cost so much. Surely they should
be 'free' given how much data can be gleaned from a smartphone owner?
[1]
[https://en.wikipedia.org/wiki/Douglas_Rushkoff](https://en.wikipedia.org/wiki/Douglas_Rushkoff)
------
gaus
Great read. I often compensate in life by turning up the intensity, finding
new things to interest me, etc. So, it's refreshing to hear a justification
for learning to enjoy moments of thoughtlessness and solitude when they come
about.
~~~
roflmyeggo
Agreed. It's refreshing to read a justification to learn to enjoy such
moments.
This is especially true in a society that tends to only place value on
thoughts that directly lead to tangible means of moving forward. Sometimes,
however, these moments of thoughtlessness can complement the aforementioned
moments of intensity. This combination can actually lead to an overall net
gain in achieving one's goals, whatever they may be.
A tough concept for the prototypical Type A personality to grapple with, no
doubt. But something that is worth pondering.
MixPanel Platform - pelle
http://mixpanel.com/api/docs/guides/platform
======
sahillavingia
What a brilliant idea. What do people think of this UI switch-up (from an
amateur designer trying to get better)?
[http://f.cl.ly/items/88b19cf0a0964af0e53d/Screen%20shot%2020...](http://f.cl.ly/items/88b19cf0a0964af0e53d/Screen%20shot%202010-10-14%20at%205.16.53%20PM.png)
~~~
immad
Agreed. The current funnel is hard to read in the summary page
------
robfitz
Beautiful. The end-user-facing charts would have saved me lots of hours...
~~~
trefn
More charts: <http://mixpanel.com/api/docs/guides/platform-ui>
------
pyronicide
I've been using mixpanel's platform for a couple weeks now and have been super
pleased with it overall. Being able to deliver stats to my end users has been
a big success.
~~~
kqueue
what's up? :P
------
forcer
In comparison with Clicky the pricing is very expensive to track actions on
high traffic websites. Probably good for startups in subscription space that
have lower traffic / high revenue per user though.
------
powdahound
Good stuff! I need to find time to make the switch to MixPanel.
------
Vee
i just got MixPanel patched into analytical
<http://github.com/jkrall/analytical>
------
judegomila
Great idea.
The Acquisition King: Reckitt Benckiser - escapist16
https://aalokbhatawadekar.in/2020/07/08/reckitt-benckiser/
======
escapist16
Reckitt Benckiser, or RB as we know it today, owes a great amount of its growth
to the outstanding acquisitions that it has made in the past. The author has
listed many such acquisitions and the resulting product lines that have helped
RB enhance its portfolio of products.
Why Spotify Will Kill iTunes - jjhageman
http://www.businessweek.com/management/why-spotify-will-kill-itunes-07222011.html
======
kylec
I think his argument is pretty weak. Sure, for tech-savvy people that buy a
lot of music, going with the flat pricing of Spotify makes sense, as does the
convenience of not having to manage and sync your music files between devices.
However, a very large number of people spend less than $10/month on music, or
don't have the desire to completely switch over to a new music service.
There's also the hesitation to spend the effort switching to the new service
that will disappear once you stop paying them whatever they demand. Netflix
has recently demonstrated that the pricing for media streaming has the ability
to change rapidly and sharply, leaving you no recourse other than to either
pay or lose access to the music and your playlists.
------
tzs
No Beatles.
No Bob Dylan.
No Pink Floyd.
I'm certainly enjoying Spotify, but it has some big gaps that need to be
closed.
~~~
pinko
Wow, really? MOG has Dylan and Floyd (but obviously no Beatles). I assumed
their catalogs (and anyone's other than Apple's) would be pretty similar.
What explains the very different licensing these different streaming music
apps are able to get? Is it just their negotiating (in)ability?
------
gallerytungsten
The article seemed a bit tedious and overwrought. All that aside, for me it
raised the question, "how do the musicians get paid?"
As a musician, this is a matter of great interest. The answer: "Spotify shares
it 50/50" according to the first article I found that answered the question.
Last time I checked, Apple was paying 70%; however, if you're not on a major
label, you'll likely get to iTunes through a third party such as CDbaby; they
pay 91% of that 70%. Even so, 63.7% is better than 50%.
Addendum: as the article I mentioned describes, it may actually be worse than
that, because payments aren't on a track basis, but rather more like the
airplay payments that BMI and ASCAP pay (in which you only get paid if you
have lot of spins).
[http://gramtone.wordpress.com/2009/03/20/spotify-a-loss-
for-...](http://gramtone.wordpress.com/2009/03/20/spotify-a-loss-for-
musicians/)
~~~
lurch_mojoff
How much do musicians get paid for Spotify streaming? Well, this much:
[http://www.informationisbeautiful.net/2010/how-much-do-
music...](http://www.informationisbeautiful.net/2010/how-much-do-music-
artists-earn-online/)
------
shoota
If that is the case, why hasn't Rhapsody, which has been around for a long
time, already killed iTunes?
------
bentruyman
Sort of off-topic, but does anyone know why the hype around Spotify is so
great?
A few years ago, I started using Napster's streaming service. Then moved to
Rhapsody for a slightly better interface/collection and mobile offline
caching. Then to MOG, and eventually Rdio.
Now Spotify comes out, and while I also moved to it, about 30 of my
coworkers/friends are now on it using it as their first streaming service.
Maybe the invite-only access and alleged legal troubles helped in their favor?
I don't get it.
~~~
programminggeek
Spotify has slowly been building buzz in Europe for a long time and they also
did a huge PR push all at once.
Also, they are artificially limiting supply of invites to make more people
sign up for the paid service. Scarcity creates value. People want what they
can't have.
The other thing is unlike Mog or Rdio, since Spotify came to the US after
years of development, it already has loads of polish and solid mobile apps.
AKA, their v1 product is actually more of a v2 or v3, so when people use it,
it feels more impressive.
------
wccrawford
Hogwash. iTunes isn't just a place to listen to music. It's a place to buy it,
retrieve it, find it, and more. It has better selection and is backed by a
bigger company, with more weight to throw around when record companies try to
get pushy.
------
roc
iTunes is used, overwhelmingly, to deliver music _to mobile devices_. And
streaming is facing a number of serious barriers for that kind of mobile use:
battery drain, network coverage, bandwidth caps and the high price of data
plans (with sufficient caps).
So, almost regardless of individual quality, streaming services are not
replacing iTunes any time soon. Almost certainly not in the US and likely not
anywhere else.
They definitely have a market, particularly in competing with terrestrial and
satellite radio. But streaming is simply not a good fit to do what iTunes is
used to do.
~~~
Hyena
Spotify lets you download songs into iPods and phones for offline streaming.
This is an obsolete complaint.
~~~
lurch_mojoff
"Offline streaming", right. You can synch only music you have bought from
Spotify. It does not work with the "all you can eat" subscription. So it is
very much not an obsolete complaint.
~~~
Hyena
"Offline streaming" is kind of a kludge term. To me, if I don't own the songs
and they're not immediately accessible to my other software, "syncing" seems
slightly inaccurate connotatively. So I went with "offline streaming" because
it seems to connote what is going on: localizing my stream, very local.
I haven't had a problem with the Premium service, so I'm not sure which one
you're describing as AYCE.
------
joejohnson
I don't know if I believe that Spotify will really be able to kill iTunes, but
perhaps it will force iTunes to lower their prices or to offer streaming
subscriptions?
------
aberkowitz
Apple can enter the streaming music market whenever they want to; that's why
they bought Lala.
------
programminggeek
iTunes isn't going anywhere because it's tied to iPod, iPhone, Apple TV, iPad,
Mac and it's where you buy Music, Videos, Books, Apps, Audiobooks, Podcasts,
and um....yeah.
Also, I love music and Spotify is pretty great, but so was Mog, Pandora,
Napster, Zune, and many other music services, but none of them have dented
iTunes, if anything quite the opposite.
Spotify is great relatively cheap way to sample lots of music you may or may
not like, but if I want to keep a song or album, I'm going to buy it on iTunes
or Amazon MP3 so I have a backup copy of that music "forever".
Spotify is likely a bigger competitor to Pandora and traditional radio than it
is to buying music in general. People like owning things more than they like
renting things.
Hoarding is human nature.
~~~
Hyena
That's probably why Spotify looks almost precisely like iTunes and syncs to
un-networked mobile devices.
Australia's Home Affairs department proposes making telcos retain MAC addresses - pdemporg
https://www.zdnet.com/article/home-affairs-floats-making-telcos-retain-mac-addresses-and-port-numbers/
======
pdemporg
The fact that most MAC addresses are not visible to the ISP behind NAT anyway
is not the point, given the Department is not renowned for doing its homework.
The real concern here is the insight provided by this submission into the sort
of thinking that is clearly going on at the most powerful ministerial
department in the country. These people don't understand the technology, and
fundamentally do not care.
------
flukus
We should be cycling MAC addresses regularly anyway since they leak
information: [https://www.crc.id.au/tracking-people-via-wifi-even-when-
not...](https://www.crc.id.au/tracking-people-via-wifi-even-when-not-
connected/)
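For what it's worth, here is a small sketch (my own illustration, not taken
from any OS tool) of how a randomized address is usually formed: keep it
unicast and mark it locally administered so it does not collide with
vendor-assigned space:
    using System;
    using System.Linq;
    static class RandomMac
    {
        public static string Next()
        {
            var bytes = new byte[6];
            new Random().NextBytes(bytes);
            // First octet: set the "locally administered" bit (0x02) and
            // clear the "multicast" bit (0x01) so the address stays unicast.
            bytes[0] = (byte)((bytes[0] | 0x02) & 0xFE);
            return string.Join(":", bytes.Select(b => b.ToString("x2")));
        }
    }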
------
auslander
That's why I'm using my own router (an x86 mini PC with OPNsense) on the NBN
instead of the ISP-provided one. ISP-provided routers are under their full control;
you don't have the superuser password. That, and a VPN enabled all the time on all
my devices.
StumbleUpon’s Su.pr is Really... Super - muimui
http://www.brentcsutoras.com/2009/06/09/stumbleupons-supr-super/
======
noodle
too bad that its going to probably bury the recently released
<http://peashootapp.com/> which is pretty similar.
~~~
pantsd
except that costs money
Six Days in North Korea - tswartz
http://www.nytimes.com/interactive/2015/06/10/world/asia/north-korea-photos-video.html?_r=0
======
Zirro
For those who would like to follow North Korea on a more than casual basis,
these sites are likely to be of interest:
* NK News ([http://nknews.org](http://nknews.org))
* DailyNK ([https://www.dailynk.com/english/index.php](https://www.dailynk.com/english/index.php))
And perhaps especially since this is HN:
* North Korea Tech ([https://www.northkoreatech.org](https://www.northkoreatech.org))
~~~
dominotw
[https://www.facebook.com/dprk360](https://www.facebook.com/dprk360)
~~~
yakshemash
[http://38north.org/](http://38north.org/)
------
colinbartlett
NY Times team did a great job of making this responsive. It works really well
on mobile.
That said, this is best viewed on desktop on a huge monitor. It looks
gorgeous. Really, beautiful job to their front end team, consistently
delivering this kind of experience.
~~~
adolfojp
And yet the video was awful in every way. It cut too fast, not allowing us to
fully appreciate what was going on, and it had no playback controls so if you
missed anything you have to wait for the entire thing to loop back up. Did you
see the blue box that trapped the foot? I don't know what that was.
~~~
ubersync
It covers your shoe with a polythene shoe cover. Mostly used in hospitals.
------
atemerev
You do realize that everything in these pictures is thoroughly and carefully
pre-selected and even engineered for display purposes, right?
(I was born in the Soviet Union. It is mostly unknown for Western people how
much propaganda matters in totalitarian countries, and how little it has to do
with reality).
~~~
EliRivers
Having been there on holiday myself, the only way to ensure that everything is
pre-selected and engineered would be to effectively shut down the entire city
(indeed, every city and town I went to), and create a kind of bubble around
each of our buses (and, when we switched to foot, around each pocket of
walking tourists); fake cities filled with actors, all standing and doing
nothing until someone shouts "The bus is almost visible" at which point they
break into action like a pack of extras on a Hollywood set, pretending to be
walking somewhere as the bus passes, only to stop as soon as it's out of
sight. Everyone's life on permanent hold, just waiting to show the tourists
what it looks like when people walk on the pavement. What glory for the
Fatherland that is!
To say that "everything in these pictures is thoroughly and carefully pre-
selected and even engineered" is simply not true.
~~~
drzaiusapelord
>effectively shut down the entire city
So.. you guys were carefully corralled in the city? N Korea is mostly rural,
with most of its population barely able to get enough calories to eat.
Pyongyang is a propagandist show-room and is full of connected party types.
Your average N Korean doesn't live that lifestyle. This is all fairly well
documented by various sources, mostly via the oral histories of defectors:
[http://www.amazon.com/Nothing-Envy-Ordinary-Lives-
North/dp/0...](http://www.amazon.com/Nothing-Envy-Ordinary-Lives-
North/dp/0385523912)
[http://www.amazon.com/Escape-Camp-14-Remarkable-
Odyssey/dp/0...](http://www.amazon.com/Escape-Camp-14-Remarkable-
Odyssey/dp/0143122916)
If you care to discredit these works, I will argue with you on a point to
point basis. I have read both and other sources that describe the regime and
everyday life in North Korea. Perhaps we can also get Dennis Rodman in here
too but clearly you both are on the same page.
Sadly, I find North Korea cheerleading often to be a biased political position
tied to the usual pro-China HN crowd that finds fault with everything the
US/EU do, thus anything the US/EU are critical of, must be good and any
criticism of these regimes must be "western propaganda."
~~~
EliRivers
_So.. you guys were carefully corralled in the city?_
No. We travelled through rural regions as well; small towns and so forth. I
saw (but did not go into) numerous very small villages. Those were not much
more than a collection of buildings, generally with a dirt track leading to it
from the road we were on.
In the city, whilst walking with our two guides, the distance between the
front walker and the rear walker was on the order of a hundred metres at
times. When I went there, it was a significant national holiday. Some of the
exhibition centres we went to had literally thousands of people flowing
through them while we were there. We were in the crowds, mixing in with the
people attending. Huge flows of people. At a military parade, we stepped off
the bus and stood with the crowd to watch. On the underground, we stepped onto
the carriages with the locals who were waiting for the train. A funfair in
Pyongyang; huge, huge queues outside and inside. Enormous numbers of people
milling about and again, our group wandered through them.
I saw numerous signs of deprivation and malnutrition. I saw extensive evidence
of a massive solid fuel shortage in recent years. I saw a lot of things that
made it clear that the nation is, in many many ways, truly grotesque. I also
saw an upright stuffed alligator on wheels holding a drinks tray that David
Koresh of Waco fame had sent as a gift.
I would hazard (but of course do not know) that I know more about the subject
than you do (both before and after travelling there I read an enormous amount
of the available literature and various collections of conference
proceedings), and I have the additional evidence of my own eyes.
The western media LOVES to spin stories about the DPRK, and the western public
LOVES to read them. There is a lot - a LOT - wrong with the DPRK, but the
media seem to be able to write pretty much anything they like and people want
to believe it. You know what, go there and see it for yourself. Run in the
Pyongyang marathon. Do a cycling tour.
~~~
drzaiusapelord
>I would hazard (but of course do not know) that I know more about the subject
than you do
I have read two books from defectors, who I imagine had more personal
involvement and first-hand experience than some western tourist who fancies
himself an expert because he went cycling there once.
Tourism isn't the wonderful learning experience you make it out to be. It's, by
its nature, 100% artificial. You have no skin in this game. You can walk out
at any time and your money is coveted so you get treated and handled a certain
way compared to a local. Tourism doesn't teach you jack shit about the country
you're visiting and I say this as someone with two passports who visits his
parent's native country frequently. I am not an expert on that country even
though speak the language, have family there, etc. I just dont have skin in
the game and when I'm there I'm treated like a tourist - not a local. I dont
ever have to interview for a job there, deal with the authorities really, pay
taxes, deal with the local mafioso/crooked cops/bad government past a
superficial level, deal with any of the everyday grind outside of tourist
framework, serve in their military, etc.
I wish more North Korean apologists understood this. Instead we have an online
army of Dennis Rodmans trivializing the human rights catastrophe in North
Korea because the food they were served was okay and the locals all smiled as
you drove past in your air conditioned bus.
~~~
crdoconnor
I'd be skeptical of some of the more outlandish defector testimonies. They
have a strong incentive to exaggerate and invent stories, and owing to the
relative seclusion of North Korea, it's pretty easy to not get called on it.
Many of the defector testimonies also comes from the mid 90s, when North Korea
was undergoing a famine, and while that famine was _really_ severe, it was
still 20 years ago and their malnutrition levels have plunged precipitously
since. If it were still that bad today, probably the regime would have
collapsed already.
Tourism in the country is very controlled, but it's not 100%. Kernels of truth
will slip through. Honestly, the propaganda and the truth is quite easy to
distinguish, too - their propaganda is very unsophisticated and pretty blunt
and not every tour guide is a mindless brainwashed automaton.
~~~
drzaiusapelord
>They have a strong incentive to exaggerate and invent stories, and owing to
the relative seclusion of North Korea, it's pretty easy to not get called on
it.
So anything that contradicts North Korea state propaganda must be untrue? Oh,
ok.
~~~
crdoconnor
You know, you could just swap the phrase "North Korea state propaganda" with
"American impeeeerialists" (you must draw out the e) and you'd sound exactly
of like one of them.
------
gambiting
I was always fascinated by cars in North Korea - what brands do they use? How
do they get there? What do you have to do to drive one? Exactly how uncommon
are they? Who fixes them, and how do they know how? If the supreme leader is
driven in a Mercedes, how do they get the parts for it?
I come from Poland where during communist times cars were really hard to come
by, you had to wait 10 years on an official list to be allowed to buy one, but
people always managed to get by somehow. Most models were of course of Soviet
production, but an odd western car would sometimes appear, imported through
friends of friends of someone in the party. Obviously NK is completely
different, but I still find it very interesting, and it's hard to find any
information about this anywhere.
~~~
Symbiote
An article from 2014 suggests the government use at least one Mercedes, and
probably bought it via Russia or China. I'd guess they just drive it over the
border...
Maybe parts don't count as "luxury vehicles", in which case they can buy them
from Europe. Alternatively, China/Russia.
[http://www.telegraph.co.uk/news/worldnews/asia/northkorea/11...](http://www.telegraph.co.uk/news/worldnews/asia/northkorea/11013338/Mercedes-
limousines-spotted-in-North-Korea-could-breach-UN-sanctions.html)
~~~
ams6110
China produces compatible parts for almost every car, including Mercedes.
------
coldcode
The empty freeway from the window was the killer shot for me. Imagine if you
showed the average citizen there a picture of LA or even Beijing during rush
hour.
~~~
tswartz
Yes, its so interesting to see those types of pictures. I'd be fascinated to
hear what North Koreans think of the freeway system in their country since few
people use it.
------
jeffzilla
Hey, I've been there!
I ran my fastest marathon ever while being tailed by the paddy wagon, suited
up and bowed down before the embalmed body of Kim Jong Il, locked and loaded
to shoot at chickens, and tripped out in psychedelic halls of mirrors in the
DPRK - North Korea.
Here's how it all went down...
[http://jeffreydonenfeld.com/blog/2015/06/exploring-north-
kor...](http://jeffreydonenfeld.com/blog/2015/06/exploring-north-korea-and-
running-the-pyongyang-marathon/)
------
mrcactu5
I am impressed that Pyongyang appears in Google maps
[https://www.google.com.pr/maps/place/Ryugyong+Hotel,+Pyongya...](https://www.google.com.pr/maps/place/Ryugyong+Hotel,+Pyongyang/@39.0338528,125.6923056,13z/data=!4m2!3m1!1s0x357e1d28b1b75405:0x283ff682d04744d6)
I was looking for the giant Ryugyong Hotel in the center of town.
~~~
yitchelle
Clicking on some of the landmarks around the hotel, they have some Google
reviews. The reviews for the Pyongyang Arena makes interesting reading.
[https://www.google.com.pr/search?q=Pyongyang+Arena,+Pyongyan...](https://www.google.com.pr/search?q=Pyongyang+Arena,+Pyongyang,+Corea+del+Norte&ludocid=9068691185340548019#lrd=0x357e1d31fa7eca45:0x7dda7695c7f083b3,1)
Ask HN: Making money during college? - quizbiz
What did you do to bring in some income during college? I plan to take studies seriously but I also need to bring in some cash.
======
matttah
If you have HTML/CSS and basic website skills you can get a fair amount of
money from Craigslist very easily. Additionally, see if your university web
department needs a webmaster, generally they allow you to work from home and
the pay is higher than other jobs on campus due to the need for technical
skills.
------
patio11
Campus work study when I was still a student, then got hired on by a pair of
professors to assist with their research after graduation. It was a wonderful
experience -- after the summer they wanted me to come be a grad student in
their lab for $20k a year, I sort of had other aspirations in life, we parted
ways amicably.
Failing that there is always retail/food service/etc, but working on campus
has a lot to recommend it over these other options. It pays better, is more
convenient, has more potential to result in something with resume power, etc.
~~~
TimMontague
Having a job doing research on campus gives excellent experience and looks
great on a resume. It also helps when you need a professor to write a letter
of recommendation if they know you outside of the classroom environment.
Assuming you are going to a research university, there should be no shortage
of opportunities. Your school website should have a list of professors along
with what areas they are researching, pick something that sounds like fun and
go an ask.
------
mnemonik
Right now I am employed by University Residences at Western Washington
University building web apps while I work towards my degree. UR is only one of
many departments that has a need for tech staff, I'm sure there are tons more
job openings than you realize. I was offered a job at the Financial Aid office
after I made a simple scraper that took their student job openings and made a
UI with better usability. This is an example of the Do What You Love (and
money will follow later) idea that is spread around here so often.
My advice is to have fun hacking away at as many little projects as possible.
Maybe one will get someone else's attention who is in a position to help you
out (like what happened with me, but I already am employed by the university).
Or maybe you can figure out how to capitalize on one of your little projects.
If all else fails you're always building your resume for when you apply
around.
------
jobeirne
TA work: $9/hr for menial labor. Can get you ins with professors for research.
Research: degree of pay varies widely. Our math department offers $12k for a
year's worth of undergraduate research. I'd assume yours has something
similar.
Private sector: Last I was getting $18/hr for JSP work, but I found the Wash.
DC IT scene to be relatively soul-crushing. Also risks interfering with your
studies because in most cases it drags you off campus.
If you're really hungry for cash, sell some stuff from home on Amazon.
------
jjguy
I got a job with the hospital on campus, doing custom win32 app development.
Nothing huge, just business process facilitation with some windows-native glue
apps.
Under market consulting rates, above typical college rates, I walked to the
office and set my own hours. Everyone wins. It was hard to beat.
------
hbien
Work study job at the school's help desk. Best job ever - tax free and a lot
of free time during the job to finish projects.
Also, my dorm room happened to be right next to the building I worked in.
------
seanharper
Supervise a computer lab; that was the best gig ever. I made $12/hour (8
years ago) and could do homework and write code while I was working. It was a
lot of fun too.
------
il
Check out affiliate marketing. If you're any good at coding, you can make a
lot of money (4-5 figures a month) doing affiliate marketing PPC/SEO part time.
------
carterschonwald
There tend to be interesting jobs if you look long enough. E.g. I'm doing some
computational game theory for a Poli Sci prof part time over the summer and
perhaps beyond. Fun stuff
------
Femur
Donate Plasma or Sperm.
------
tjpick
tutoring in the computer lab for 159.101 and 102
------
bryanalves
Poker is an easy way to make a lot of money
~~~
varaon
Poker is an easy way to lose a lot of money if you're not careful
YIMBY Hackathon - sutro
https://yimbyaction.org/hackathon/
======
livingparadox
More information on the group hosting the hackathon:
[https://yimbyaction.org/about/](https://yimbyaction.org/about/)
It looks like it's intended to work on anti-NIMBY (Not In My BackYard)
solutions; so YIMBY stands for "Yes In My Backyard".
Challenges and Trade-offs in Building a Web-scale Real-time Analytics System - benblack
http://perspectives.mvdirona.com/2011/02/11/ChallengesAndTradeoffsInBuildingAWebscaleRealtimeAnalyticsSystem.aspx
======
barista
Impressive. We were building a high dimensional cube just last year for some
click stream analysis. Wish I had done some research in the literature at that
time.
Overseas outsourcing and intellectual property - andrewtbham
how do you cover your liability if you use overseas sub-contractors with regard to non-competes, intellectual property, ndas etc. especially if they are not in the US. How can you enforce them? I am going to talk with a lawyer, but wondered if anyone had any advice.
======
sfgary1
You cannot, IMO. Even if you outsource to a sub-contractor in a country with a
strong IP regime it is probably not worth your time,money and effort to pursue
infringement. The best way to protect your IP with sub-contractors is to not
give them access to your IP.
~~~
andrewtbham
Thanks for the feedback... are you suggesting I give them access to some
source code but not the complete code? Or just not use people overseas at
all.
------
paulsingh
In general, stick to a service like odesk or elance -- they both have built-in
NDA/IP stuff that assigns everything to you.
Fracking in North America could be partly to blame for methane spike – study - colinprince
https://www.newsweek.com/fracking-u-s-canada-worldwide-atmospheric-methane-spike-1454205
======
btilly
[https://www.wired.com/story/atmospheric-methane-levels-
are-g...](https://www.wired.com/story/atmospheric-methane-levels-are-going-up-
and-no-one-knows-why/) is a somewhat older but a much more informative and
balanced article.
The tl;dr version is that there is a minority that blames fracking, but most
have concluded that it doesn't fit the data. There are a bunch of other
theories as well. And the isotopes are not definitive. The fact that someone
recently wrote an article concluding otherwise doesn't actually change this
since neither the data nor the line of argument is particularly new.
------
peter303
The irony of natural gas is that while it emits half the carbon of coal or
gasoline per the same amount of energy released, methane is a 20 times more potent
greenhouse gas than carbon dioxide. So a 3% leakage pretty much erases its
greenhouse savings. No fuel company wants to leak and waste that much
methane, but stopping leakage is not easy.
One of the few facts the President got right is that the US has reduced carbon
emissions more than any other country, by converting from coal electricity to
natural gas electricity. That is mainly because the US was the highest or second
highest carbon emitter for decades, with any fractional decrease having impact.
The decrease has pretty much stopped with coal plant closures slowing.
Environmentalists see natural gas as a transition phase to full renewables.
~~~
masklinn
> methane is a 20 times more potent greenhouse gas than carbon dioxide
The 100-year GWP of methane is ~30, but only because it has such a short
atmospheric lifetime (~12 years).
Over shorter terms it's way worse; the 20-year GWP is >80.
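To put the parent's 3% figure against those GWP values, here is a rough
back-of-the-envelope sketch (the emission factors are my own assumptions, not
from the article: ~95 kg CO2/GJ for coal, ~55.5 MJ and 2.75 kg of CO2 per kg of
CH4 burned):

    # Back-of-the-envelope only; all input numbers are rough assumptions.
    energy_per_kg_ch4 = 0.0555           # GJ released by burning 1 kg of CH4
    co2_from_burning = 44.0 / 16.0       # kg CO2 per kg CH4 (CH4 + 2O2 -> CO2 + 2H2O)
    co2_from_coal = 95.0 * energy_per_kg_ch4   # kg CO2 from coal for the same energy
    co2_saved = co2_from_coal - co2_from_burning   # ~2.5 kg CO2 saved per kg of gas burned

    leak_rate = 0.03
    leaked_ch4 = leak_rate / (1 - leak_rate)       # kg CH4 leaked per kg burned
    for horizon, gwp in [("100-year", 30), ("20-year", 84)]:
        print(horizon, round(leaked_ch4 * gwp, 2), "kg CO2e leaked vs",
              round(co2_saved, 2), "kg CO2 saved")

On those assumed numbers a 3% leak roughly cancels the savings over a 20-year
horizon (~2.6 vs ~2.5 kg) but only erases about a third of them over 100 years,
which is why the choice of GWP horizon matters so much here.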
~~~
Retric
The other way of looking at this is how much Methane is removed from the
atmosphere every year vs how much is added. In that context, many continuing
sources of emissions have negligible net impact year to year.
This is why methane concentrations were basically flat from 1998 to 2008.
[https://en.m.wikipedia.org/wiki/Atmospheric_methane](https://en.m.wikipedia.org/wiki/Atmospheric_methane)
Methane does however turn into CO2, but represents a small percentage of
overall CO2 emissions.
~~~
aoeusnth1
No, that doesn’t seem like a useful way to look at it.
The old methane breaks down into CO2 regardless of our current emissions. Our
choices now affect future emissions of methane, the effects of which are
essentially independent of the past methane release.
Finding a consistent yardstick (methane = 30x CO2) seems better than
pretending it’s not a problem just because there’s some delayed negative
feedback loops already in play.
~~~
Retric
The issue is we already experience climate based on the current methane
concentration. If the addition is exactly equal to the removal, we don’t get
climate change from methane.
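As a minimal sketch of that point (a one-box model with made-up numbers and a
~12-year lifetime; illustrative only, not a real atmospheric model):

    # dC/dt = E - C/tau: concentration stays flat while addition equals removal.
    tau = 12.0                   # rough atmospheric lifetime of CH4, years
    C = 1700.0                   # ppb, roughly the late-1990s plateau
    E = C / tau                  # emissions exactly balancing removal
    for year in range(50):
        if year == 25:
            E *= 1.05            # a one-time 5% step up in emissions...
        C += E - C / tau
    print(round(C))              # ...prints ~1775, climbing toward E * tau = 1785 ppb

Constant emissions give a flat concentration; only a change in the emission
rate (or the removal rate) moves the plateau.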
------
acidburnNSA
Fracking is the primary reason a bunch of *nuclear power plants in the USA are
shutting down early. The over-supply of gas from fracking, plus the inherent
simplicity and efficiency of gas turbines makes them crazy hard to compete
with. The carbon-free nuclear plants weren't overly expensive to operate, but
when the electricity revenue dropped (especially in deregulated markets), they
started struggling mightily.
Nukes produce 60% of the carbon-free energy in the USA. As they shut down,
they're often replaced with natural gas, which locks in carbon emissions and
indirect emissions from pipelines and the wellheads.
It's on the nuclear people to figure out how to get operating costs safely
down. But boy howdy it'd be nice if their carbon-free nature could be valued
higher, or if fracking could be made more expensive.
This is also a big part of why coal is struggling, but this doesn't bother me
because coal is high carbon and high in killer air pollution (especially
outside the US. Air pollution kills 4 million people/year globally [1]).
[1]
[https://www.who.int/airpollution/en/](https://www.who.int/airpollution/en/)
~~~
aldoushuxley001
I've never heard anyone refer to nuclear power plants as simply "nukes"; just
a heads up that that's very confusing as most people will think of "nukes" as
nuclear warheads instead.
~~~
acidburnNSA
Thanks. In the industry it's common but glad to hear this feedback. I edited
above.
~~~
aldoushuxley001
Absolutely and thanks, I didn't realize it was so common, but sounds like it
is very widely used within the industry.
------
Mengkudulangsat
I've read about an unorthodox solution [1]. Can anyone comment whether these
are actually practical in the field?
[1][https://www.upstreamdata.ca/post/natural-gas-venting-how-
bit...](https://www.upstreamdata.ca/post/natural-gas-venting-how-bitcoin-
solved-a-160-year-old-problem)
------
known
Although Methane is far less abundant than carbon dioxide and stays in the air
for only a decade or so, molecule for molecule its warming effect (calculated
over 100 years) is 25 times higher
[https://archive.is/vThtr](https://archive.is/vThtr)
~~~
tremon
Ahh, but what happens to methane after that decade or so? Doesn't most of it
decompose via oxygen into water and carbon dioxide?
------
bbojan
Fracking also releases radioactive radon from the ground -
[https://www.sciencedaily.com/releases/2019/06/190618083347.h...](https://www.sciencedaily.com/releases/2019/06/190618083347.htm)
However, as regulations consider it "natural" (it wasn't man-made, although it
was released as a result of human activity), frackers are not required to do
anything about it.
So we have a repeat of the tragedy of diluted pollution - as long as your
pollution is invisible and evenly distributed (like radioactive waste from
coal plants), nobody cares. But concentrate it instead of releasing it to the
environment, and there's public outrage (high-level nuclear waste).
------
londons_explore
I have a hypothesis that fracking sometimes causes impermeable rock layers to
become permeable, and dissolved methane ends up in water underground. Over
years, that water ends up in tapwater/rivers/oceans where it isn't under
pressure, and the methane comes out of solution.
Anyone who knows the field better able to comment on my hypothesis?
------
ptah
Is this widely accepted? If so, why is there no G7 outcry like for the Amazon
rainforests?
------
mogadsheu
The guys over at GHGSat in Montreal are doing great work. They take satellite
imagery of methane releasing sites and model their output. Not perfect but for
a broad estimate it’s not bad.
When I worked in energy we took part in an investment vehicle called the OGCI
to invest in them.
------
AcerbicZero
Maybe I've been listening to too many JRE podcasts, but at this point I think
I've accepted that the Earth's climate was changing, and is going to continue to
change, with or without humans. I'm all for cleaning up our poor environmental
behavior, but methods to manipulate the climate at a large scale should be
considered if we want to keep pumping out people, consuming resources _and_
enjoying the current temperate climate.
The big issue under all of this is that we're going to start having to waste
resources paying down "climate-debt", and those resources could be better used
elsewhere. At the same time environmentalism has a clear and direct value, but
it can't be allowed to get in the way of progress as we really don't know how
much time we have before a big enough rock falls from the sky and Earth gets
another do-over.
~~~
mildweed
Things that should not be part of the "Should we work to mitigate climate
change" discussion: 1) Meteor strikes 2) Whether or not it'll be hard
Question: You believe climate change is not caused by our actions, but you
believe climate change can be stopped by our actions? How does that work?
------
dang
Can you please edit swipes like "Wrong." out of your comments to HN? It's
unnecessary and makes your comment acerbic when it would be just fine without
it.
The way to reply to a wrong comment is to supply correct information, which I
assume is what the rest of your comment does.
[https://news.ycombinator.com/newsguidelines.html](https://news.ycombinator.com/newsguidelines.html)
Edit: detached from
[https://news.ycombinator.com/item?id=20816911](https://news.ycombinator.com/item?id=20816911).
~~~
acidburnNSA
You bet, done. I was responding to the unfounded accusation of being
disingenuous. You're totally right that the community is better without that
kind of reaction, and the comment is way better with out it. Thanks. Forgive
me.
~~~
tempestn
I would like to thank you for your replies throughout this thread in general
though. It's actually ironic that you got called out for this one, since I was
just thinking that you were doing a great job responding to various
misconceptions in a positive, informative, and non-judgemental manner. Really
enjoyed reading your comments.
~~~
acidburnNSA
Hey, you're welcome. Thanks for saying so. I'm sure it's clear that I love
doing this.
------
fromthestart
A couple issues with the science here:
>Between 2005 and 2015, global rates of fracking went from producing 31
billion cubic meters per year to 435 billion
>In the last half of the 20th century, levels of methane in the atmosphere
rose. They then plateaued, before spiking in 2008.
So the shale gas revolution started in 2005, but the spike didn't occur until
2008? Also it isn't clear if this was a transient spike or the author is
referring to a sudden and sustained increase.
> While methane released in the late 20th century was enriched with the carbon
> isotope 13C, Howarth highlights methane released in recent years features
> lower levels. That's because the methane in shale gas has depleted levels of
> the isotope when compared with conventional natural gas or fossil fuels such
> as coal, he explained
First, all conventional gas is "shale gas," except for a tiny minority of
highly atypical cases where reservoir sections have experienced prolonged high
temperatures and matured further. Thus almost all natural gas ultimately comes
from shale - fracking is merely extracting it at the source, where it does not
significantly differ in composition from gas which has migrated to reservoir
in conventional extraction. So, second, I'm suspicious of the use of ¹³C to
differentiate fracking- and non-fracking-sourced methane, but I'm not a
petrophysicist. It should be related to the age of the fluid, and I don't
expect shale gas to be substantially younger than reservoir gas in most cases.
~~~
_delirium
On the second point, whether ¹³C can be used as a signal to differentiate
fracked vs. non-fracked gas, the paper [1] does discuss this a bit. This seems
to be the key paragraph:
_Several studies have suggested that the δ¹³C signal of methane from shale
gas can often be lighter (more depleted in ¹³C) than that from conventional
natural gas (Golding et al., 2013; Hao and Zou, 2013; Turner et al., 2017;
Botner et al., 2018). This should not be surprising. In the case of
conventional gas, the methane has migrated over geological time frames from
the shale and other source rocks through permeable strata until trapped below
a seal (Fig. 2a). During this migration, some of the methane can be oxidized
both by bacteria, perhaps using iron (III) or sulfate as the source of the
oxidizing power, and by thermochemical sulfate reduction (Whelan et al., 1986;
Burruss and Laughrey, 2010; Rooze et al., 2016). This partial oxidation
fractionates the methane by preferentially consuming the lighter ¹²C isotope
and gradually enriching the remaining methane in ¹³C (Hao and Zou, 2013;
Baldassare et al., 2014), resulting in a δ¹³C signal that is less negative.
The methane in shales, on the other hand, is tightly held in the highly
reducing rock formation and therefore very unlikely to have been subject to
oxidation and the resulting fractionation. The expectation, therefore, is that
methane in conventional natural gas should be heavier and less depleted in ¹³C
than the methane in shale gas._
[1]
[https://www.biogeosciences.net/16/3033/2019/](https://www.biogeosciences.net/16/3033/2019/)
------
treggle
There must be solid reasons why fracking isn’t at fault.
Google Shifts Ads (To The Left) - vaksel
http://www.techcrunch.com/2009/08/11/google-shifts-ads-to-the-left/
======
ujjwalg
The fact that they have made a choice to increase their profits while
compromising on user convenience is the complete opposite of the direction
Google was founded on, IMO. Google is a company I used to look to whenever I
had to think about a business problem (user convenience / solving a problem before
making any money). If you are providing users what they want, the business model
will work out -- which seemed to be the case, but not anymore, I guess.
The PC is dead. Why no angry nerds? - pauljonas
http://futureoftheinternet.org/the-pc-is-dead-why-no-angry-nerds
======
typicalrunt
_Why no angry nerds?_
Because nerds are still using their PCs to create mobile and web-based
software. The PC may (just may) be dead for the hordes of average consumers
out there, but it'll never be dead for those people creating things -- that
is, until you can adequately create a mobile or web application on an
iPad/iPhone.
Does anyone truly believe that scores of software developers writing financial
software for banks are going to trade in their PC to type on an iOS device?
Don't bet money on that (at least for the short-term).
~~~
astine
The complaint that the author makes isn't about consumption vs production
devices, it's about curated experiences. Apple can veto your income on the
insistence of popular opinion or on a whim, or with no stated reason at all.
The author extrapolates that this can lead to an effective restriction of
freedom of the press by Apple and other gatekeepers.
This is a real problem and is not alleviated just because _you_ still use a PC
to write software.
~~~
onemoreact
You're forgetting about the web. Apple does not veto websites, just things in
their little walled garden.
~~~
bad_user
Considering how Apple controls the operating system, the app store, the
browser and the whole device in general, it wouldn't be technically hard for
Apple to censor websites either.
Plus, Apple is restricting the browser on purpose, to make it less useful.
Tell me, how can you create online image editing software if Apple doesn't
allow you to upload files from the iPhone's browser?
And what Apple is doing today would have been unthinkable a couple of years
ago. Censoring the web is perfectly within their grasp and you should see
that; otherwise you're not seeing the forest for the trees.
~~~
onemoreact
The most popular home computers have always been walled gardens where the
manufacturer got a cut from every piece of software sold. A huge chunk of all
consumer software runs on them and their approval could easily make or break a
software development house. We just called them 'game consoles' and
conveniently forgot that their approval process was far more expensive and
difficult than anything Angry Birds needed to bother with.
So, I don't think there is anything all that novel about the iOS approach and
I find it hard to think of such things as an erosion of control when consoles
predated the PC in most people's homes and tended to be far more draconian than
iOS. Do you even remember how bad phones and their app stores were pre-
Apple?
PS: Microsoft had no problem getting skin into this game and started taking
their cut well before Apple produced iOS. Despite what the article says.
~~~
bad_user
People mention 'game consoles' a lot, as if that's a good thing.
However, the best games have always been running on PCs first, often with
exclusivity. Starcraft, The Sims, World of Warcraft, Half Life, Counter
Strike, SimCity, Quake, Diablo 1 & 2 and the list can go on.
I own a Wii as it came free with my Internet subscription. It's gathering dust
since last year. The games are simply not addictive. And this feeling I had on
PS 2 too - the games feel too dumb and non-engaging, lacking taste. Which is
why I never bothered.
Some people like these games consoles. I never did.
~~~
lloeki
> However, the best games have always been running on PCs first, often with
> exclusivity. [list]
Confirmation bias. See Mario, Contra, Chrono Trigger, Sonic, Final Fantasy,
Resident Evil, Halo, Alan Wake, Uncharted, Metroid, Heavy Rain, Sega Rally,
Gran Turismo, Tekken, Street Fighter, Soul Edge, Forza Motorsport and the list
can go on.
------
vladd
I saw in the supermarket the other day a group of parents buying for their
young child a so-called "mini-laptop" that costs $100 and probably only knows
how to do addition, multiplication and some basic word games on a cheap LCD
screen. And I realized that for them, and for a large portion of the end-
users, lack of education in IT makes it very difficult for them to see the
difference between said laptop and a regular PC.
Apple realized first what this lack of knowledge can do: the ability to lock-
in a product so that it runs only your AppStore's apps is not only good for
the high-end of the market that is willing to sacrifice freedom in the name of
usability and beautiful design, but rather more importantly it's good for the
ignorant masses that don't even realize this fact when they buy the product
just because it's fashionable to do so.
~~~
k33n
You think Apple cares about locking down a device just for the sake of taking
advantage of the ignorant masses?
The truth is a lot simpler than that. They aren't trying to enslave humanity.
They are just control freaks with a culture of creating art. It's ingrained in
their corporate culture that people shouldn't decide how to use Apple's
products. They believe that they should control that.
Whether you agree with that or not is your own choice. But it's just plain
stupid to say that there is some kind of plot against "freedom" here.
~~~
smallblacksun
I agree that they aren't plotting against freedom, but not for the same reason
as you. I think that they want to make money, and think that controlling
content is the best way to do that.
~~~
k33n
Of course they want to make money though. They are a business. However,
they've forgone more profitable business models in the past in favor of
tighter control and they'll likely continue to do that.
~~~
troyastorino
It's hard to argue that they've forgone more profitable business models, as
they are in the top 10 most profitable companies in the US, and are the most
valuable company.
------
WiseWeasel
I'm optimistic for several reasons:
1) Less need for IT support for family and friends;
2) I'm well-served with the computers I assemble and purchase, and even if
mainstream operating systems continue their slide towards not serving my needs
well (from my vantage point looking at OS X Lion and Windows 8), I am
confident that there will be solutions out there, possibly increasingly
exemplified by Linux, that facilitate web development and mundane user
empowerment;
3) The web has become the democratic platform for publishing interactive
content to the masses, which no vendor would dare attempt to exert control
over.
Now, the real question is who's going to capitalize on this amusingly-phrased
headline and come out with the Angry Nerds mobile game, where you launch a
variety of nerds at iPads hidden in the temporary safety of their elaborate
Apple Stores? Let's see, there's the fat, bearded nerd, the pale, skinny, tall
one, the kid with glasses, the token girl nerd with freckles and glasses, and
the $1.99 in-game purchase Darth Vader helmet nerd.
[Edit]
Looks like I wasn't the first one to think along these lines:
<http://www.atlassian.com/en/angrynerds>
[/Edit]
------
dasil003
> _The iPhone restricts outside code, but developers could still, in many
> cases, manage to offer functionality through a website accessible through
> the Safari browser. Few developers do, and there’s work to be done to ferret
> out what separates the rule from the exception._
I don't want to be paranoid, but I feel like what Apple has done is
brilliantly nefarious. They've given developers an offer they can't refuse:
for 30% we'll sell your software frictionlessly. Sure you are giving up
control, but how much does requiring a payment form and non-standard, non-
obvious, potentially painful installation process cost an indie developer? 30%
is always far less than the losses of a traditional website sales funnel,
potentially by 2 orders of magnitude. The App Store simply _sells more
software_.
Of course it has corrosive effects on the developer community in that prices
are driven down (maybe not as much as volume increases, but still) and control
is forked over to Apple. But what can developers do? It's a game theory
problem. If everyone refuses to play Apple's game then maybe developers and
innovation win, but if one breaks the line they stand to make a fortune.
I'm really not looking forward to the day when Macs go App Store only. If that
happens it will probably mean I have to switch to Linux. It saddens me that
the future of computing may be completely locked down, but it's hard to argue
against it if for no other reason than it offers the most promise for actually
making users safe as malware proliferates and becomes more sophisticated. Part
of me thinks the free software and Internet as we know them today can not
last, and if it wasn't the App Store it would be some other powerful interest
eroding our digital freedom in the name of profit or control. This line of
thought makes me think that maybe RMS is not such an extremist after all, but
merely an equal counterbalance to the forces of power and greed shaping our
future.
~~~
zmmmmm
> 30% is always far less than the losses of a traditional website sales
> funnel, potentially by 2 orders of magnitude. The App Store simply sells
> more software.
I don't understand how your maths works. You are asserting that a traditional
website sale costs 3000%, thus losing the seller 29 x the price of the item?
Obviously I am misunderstanding you but I don't know how to make sense of what
you are saying.
I'm always curious about the people who say 30% is a good deal. I sell
software directly over the web. PayPal charges me something like 0.50 + 2.5%.
I do host my web site but I would have a web site anyway. Bandwidth associated
with sales is negligible, support is a cost but again, Apple wouldn't do that
for me. 30% is bigger than the cut I give to generic resellers who actually
come to me to ask to list my software in their stores. I can see that Apple's
store can increase volume (but then, we have nothing to compare it to - we
will never know how well iOS software might have sold without the app store) -
but I don't see how it's anything other than extremely expensive in terms of a
way to sell software.
~~~
dasil003
Losses on the sales funnel means the number of people that drop off from the
first page of your website all the way through purchasing the product. Getting
someone to pull out a credit card is a huge challenge; there's no way you can
make it anywhere near as effective as a one-click purchase like Amazon or iTunes
has. This is not even counting the discovery aspect of the store, which is a
powerful discovery channel completely supplemental to whatever marketing you
are doing on your own.
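One way to see the trade-off, with completely made-up conversion numbers (the
point is the ratio, not the specific values):

    # Illustrative only: both conversion rates below are assumptions, not data.
    def net_per_visitor(price, conversion, cut):
        return price * conversion * (1 - cut)

    price = 5.00
    web   = net_per_visitor(price, conversion=0.02, cut=0.025)  # ~2.5% payment fees
    store = net_per_visitor(price, conversion=0.05, cut=0.30)   # 30% App Store cut
    print(round(web, 3), round(store, 3))   # 0.098 vs 0.175 per visitor

The break-even is at roughly a 1.4x conversion advantage for the store
(0.975 / 0.70); anything beyond that and the 30% cut still nets you more, which
is the funnel argument in a nutshell.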
------
mikemarotti
People have been saying the PC is dead just about as long as I've been
Personal Computing.
The role of the PC may be changing, but that is far from meaning that it's
dead.
------
jsz0
There are definitely angry nerds out there but they are in a position of
arguing against a better user experience for the vast majority of people who
don't want to have malware, a broken OS, their personal data stolen, etc. They
should be trying to find less restrictive solutions that provide the same
benefits. Otherwise the companies that are moving forward with making a better
end user experience are going to win in the long run.
~~~
bad_user
Do you feel secure when using an iPhone?
JailbreakMe is a website that can root your iOS device - so let me emphasize this - a
freaking website that can get root access on your iPhone just by visiting that
website.
It originally relied on some bug in the PDF reader. Then Apple released an
update. Then JailbreakMe got upgraded to use a FreeType parser security flaw,
again in that same PDF reader and so it works with iOS 4.3.3 - then Apple
released iOS 4.3.4 to patch it. But now an update for iOS 5 is apparently in
the works.
At this point you may now feel that Apple is in control, that they are
patching iOS at a rapid pace and so on. However, these vulnerabilities are
PUBLIC. Even the source-code for JailbreakMe was published. Bad people don't
do that ;-)
So you can argue all you want against malware, a broken OS or about evil
people stealing your data, however the iOS platform is nothing more than
security theater - it keeps the script kiddies out, it gives you an illusion
of safety, but it won't keep out the people with the right resources and
experience. And those people are the people you should be worried about.
~~~
dextorious
"Do you feel secure when using an iPhone?"
Yeah, I do. There is a site that can take advantage of some vulnerability? Not
impressed. Until there are tons of such sites, like they are for PCs, and
until there are tons of actual iPhones harmed that way, as PCs are, I feel
secure.
"however the iOS platform is nothing more than security theater - it keeps the
script kiddies out, it gives you an illusion of safety, but it won't keep out
the people with the right resources and experience."
I don't care about those people. I'm not targeted by some Dr. No style guy. I
care about general experience and the law of large numbers.
And, statistically, I'm totally safe using an iPhone. When that changes, you
can write to me again.
~~~
bad_user
until there are tons of actual iPhones harmed
that way, as PCs are ...
In 2008 an estimated 1 billion PCs or more were in use and 95% of those had
Windows on them. By contrast, in 2011, by Apple's statement, 100 million iPhones
were sold to that date and let's be reasonable here - many of those were
upgrades, so the total number of users is less than that.
So you can feel safe, but only because your platform is not popular enough
right now, or useful enough for anybody to bother (PCs are wonderful for
building botnets).
I don't care about those people. I'm not targeted
by some Dr. No style guy.
But you should, as those people are the people with the means to distribute
malware on a massive scale. And they will, sooner or later.
And, statistically, I'm totally safe using an iPhone.
Statistically? I just gave you as an example a freaking website that can do
whatever the author wants on your iPhone and you're giving me statistics?
Seriously?
~~~
dextorious
"""So you can feel safe, but only because your platform is not popular enough
right now, or useful enough for anybody to bother"""
That's a stupid fallacy. In the 90's the Mac had a tiny market share, nothing
like the near 10% it has today, and it still had tons of viruses. And even
platforms like the Amiga and Atari had viruses, while having an insignificant
market share compared to PCs at the time. The OSX Macs, on the other hand, have
only seen a few trojan horses and no actual virus distributed outside of some
lab or something.
So, it's not just numbers or units, or relative percentage of market share.
Other things count, stuff like the UNIX like permissions OS X had from the
start, compared to the silly Windows 3->XP user being automatically admin.
"""Statistically? I just gave you as an example a freaking website that can do
whatever the author wants on your iPhone and you're giving me statistics?"""
I'd care about that website when ACTUAL iPhones in the wild ACTUALLY VISIT
websites such as this and are hacked, in large numbers.
Until then, I could care less.
It's the difference between being scared of violence in a seedy neighborhood
and being scared of being targeted by a serial killer. Yeah, they do exist.
What are your chances?
------
howeyc
I'm not as quick to discount the point as the others here,
Look at it this way, assume in the not so distant future you can do it all on
the tablet as opposed to the desktop and no longer nead desktops, having them
die a slow, slow death: Counter-Strike, WoW, Development (Github text editing,
Cloud9 IDE,... etc), word processing, browsing, you get the idea.
But instead, the applications are vetted and controlled by the tablet OS
maker. Now instead of law makers trying to get every search engine to block
something, or tear-down registars all over the world, they just go to tablet
OS maker 1 and 2 and have them take down access to website/app. No more
website/app for anyone as desktops are dead and the tablet is so locked to
shit you can't change a thing (secure boot anyone?)
Now, I'm not sure death is as current as suggested, but I see the trend the
blog post is referring to.
~~~
roc
Why would Apple be any more willing to turn off web access to a domain than
the dominant ISP or backbone provider or the dominant PC vendor or even just
the ISP/DNS provider for the site in question?
It seems silly to imply the situation is somehow more plausible/likely with a
tablet-based computing landscape.
~~~
howeyc
I don't believe any ISP/DNS provider is "willing" to turn off web access to
any paying customer (domain owner). However, it has nothing to do with
willingness but instead with "force" (judges, congressmen, clothing
manufacturers, ...) -- people and companies don't like paying fines and going to
jail. Let's say there are 250 domains that are disliked, on 5 different
registrars, 10 different ISPs, but only two tablet OSs with walled gardens --
which place would you logically force to stop access? Of course if it's apps
you don't like instead of domains the answer is trivial.
Now, we are not in this scenario right now no matter how horrific you think
walled gardens are. Probably not worth speculating too much further as the
future is hard to predict, and this is just a possibility of the blog post
trend extrapolated forward X number years.
------
jes5199
People don't use PCs as general purpose platforms anymore, anyway. I can code
anywhere that there's a unix environment - or a connection to AWS. And people
can use my software anywhere there's a browser. The ability to build and run
.exe files hasn't really been a major enabler of geekery since, I dunno,
Napster, I guess was the last time.
------
vectorpush
The PC will never die. When the average tablet is robust enough to
productively operate my entire development stack, or modular enough to allow
enthusiasts to build performance gaming tablets, maybe we'll see an end to the
PC, or perhaps that which we call a rose by any other name will smell just as
sweet. I don't see how a tablet is any different from a PC except for form
factor, especially when they catch up with PCs in terms of general utility.
~~~
dextorious
"""The PC will never die. When the average tablet is robust enough to
productively operate my entire development stack, or modular enough to allow
enthusiasts to build performance gaming tablets, maybe we'll see an end to the
PC"""
How about, in 10 years most people just get a tablet + keyboard or something,
and the PC doesn't die, but gets 4-5 times more expensive because of lack of
demand?
~~~
vectorpush
_gets 4-5 times more expensive because of lack of demand?_
Why would that happen? Tablets are simply low power PCs wrapped in a
touchscreen LCD. If these tablets were optimized for performance instead of
battery life many of them could outperform most consumer PCs.
My point is that PCs won't disappear, their functionality will simply
transition into tablets as they become capable. A tablet + keyboard + mouse =
a PC.
~~~
dextorious
"""Why would that happen? Tablets are simply low power PCs wrapped in a
touchscreen LCD. If these tablets were optimized for performance instead of
battery life many of them could outperform most consumer PCs. My point is that
PCs won't disappear, their functionality will simply transition into tablets
as they become capable. A tablet + keyboard + mouse = a PC""".
It's not that simple. What you describe is desktop PC vs laptop.
Tablets on the other hand have different operating systems (at least the only
tablet that people > 1% actually buy, the iPad) than PCs.
We can imagine a future where most people use tablets, perhaps equally
capable as any PC, but WITHOUT today's full PC OS flexibility, and PCs are
left for only small professions (programmers, creatives, etc).
~~~
vectorpush
_tablets on the other hand have different operating systems (at least the only
tablet that people > 1% actually buy, the iPad) than PCs._
This is only true in today's world. In a future where tablets are so capable
and ubiquitous that they've displaced laptop and desktop computers, both
paradigms will have long merged. Microsoft, Google, Canonical, and others are
all in the process of transitioning their platforms into x86 or ARM (whichever
they lack). Additionally, AMD and Intel both have true x86 answers to ARM on
the horizon. The flexibility of the x86 platform is not going anywhere any
time soon.
------
rplnt
Last time the PC was dead, it was because of the gaming consoles. All the
games are there, web browsers too.. why would anyone want a PC? That was half
a decade ago and PCs are still here.
------
VladRussian
instead of 1 single core single cpu PC shared with my wife, we now have
(counting only ours, not company issued laptop and 2 Mac desktops) 3 PCs (1
desktop type and 2 are server dual quad cores opterons (and 1 old dual dual
core)), 2 laptops and 1 netbook, and 3 iOS devices. And i'm looking to get
soon new SB 2[5|6|7]00K (or may be even indulge with 3930 - it is Christmas
time :) + mobo + pile of DDR3 to upgrade the desktop. I'm a happy nerd as
RAM/CPU/HDD and big LCDs are so cheap these days.
Edit: just today a new store opening nearby got several cargo pallets of new
Dell's PC based hardware (POS and pure PC).
In the next few years 5+ billion people will get their first computing
device (and the other 2 billion - their first 2nd, 3rd, 4th ... computing
device). The vast majority of this growth will not be a PC. That's clear. Yet
this growth in mobile devices will "synergize" an unprecedented growth in
PC/server hardware dwarfing whatever we've seen so far.
------
sliverstorm
The mainframe is dead too.
By the way, on an unrelated note, did you know mainframes are still IBM's
biggest business?
~~~
antidaily
It's not consulting?
~~~
sliverstorm
Maybe, I'm not an IBM insider. That's just what I've heard.
Maybe they are consulting for their mainframe customers?
~~~
flomo
Some years ago, most of their consulting revenue was derived from IBM
platforms. Since then they've absorbed a number of other consulting companies
though.
------
pawn
A lot of people here are quick to discount the title "The PC is dead", but I
think the actual meat of what he was getting at merits consideration - walled
gardens are a lot more popular today than they used to be. No doubt, there are
certain innovations that we may never see because of this. For example, what
if there's a Mac developer out there who has ideas for a better browser than
Safari? You're never going to see it, because Apple won't let you. Today, he
has the option of bringing it to PC, but that might be too big of a barrier
for him to jump if he's one of those guys who won't buy one.
Also, it's not too hard to imagine a day when Microsoft follows Apple and makes
their own marketplace. Probably won't be anytime soon, but it would be naive
to say it could never ever happen. The main thing stopping them is probably
the fact that they're still the big dog and can't get away with doing what
Apple does. People judge them differently. In a few more years though, I could
see them pointing at what Apple does and saying, "There's precedent here.
Let's go for it." Some people would jump ship and move onto Linux, and say "I
still have freedom" but what about your mom and dad who will keep buying the
Walmart special every Black Friday? As an indie developer these things are
limiting what you can do.
I think the solution here is to take a look at what people like about these
marketplaces - a convenient way to find apps, and duplicate it - without the
bad stuff. Make a marketplace and don't limit what goes on it. Also, make less
profit for yourself than what your competitor (Google and/or Apple) makes. The
toughest thing is traction. Getting it in front of eyes as something to use
instead of the other one. Get past that, and you're set. Am I missing
anything? Has this been done and I just don't know about it?
~~~
GHFigs
_Also, its not too hard to imagine a day when Microsoft follows Apple and
makes their own marketplace._
[http://www.zdnet.com/blog/hardware/windows-8-app-store-
will-...](http://www.zdnet.com/blog/hardware/windows-8-app-store-will-be-the-
only-source-of-metro-apps/14873)
------
caller9
The scary part is when parents who just need a consumption device only buy a
tablet. How many of you would've been able to program at a young age if your
parents hadn't bought the family PC?
~~~
stan_rogers
Some of us wrote our programs out carefully on paper, debugged them as well as
we could by stepping through them mentally (and making register annotations,
etc., on paper when required), then transcribed them to electrographic "punch"
cards for the school's batch run on the local university's mini/mainframe once
a month.
Having a computer makes programming more convenient and less theoretical (bugs
happen occasionally despite correct code), but it's not essential to learning.
It's a lot less likely to happen in that way in the current environment (with
ubiquitous PCs), but if we ever get to the point where PCs (or their
equivalent) aren't everywhere anymore, the interested geeks will find a way --
just like we did in the old days.
------
giulivo
It will probably make me look pathetic (angry nerd maybe), but I don't have a
tablet and I don't use much my smartphone, if not for calling people.
I found these devices to be very uncomfortable for my tipical usage. The only
good addon I've got in my smartphone was the GPS navigation.
I could never browse the internet on a smartphone and I could never look at a
tablet more than 30mins while also being comfortable in using it.
And if I had to plug in into a tablet a standard keyboard and a nicer monitor
to make it "usable", I would rather buy some mini PC.
------
darksaga
I remember when a friend of mine got his first iPad. We're both developers
and he said he could do some basic editing with one of the included apps.
First thing I thought was, "Oh yeah, I'm going to write code on a 7" screen as
opposed to my 23" monitor."
The PC isn't dead and it won't be for a long time. Although there are a lot of
lines being blurred between devices (tablets vs. smartphones vs. laptops),
there's still a TON of people working at large corporations whose PCs you'll
have to pry out of their cold, dead hands.
------
jroseattle
> Why no angry nerds? Mostly because very few people really believe the PC is
> actually dead.
Mobile is the big wave, yet I don't know anyone who says "I have a phone, I
sure don't need a desktop/laptop."
The closest possible thing is a tablet. Yet, the most common refrain I've
heard about any tablet is "well, it doesn't do this" or "it doesn't do
that". On the flip side, I've yet to hear anyone state that they wished they
didn't own a PC, since their tablet does everything they need.
~~~
ricardobeat
Your friends are not representative of the whole market. Mobile is larger than
desktop in Africa: [http://www.gomonews.com/mobile-internet-usage-soars-past-
fix...](http://www.gomonews.com/mobile-internet-usage-soars-past-fixed-web-in-
africa/) (and this is from 2010).
Kids 5 years from now will find desktop computers and current laptops strange
and kludgy.
~~~
dimmuborgir
In most so called third-world countries mobile is much larger than desktop
simply because mobile is more affordable.
------
qdog
I was just looking at building a new computer; I was contemplating a dual-CPU
setup for using VMs, hoping it would be easy to run copies of various OSes on
one box and use all the different streaming technologies for TV output.
Looks like about $2k or more.
While it would be great to be able to coordinate all the streaming media using
my phone as a remote, my phone doesn't do most of the 'computing' tasks I
like, and reading HN on a 4" screen is handy, but not optimal.
I think his point is exactly that if I want everything to work, I need an
Apple and a Windows machine running (never mind whether it's legal to run Apple
in a VM, I'm under the impression they don't want you to, but if I purchase a
copy of OSX, it's not clear I wouldn't have the right), and maybe a Chrome
machine, because they DON'T WORK TOGETHER, DUE TO ARBITRARY RESTRICTIONS.
I think there is another guy that likes to rant about this walled garden
approach. Personally, I've been trying to use iTunes on a Windows box to play
to an Apple TV recently and it is sucking. Music plays fine from an iPhone. So
now, my wife is all ready to pay the Apple Tax for a new computer. To just get
music to play. The fact that Apple makes a cool device is fine; the fact they
don't want to play nice with any of my other equipment makes me loath to buy
a new one, but apparently I'm not with the mainstream on this one. But count
me as one of the angry nerds.
~~~
Tobu
Dual-CPU? You can get a dual-core or more for _far_ cheaper than that. Look to
AMD Phenom IIs for the middle range; you get better performance for the buck,
especially if you factor in the motherboard.
Besides being faster, this kind of machine is far cheaper than a hand held,
locked down device. Here are the costs of ownership, about $2000 for two
years:
[http://www.steamenginefinancialcoaching.com/2011/02/14/the-r...](http://www.steamenginefinancialcoaching.com/2011/02/14/the-
real-cost-of-ownership-iphone-4/)
------
iamandrus
The PC isn't dead. It never will be. Sure, it's changing roles in a
smartphone/tablet world, but it'll never "go away."
------
quadyeast
Why does Sophos block this site? This is the first time in 3 years that I have
seen this warning:
High Risk Website Blocked
Location: futureoftheinternet.org/the-pc-is-dead-why-no-angry-nerds
Access has been blocked as the threat Mal/ObfJS-CB has been found on this website.
------
chc
For any writers who want to talk about how the PC is dead: Try guesstimating
the number of people who will use a PC on a given day, then look at how many
people read your blog/newspaper/magazine in a given YEAR. (Hint: Not even
HuffPo will win this one.)
------
jl6
Not sure what it means for a particular technology to "die". A lot of "dead"
technologies are "alive" and kicking in the Enterprise world. Perhaps "undead"
is the word you're looking for...
------
cavilling_elite
Wow, I've been blinded. I have never equated what Apple is doing with the app-
store/"taxing" and the MS IE anti-trust case.
The empowerment of the censor. Scary words.
~~~
vacri
Apple's store is a standard retail cut, it's nothing like taxes. Do you
complain about the middleman cut with every other thing you purchase?
The monopoly issue is a different beast to the retail cut issue.
------
vph
If you believe that in the future, people do accounting on the ipads, write
papers/programs on the cell phones, then the PC is dead.
------
quellhorst
PCs are trucks for getting serious work done.
------
libraryatnight
I'm too busy doing awesome things on my super awesome PC to be angry.
------
georgieporgie
Alarmist fluff.
_And every app sold for the iPhone would have 30 percent of its price (and
later, that of its “in-app purchases”) go to Apple. Famously proprietary
Microsoft never dared to extract a tax on every piece of software written by
others for Windows—perhaps because, in the absence of consistent Internet
access in the 1990s through which to manage purchases and licenses, there’d be
no realistic way to make it happen._
Microsoft never provided a complete distribution channel for software, either.
Complying with Apple requirements and limitations is annoying. However, the
consumer gets reasonably vetted software, and the developer gets a single
method of distribution.
What's the complaint, again?
~~~
dmethvin
iPhone owners are prohibited from going to any other store. It's a monopoly.
Sure, Mussolini was a dictator, but he made the trains run on time.
~~~
shaggyfrog
Sigh. Yes, of course, Apple is _just_ like a violent dictator who supported
genocide.
~~~
jsight
_Whoosh_
The point isn't that Apple is just like a violent dictator.
The point is that these statements are (mostly) irrelevant to each other. The
trains can run on time without a dictator.
The user can have a fine walled-garden experience but still retain the option
to (in a reasonably supported way) tear down the walls.
~~~
scott_s
I'm not sure that your last sentence is possible. That is, if a device
explicitly has an option to go outside of the garden, then I think most
consumers will _expect_ to still have full support.
------
cq
The PC is dead to idiots. It's actually still quite awesome for those of us
who are producers, not consumers.
~~~
pclstyle
And what good are producers without consumers?
~~~
gujk
The consumers are still here, on iPads.
------
contextfree
zero out of two - the PC isn't dead, and there are plenty of angry nerds.
------
killnine
I am angry. really angry.
What is the solution to make the gatekeepers and the gatekeeper-supporters
obsolete?
I don't need them- why do you?
Thanks, but I have accepted another offer - bigfatmonkey
https://medium.com/@TreyceMeredith/thanks-but-i-have-accepted-another-offer-aaf90107ddb
======
lightbyte
This reads and looks like it was written by a teenager. Quite ironic that it
is preaching professionalism.
Edit: I did not mean this as "only girls are unprofessional" so I removed that
part. Teenage boys are equally as unprofessional, but I don't think they
express it in the exact same way. If this was filled with dick jokes and the
like I would have written "teenage boy".
~~~
a13n
Sorry, what about being a girl conveys unprofessionalism?
~~~
jeremy7600
"teenage" girls aren't the height of professionalism, I think was the thrust
there. The use of the word "like" frequently denotes this. Nothing wrong with
girls, period. But teenagers aren't looked up to as the paragons of
professionalism.
~~~
a13n
Are teenage boys the height of professionalism? I think the gender comment is
unnecessary, irrelevant, and harmful.
~~~
kentrado
Oh, come off it already. It is clear what he meant, you are just trying really
hard to be offended by something.
~~~
delphinius81
Yes it was clear the poster was being derogatory to girls. It's also clear you
are trying to normalize and justify such behavior. Both are unnecessary.
------
delphinius81
Reading this, all I could think was "Welcome to job searching." Companies that
pull some of these things are places you don't want to work for. Contact
emails you and then ignores your replies without reason? Run away. Have a
meeting scheduled and they don't show? Run away. Get interviewed but then get
told you don't have enough experience? Run away.
Finding a job isn't just about getting paid the most money. You need to be in
a job where you value the work and the people you work with. Because if I'm
the one hiring for the role, if I detect that your only interest is in getting
paid, I'm going to run away.
------
scandox
I value talented people but sometimes during the hiring process I find myself
thinking:
1\. You're spoiled
2\. You're overpriced
3\. Your lack of humility is going to burn us one day
Technical folk and designers hold the whip hand right now, but that's no
reason to abuse it.
~~~
toomuchtodo
From the other side of the table, employers usually hold all the cards. I
don't care about your mission, your values, just show me the money. This is a
transaction, not an interview for me to be a part of your family.
I will be polite, but if you offer a dollar less in total compensation than
the other person, I will accept the other offer and thank you for your time.
Have to make hay while the sun is shining, and the music could stop at any
time. Don't say someone is overpaid when the issue is you don't want to pay
market rate; you're bringing emotion into a business transaction. Can't hire
someone for a role? You're not paying enough or you're not willing to train
someone to be as niche as you need.
Source: Infosec, ~2-3 contacts a week from recruiters.
EDIT: To those who are replying that I'm detached, not who you'd hire, etc,
that's perfectly fine. Hire someone who undervalues their time and values your
team or your working environment over getting paid what they're worth. I work
hard to save up faster for financial independence; I want to spend my time
working on what I want, not what someone else wants me to work on. Life is
short.
~~~
tensor
It could be that people offering less simply don't value your skills as much.
I've certainly seen people who think they are senior but in tests come out
only intermediate. That someone else will buy into their senior claim doesn't
mean the companies that do a better job judging skill level don't pay market
rate. You seem to be implying that there is no such thing as being overpaid,
which is pretty nonsensical.
~~~
toomuchtodo
> It could be that people offering less simply don't value your skills as
> much.
Entirely possible. Luckily, someone's inability to properly value an
employee's skills doesn't prohibit said employee from finding an employer who
will pay the market clearing price for their time. Value is determined by the
market, not an individual hiring manager.
> You seem to be implying that there is no such thing as being overpaid, which
> is pretty nonsensical.
There isn't such a thing. Someone is getting paid what they were able to
negotiate between two willing parties.
------
ryandrake
The hiring process deck is thoroughly stacked against the job seeker, allowing
companies to get away with all kinds of abuse, flakiness and lack of follow-
up/courtesy. The trick as a candidate is to simply not take anything
personally, and not get focused on That One Employer. Someone else mentioned
"professional detachment" and that's really important as it's an employer's
market (and it usually is).
Just a rough napkin-estimate, but I'd guess that out of the recruiters that
URGENTLY REACH OUT WITH A GREAT OPPORTUNITY PLEASE RESPOND, 75% won't even get
back to you after your polite "Thanks for getting in touch, I'm interested!"
response.
Of those that get back to you and do a brief phone chat, you will never hear
back from 75% of them about going forward with a phone screen with the
company.
After the phone screen(s) with the company, 75% won't get back to you about
going on-site for the "whiteboard hazing" interviews.
Of those, 75% will ghost you. So you should _expect_ around a 0.4% hit rate.
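(Quick sanity check on that figure, compounding the four ~25% pass-through
rates:)

    pass_rate = 0.25                 # ~75% drop-off at each of the four stages
    print(f"{pass_rate ** 4:.2%}")   # 0.39%, i.e. roughly the 0.4% above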
------
desireco42
This is bad advice for designers; it sounds very spoiled and overbearing. I don't
think good experienced designers would be swayed by this, but beginners might
read into this.
Focus on your skills and what you want to do, and you will be fine. Stay away
from 'priests' like this; they will not help you, you are only there to admire
them.
One skill that I would always value in a designer is that they can produce in
the medium we are using, i.e. HTML. Photoshop is a bad starting point and this alone
will put you above others (in my humble opinion).
~~~
sidlls
Is the "spoiled" part because of the presentation or the content? My read of
the content is that it seems routine (for at least this person) to have hiring
staff at companies routinely ignore follow-up emails, miss meetings, and
engage in the designer equivalent of data structures and algorithms
pedantry/hazing in the software side. I'm not sure wanting hiring people to
have sufficient respect for the author's time and education to not do those
things is "spoiled."
~~~
desireco42
Like, you know, like, presentation :).
I agree that people miss appointments etc, but this is again a distraction from
the bigger picture, where you would focus on your skills and what you really want
to do. And especially how you want to do it; process is also important.
------
MikeTheGreat
Reading this, it seems like one of (at least) two things might be true:
1) The article is written for other members of the author's group, but wasn't
really meant for broad consumption. Yes, it's online, in public, etc, but
either the author wasn't thinking about that and/or ignored it. I say this
because it feels like something one would say when complaining to friends at a
pub after hours. It's got this "we've all been here, don't you just hate it?"
sorta vibe to it.
2) The article is written to be click-bait/incendiary on purpose. A while back
(a year or two?) someone else wrote something similar and it blew up big. More
than anything it gave people something to shake their fist at. It gave older
people an excuse to look down on "the kids today", younger folk could shake
their fist at the poor treatment they receive from older people. I wonder if
this author is also trying to go viral in a similar sort of way. I feel like
it takes a particular skill to write these sorts of things - it's gotta be
believable that a 22 year old would write this, but also push enough of the
older generation's buttons that it'll inflame them, too.
3) Obviously, there are other possibilities that I haven't thought of that
might be true :)
------
johnwheeler
I question the design sensibilities of any designer who makes such liberal use
of animated gifs.
~~~
whipoodle
I hate this thread
------
angryasian
[https://news.ycombinator.com/item?id=15094804](https://news.ycombinator.com/item?id=15094804)
the last time this was posted
------
fastbeef
None of the points he makes are invalid per se, but the language and tone of
the piece makes him come across as whiny at best, spoiled and entitled at
worst.
~~~
jogjayr
What was wrong with the language and tone? Apart from it being informal and
casual?
It's Medium, not an industry whitepaper. There's no expectation of Serious
Business Writing™ here.
Slashdot Launches Re-Design - Mithrandir
http://meta.slashdot.org/story/11/01/25/163257/Slashdot-Launches-Re-Design
======
jessriedel
I was hoping Slashdot was going to reduce the huge amount of padding around
each comment, but no luck. For instance, on my screen right now I can fit 13
comments on HN or reddit but no more than 5 on slashdot. This makes it _much_
harder to navigate comment threads, connect replies with comments, etc.
Am I the only one who thinks this is easily the worst part of Slashdot? It's
reminiscent of a bad forum where each 3-word quip comes with an avatar, a
custom border, and a 5-line signature.
~~~
moe
The original slashdot was actually very readable. Here's a screenshot:
<http://images.slashdot.org/faq/screen_nested.gif>
Not pretty, but readable and very usable.
The padding and useless decoration that you (rightfully) complain about was
introduced with the first ajax-redesign a few years ago.
------
atgm
I liked slashdot more when it was plain, quick-loading HTML. I liked most
sites more when they were plain, quick-loading HTML. Ain't It Cool News is the
only other one coming to mind at the moment, though.
~~~
jinushaun
Google's Marissa Mayer said it best: The web likes to be square. Rounded
corners and other elements slow down a site.
~~~
julian37
Huh, why would that be? Drawing a round corner is in fact more complex than
drawing a square one, but surely for a few dozen rounded corners the number of
additional CPU cycles spent pales in comparison to things like anti-aliased
font rendering, calculating CSS box model layouts, decompressing gzipped
contents and the myriad of other tasks that a browser has to perform during
loading and rendering of a web site nowadays.
~~~
sanswork
For 4 rounded corners you probably need to download 4 images. That's 4 extra
connections to the server to connect to, download, and process.
It isn't the processor time that is taking longer, it's the transfer time.
~~~
robgough
Most modern browsers (all but IE I believe) will let you round corners with
CSS. An extra line or so of CSS is hardly much of a transfer overhead.
I believe current thinking (on non-enterprise sites anyway) is to let the IE
people have their square corners, and those on new browsers get all the curvy
goodness.
~~~
sanswork
Yes, that is why I added the probably. The vast majority of rounded corners I
see are still done with images however.
------
peng
The old Slashdot design bothered me so much I proposed a minimalist layout
last year: <http://nylira.com/p/slashdot>
Thoughts on this one:
Slashdot's homepage is much cleaner than before, but the thick green bars
denoting each story are visually oppressive. I understand that it's a branding
element, but it hampers readability.
The fixed navigation annoys me, but that's a personal preference: I don't
think the menus are important enough to be constantly on-screen.
I see they still haven't added a max-width to text columns either. It's
difficult reading comments on a 1440px+ screen.
~~~
robgough
Much prefer your clean design, but I'd move the story icon to the whitespace
under the author/submitter info on the left.
Did you design a comment page too?
------
btrask
It's unfortunate that they've jumped on the fixed positioning bandwagon. Non-
scrolling headers/footers break the way Page Up and Page Down work, because
the visible size is no longer the size of the scrollable area.
Luckily there's a user style in the Slashdot comments that changes the fixed
elements to absolute positioning. It isn't a perfect solution, though (e.g.
for when you use an alternate browser).
~~~
kd0amg
_Non-scrolling headers/footers break the way Page Up and Page Down work,
because the visible size is no longer the size of the scrollable area._
It also seems to break their "Parent" links -- it ends up with the top bar
partially covering the comment's header. I've yet to encounter a web site
where a floating title bar is a good idea.
------
andresmh
I was going to make a snarky comment like Slash..who? But I clicked and the
UI looks much better!
~~~
steve918
It definitely looks better, but as with everything new I see on Slashdot and
SourceForge these days it seems a little too late.
~~~
GrooveStomp
It probably looks better, as I remember Slashdot being really ugly, but the
new design is still ugly. The Slashdot header font in particular is an
eyesore, but very little on the site actually looks modern or pleasing.
------
bermanoid
Ugh, all this time, a whole new revamp, and it _still_ suffers from the
problem where hitting "Get more comments" moves everything around, and you've
got to look through everything to figure out what the hell actually just
loaded...
Why would anyone ever think that hitting more should load more comments
_above_ the comment you just finished freaking reading? And yes, I realize
they're loading low ranked deeply nested comments that are underneath ones
that are already visible, but yeesh - just clip the damn comment tree after a
certain number of comments are displayed, FFS, no need to try and be clever
about it...
~~~
wladimir
I might be alone in this, but I actually preferred the old-fashioned
pagination approach to the 'get more articles' buttons. Actually, I might not
be alone in this as HN does the same thing :)
------
zeemonkee
I don't really care so much for design as content. Slashdot looks tired,
whatever design you slap on top of it. For example they've updated the Bill
Gates Borg icon. It looks very dated now, though.
------
benjoffe
These kinds of designs often work badly with flash, as can be seen by the
flash ad on the page: <http://i.imgur.com/qycS8.png>
(flash in most (all?) browsers appears on top of even html regardless of
specific css rules)
~~~
randall
For the record, Flash will respect z-index if you set the wmode param.
<http://www.communitymx.com/content/article.cfm?cid=e5141>
~~~
asnyder
If I remember correctly, setting the wmode param slows down flash performance
significantly. I also remember it having significant trouble in opera.
However, it's possible that these issues are now fixed in the newer browsers,
but I'm pretty sure that older browsers still experience them.
~~~
randall
Def some quirks back in the day (I think FF on Linux still doesn't even listen
to the wmode param, iirc) but I think modern browsers deal with wmode better
now.
------
Duff
This is probably the first good redesign that /. has done this century!
I actually stopped visiting Slashdot because the previous rounds of
"improvement" to the comment system was always a pain in the butt if you had
to switch tabs while typing. It will be good to be back.
------
wazoox
I hope it will work better than the previous one. It was unbearably slow on my
puny work PC and hardly usable on my other machines, and frankly, I don't
think that I should upgrade to quad core because of ONE website.
------
tjansen
I am glad they kept much of the original design, especially the rounded edges
and the green. Yes, it's not pretty, but for me it's like listening to an
oldie station. Slashdot was there in the 90s when you felt like participating
in a revolution for using Linux (or, if you wanted to be even more avantgarde,
one of the BSDs). It helped people get through the time of the Columbine
massacre, was the best place to discuss the Halloween documents, offered
endless First-Post/Nathalie-Portman trolling opportunities, and with Jon Katz
you always had something to talk (or complain) about.
The problem is that the culture did not really evolve. There's nothing that
Slashdot stands for anymore. I wonder whether it would be a good idea for
Slashdot, now that the readers as well as the authors are older, to cover
topics for geeks in their 30s or 40s: homes, families, kids... I believe that
there would be a range of interesting topics, and I don't know any news site
that covers them.
------
yaix
Now it looks like a framed site from good ol' 1997 again. Fixed header, fixed
menu cluttering up your 800x600 pixel smartphone screen, and a small
scrollable area (of which most is white space) with the actual content. Wow,
that went seriously wrong.
------
tzs
It looks uglier than the old version, in my opinion. On the other hand, it
looks like paste now works correctly when entering comments (it used to often
fail on WebKit-based browsers if there was any text already in the box), so
that's a huge improvement.
------
jasonkostempski
I really thought I'd be saying 'digg-a-what?' before I'd ever say '/?' ever
again.
------
easyfrag
My problem with /. was that the stories were not voted in by the community
but rather by a select few curators who added editorial content in the
headlines.
And they're still using the Bill Gates Borg icon?
~~~
recoiledsnake
That could easily be called a feature, looking at how Reddit's focus changed.
Hacker News is able to keep its narrow focus, but it's the exception rather
than the rule.
You're right about the editorial content in the headlines and summaries, but I
like that the stories have a small blurb explaining the story, unlike Reddit or
Hacker News.
------
invertedlambda
Slashdot? I read HN.
~~~
ubernostrum
Slashdot was here before HN.
It is likely Slashdot will still be here when HN is gone.
There's a lesson in that.
~~~
cabalamat
> _It is likely Slashdot will still be here when HN is gone._
My gut feeling says no. I used to read Slashdot as much as I now read HN or
Reddit. I hardly ever go there any more, and when I do it just doesn't seem
interesting.
For me, Slashdot lacks the buzz it used to have.
------
VMG
Still no Unicode
------
JonnieCache
Same basic design but like half the number of DOM elements. Well done for
doing the right thing :)
------
halo
I like the redesign. Have they fixed the unusably bad commenting system?
------
risotto
This looks really great. Kudos to the designers and programmers.
------
jonhendry
The Apple category seems to be gone from the front page.
------
mahmud
Looks like CRAP in Opera. Sheesh.
------
fleitz
Does the redesign involve removing most of their userbase? There's some kind
of aversion to people charging money on that site that just doesn't jive with
me. I assume that HN will at some point reach the critical mass that killed
slashdot for me. Hopefully that day will be far far in the future.
------
jeberle
Looks like a trainwreck w/ K-Meleon, plus it chews major CPU cycles. Ah, the
web...
------
leon_
I don't like the feel to it. When scrolling it feels sluggish and the screen
seems to flicker.
~~~
kaiwetzel
I noticed the same (i7-920, FF 3.6) - after removing the fixed position bar
(using firebug) the site at least seems significantly less sluggish. Weird how
a seemingly simple thing can hurt performance so much (or maybe it's just a
psychological effect ?)
On a large screen I find it visually more pleasing than the last version, but I
stopped using the site completely after they introduced that silly ajax abuse,
and the new one wouldn't make me come back, for that reason alone.
Boost Your WiFi Signal Using Only a Beer Can - pwg
http://dsc.discovery.com/gear-gadgets/boost-your-wifi-signal-using-only-a-beer-can.html
======
Mavrik
How does that work on MIMO routers? ;)
Firefox now 19% market share, eating more of MSIE's lunch - fromedome
http://www.alleyinsider.com/2008/7/mozilla-gets-its-guinness-record-firefox-gains-market-share
======
mattmaroon
After spending the better part of a year working with web design, I can only
pray that this trend continue.
------
bprater
Although impressive, it still has a long way to go before being considered
dominant.
A good question might be: why hasn't it gained more popularity over the last 5
years?
~~~
sdpurtill
Simple answer: big companies haven't switched to FF. If they started
switching, their employees would go home and switch too. Until then, it will
be hard to become a majority.
~~~
lkozma
Simple answer 2: many people don't know or care what a browser is. For them,
using the internet is clicking the blue 'e' icon that is there by default.
Ask HN: Worked for linode? - a_lifters_life
Please share your experience
======
db7a11196
Yes.
Don't.
~~~
a_lifters_life
Why?
We use too many damn modals (2018) - juancampa
https://modalzmodalzmodalz.com/
======
bedatadriven
I was just about to share this with our designer, who is always talking me down
from modals, when I reached the bottom and discovered that our designer is in
fact the author.
I seriously can't say enough for this message. Adrian helped us redesign our
app 2 years ago, dispatching many many modals in the process, and delivering a
huge increase in usability that our customers and users very much _do_ care
about.
~~~
prox
I implore everyone to read the fundamental book: About Face.
This book covers a lot of UI topics, and in a lot of detail. It’s the bible of
UI design imo.
~~~
mrgreenfur
What's the book title? "About Face"?
~~~
prox
[https://www.goodreads.com/book/show/289062.About_Face_3](https://www.goodreads.com/book/show/289062.About_Face_3)
This is a link, you can also search “about face interaction design”
------
l0b0
There's a very simple rule of thumb: must the user give some feedback before
it makes any sense to use the rest of the site? If so, it's fine to use a
modal. Otherwise, don't. Some examples:
Payment page: not a modal. The user might want to add more stuff to the cart,
even when they are literally one click away from purchasing. Blanking out the
rest of the page means they now have to work _around_ the site to buy more
stuff.
Error message: not a modal. The user is vanishingly unlikely to report an
error immediately (obviously it's logged in any case), and probably just wants
to try again with slightly different input.
Irreversible action confirmation: not a modal. For the same reason as the
payment page, the user might want to do something else before retrying the
irreversible action. Just display a _prominent_ message so the user knows the
action is not yet done. Or, if at all possible, implement undo instead.
My job depends on it: OK, use a modal. We forgive you.
------
aylmao
I remember reading somewhere that most (modal) dialogs are just lazy design.
It's easier to have a dialog component and throw whatever notice or controls
in there. It's a predictable and flexible canvas.
Your password is incorrect? Error dialog. Easier to design and build than a
warning state on the text-boxes and a message close to the login button.
Writing a post? Editor dialog. It's easier to design and build than something
that's inline, grows to accommodate what you're writing, and works predictably
when you scroll away or click something else.
There's something the user should know? Alert dialog. Much easier than
figuring out where in the component hierarchy you should inject a warning, how
it should look, what controls you should disable, etc.
~~~
elcomet
Maybe it's lazy but the important question should be: is it good design?
I don't think it is, as it hides some information from the main page. But I
think it can be done well and be useful, as said at the end of the article.
After all, OSes use a lot of modals.
~~~
grumple
The more important question: do the people paying you care if it’s good
design? Because the people who have paid me thus far in my career have not
cared.
~~~
lucasmullens
Design is one of those things people often don't consciously notice, but
affects their overall impression of a product.
I think there's a pretty universal consensus that a good design matters.
~~~
tobr
A lot of people don’t understand design as something that goes beyond the
surface layer, though. They see that someone picked a pleasing color palette
and layout, but maybe they don’t imagine that a designer would come and tell
them to stop using modals.
------
ameyv
I don't want to complain, but the website is harder to read with all that
8-bit-ish font... Maybe a normal post with one example for each idea would be
good enough.
~~~
kinkrtyavimoodh
The irony of a website with a deliberately terrible and gimmicky UI
complaining about a functional UI model that has worked for decades.
~~~
justanotherc
That's kind of what I thought too. Our app is very Modal heavy (including
modal inception), but I can't think of any case where the suggested
alternatives would actually be better. And that's backed up by the fact that
our users regularly tell us how easy and awesome it is to navigate through it.
~~~
winrid
It's one of those things that when you prototype a design without them you
really feel the difference.
------
systemvoltage
I agree about modals. But it seems like the same points can be made about the
design of the website that's trying to teach others about modals. Have you
considered using plain text or a more readable website to get your point across
without resorting to 1-bit retro decoration?
Edit: 1 bit lol
~~~
post_below
I don't love retro design either but that seems irrelevant.
To anyone who's familiar with the topic it's immediately clear from the copy
that the author understands it.
Considering that the target audience is other designers, why does the
presentation matter?
~~~
buzzerbetrayed
> Considering that the target audience is other designers, why does the
> presentation matter
I fail to see the logic here. Good design helps people learn easier. It's not
like designers have some unique ability to learn equally well no matter how
poor the design is.
~~~
post_below
Well context... the information the author is presenting is simple, right
brain stuff. No one with UI experience is going to have a hard time digesting
it.
It's not like he used gifs or flashing colors.
Now I kinda want gifs and flashing colors.
~~~
gindely
That isn't actually true. I think I have UI experience, but I would actually
have preferred some mockup examples instead of 1 bit bmp pseudo-thumbnails to
represent his ideas. Surely it would be better to show some good user
interfaces.
------
x32n23nr
Modals are basically the fancy clone of old school javascript's alert(), and
while everyone would rightfully be annoyed being "alerted" needlessly,
designers have convinced themselves that it's fine to show modals frequently?
Edit: Claiming it's only designers' fault, or generalizing that all designers
do that, is obviously wrong. I did not mean that. What I meant to convey
(partially failed) is that examples propagate, and somehow design choices are
the beginning of this propagation chain.
~~~
pvg
Modal dialogs are a lot older than alert(). alert() is an instance of a modal
dialog, not the other way round.
~~~
seventh-chord
My impression from seeing old Windows versions (never having used anything
older than Vista) was that in general using an extra window instead of a
separate pane inside the window was more common. As far as I can tell, the old
Windows Explorer would pop open a new window for each opened folder. Now it
seems like most programs are run in half/full-screen; they aren't really
treated as windows. IDEs have their own window-layout system anyways, right?
(Modal) dialogs are probably more at home in a world where Windows is more
than just the name of the OS.
~~~
pvg
What I was getting at is that the important thing is the concept (which is
about as old as GUIs), the other stuff is implementation detail. The 'modal'
part of the a modal dialog is that it takes you out of the 'modeless'
interaction state where you can take one of the many UI actions available to
you to a state (mode!) where you can do very few things and nothing else.
Making GUIs less modal is a similarly ancient UI design holy grail.
Applications are modes hence everything from OpenDoc to the just-announced App
Clips. A familiar clunky modern mode is 'native' apps vs 'apps' in browsers -
a big chunk of technologies many programmers work with today are directly or
indirectly related to mitigating its effects.
~~~
gindely
Yeah I think nowadays a lot of people think "modal" means "a dom element
positioned with respect to the viewport that prevents interaction with the
rest of the webpage".
The classical sense of "a collection of related controls that prevents
interaction with the rest of the application (however the collection, controls
and application are implemented)" is surprisingly rare now, particularly since
the older sense was often called "dialogs" (and indeed, "modal" was short for
"modal dialog") - intended to emphasise that this was communication between
the user of the application and the developer of the application to decide how
the application should behave.
Therefore, a _modal_ dialog should be used when the communication couldn't have
happened before and cannot happen later, and the developer of the application
cannot do something intelligent and allow the user to correct it later on.
(For instance, just delete the thing and let them undo it - no interaction
between the developer and the user actually needs to transpire.)
------
recursivedoubts
Let me see if I can articulate this well:
The problem with modals in a web page context is that they go against
the statelessness of the html request/response model. They end up building up
a lot of local state in the client, both model state as well as UI state, that
eventually need to be reconciled with the server. If the user closes the
modal, is the data saved for later? Does the form reset? Does the server know
anything about it? If it is a wizard, were previous steps saved? And so on.
After many years of working on and with intercoolerjs/htmx, I now typically
prefer inline editing and wizards to modal solutions. It fits better with
the web model, allows for proper URLs, etc.
The inline-edit demo from htmx is a good example of something that might be
implemented as a modal by some developers, but works very well as an inline-
edit UI instead:
[https://htmx.org/examples/click-to-edit/](https://htmx.org/examples/click-to-
edit/)
(NB: I was lazy and did not make the URL update as I should have)
~~~
runawaybottle
I’ll try to condense it more. If you play a video game and a modal pops up, it
takes you out of the experience. It’s the same as (hypothetically) me having
to go into my HN profile to find a list of comments I upvoted to then go ahead
and downvote (versus just having the downvote button next to the post).
Can’t teach this stuff, you basically need to be confronted by tons of bad UI
in software, games, physical printed forms, processes, until you get a good
sense of taste.
I can’t _teach_ HN devs that my thumb covers both the upvote and downvote
button on mobile. Can’t teach it.
~~~
ozaark
> Can’t teach this stuff
There is literally a field of science called Human-Computer Interaction that
teaches this stuff.
~~~
gindely
Yes but have you seen software - almost anything that's been developed since
the growth of mobile platforms has basically done the exact opposite of what
that field of science teaches.
This seems to confirm the op's point.
~~~
ozaark
That fault lies solely with the designers and developers of that software.
"Yeah but they didn't know what they were doing and didn't want to learn" is a
more appropriate take.
You can think of it this way: your next-door neighbor made a soapbox car. Does
that qualify them to be your mechanic? Probably not.
------
w_for_wumbo
My biggest issue with modals, is that they never seem to be tested with mobile
devices in mind. So when I'm on my phone, the ability to close it is often off
the screen. The only choice is to leave the website at that point.
~~~
labster
I often find the pattern on websites that I visit the site for the first time
due to a link, read maybe half a paragraph, and then get a modal popup asking
for my email address to subscribe. I've never read any content from them
before, how would I know if I want to subscribe? My phone is also in landscape
mode, so the modal's close button is off the screen, and I can't scroll to it
cause it's an absolutely positioned modal.
So the only logical thing to do is avoid the site in the future, they're more
interested in extracting money from their existing readers than getting new
readers. Sorry, maybe if you had let me read one article I may have liked your
content.
------
ozaark
The key design principle overlooked here is Progressive Disclosure[1], which
modals and dialogs can be very good at delivering.
Progressive disclosure retains user focus on a single task as opposed to
showing everything at once. Accordions have similar function to modals and
dialogs but adding further task controls to an already complex interface isn't
always the best solution.
The author goes on to state that even full screen modals are bad, but what
difference does the user see? If done well the user should still be able to
use the browser back button, escape key, etc to navigate out. In modern
applications, pages can transition from one to the next without a "full page
load" -is that also bad for some reason?
Think of many popular mobile apps like Instacart, Doordash, etc that allow a
user to dive into categories that slide on top of the existing content to give
further controls; is that not ok?
Every element in the DOM can be applied inappropriately but that doesn't shift
the blame to the elements themselves. One could argue an entire dedicated site
that only uses modals based on the misuse of illegible fonts would be about as
apropos.
[1] [https://www.nngroup.com/articles/progressive-
disclosure/](https://www.nngroup.com/articles/progressive-disclosure/)
~~~
andreareina
TFA is not saying all modals are bad, just most. Progressive disclosure is a
fine thing, but that doesn't have to mean modals. Most of the time a discreet
notification that doesn't interrupt the user's flow is what's appropriate; I
think Tidal does a good job here, by highlighting _one_ feature per screen,
and if I don't tap on it to see what it is I'm able to go about my way.
Since we're quoting NNGroup, here's their guidance on modals[1] (emphasis
mine):
1) Important warnings
2) Critical to continuing the _current process_
Most modals are unasked for, not relevant to the user's current needs (no
matter what the dev/marketing might think), and unwanted. Modal to select
filter settings when I've clicked/tapped the 'filter' button? That's a good
use.
[1] [https://www.nngroup.com/articles/modal-nonmodal-
dialog/](https://www.nngroup.com/articles/modal-nonmodal-dialog/)
~~~
ozaark
TFA literally says don't use full screen modals, ever. I disagreed with
examples of that not being necessarily relevant with the way today's
applications can work.
Importantly, the NNGroup article you linked even demonstrates great modal
usage and states "3. Modal dialogs can be used to fragment a complex workflow
into simpler steps." and "4. Use modal dialogs to ask for information that,
when provided, could significantly lessen users’ work or effort."
That's important - modals are not just intrusive pop-ups as many designers and
others in this thread have decided that they are. Again, every element can be
used inappropriately but that doesn't mean the elements are to blame.
~~~
andreareina
You're right about full screen modals, I've seen those done right and "never
use x" is rarely right all of the time. And modals can be done well, but 90+%
of the time I encounter them, they're not. They're not related to the workflow
I'm currently engaged in, they don't save me any time, they don't serve me.
Again, I'm not saying (and I wouldn't characterize TFA as saying) modals _per
se_ are bad. But the priors are such that absent any other information, modals
are suspect.
------
winrid
I hate modals - I'll never add them to products I build.
Edit: To provide a little more substance here - for example in one product of
mine that actually has customers - confirmation is done via inline elements
that slide into the page.
So you want to delete a comment, for example. The delete/cancel buttons slide into
the box for that comment. No weird context switch for your eyes.
~~~
gindely
Since you deny a weird context switch for your eyes, are these actually inline
elements? That would imply the size of the containers is changing. Which would
surely be a weird context switch. But at least I can have no doubt what my
action is about to affect - I get frustrated on sites when it says "you're
about to delete these items" and i have no idea whether it agrees with me
about which items I'm about to delete or not.
But if they're overlaid elements, like a context sensitive menu that contains
two options - delete, cancel - is there any substantive difference between a
menu and a modal? The main advantage of a classic menu over a modal seems to
be that it has an implicit cancel option that is uniformly implemented. But
nowadays, menus are implemented without toolkit support - yay dom, yay - so
even that is not consistent.
~~~
winrid
This is what I mean: [http://imgur.com/a/hY9eM77](http://imgur.com/a/hY9eM77)
Content size doesn't change in my case and I would highly advise against
changing the sizing of elements on the page for something like this.
The buttons are sized vertically based on the content, up to a limit.
~~~
gindely
Yep okay, so I guess that's not inline by my definition (since it overlays the
text). I have definitely seen webpages that change the size of the containing
box to add in the confirmation message and buttons.
I guess it's an in-between case I didn't consider - it's maybe not modal with
respect to the whole app, but it seems to be modal with respect to that one
comment (you can ignore it and do something else with the rest of the program,
but with that comment, you can't even read it).
------
quickthrower2
In what context?
If you design a desktop application, like say Word then modals are more
appropriate than the alternatives in many situations. For a site like Medium,
a modal is an annoyance. If I asked for something (click Settings) then I am
happier with my modal than if I didn't ask (we're going to stuff cookies on
your page anyway, here is our arse coverer).
The underlying message I take is we "use too many things that make it easier
for the coder and harder for the user". Sometimes that's a modal, but
sometimes that's NOT using a modal!
------
js8
Typing this from Ubuntu, Gnome 3 I think, in Firefox, I press Ctrl-S to save
the page, and I get a modal dialog, which cannot be even moved. Sometimes, it
actually obscures something that I cannot read on the screen.
If you have to use a modal element at least allow me to get it out of the way
so that I can read the text under it!
~~~
yencabulator
Every time this happens in any environment, the information I need for
deciding the filename is below the modal.
------
Kaze404
This is only tangentially related, but one thing I'm absolutely sick of is
rounded corners. I don't know why, and I don't know if it makes any sense, but
opening a page and seeing literally _everything_ with border-radius makes my
blood boil. It feels like they're trying to protect me from sharp corners, as
if they're dangerous. Come on, we're all adults here. We can take a few 90
degree angles.
~~~
gindely
> Come on, we're all adults here. We can take a few 90 degree angles.
I know that 90 degree angles on webpages are not more dangerous for children,
parents, adults, or any other known subcategory of humans than for any other
known subcategory of humans, but ... lots of people who use webpages and dev
tools are actually children.
Didn't a lot of us get started as kids?
Have we collectively forgotten the olden days, when to print something, the
solution was to ask the neighbor's next door kid? (who was probably you).
Printers aren't any easier to use today than they were back then, but I guess
that practice is over nowadays.
~~~
Kaze404
Fair enough, but some of the worst offenders of this aren't really targeted at
children or something they'd be interested in. The new Github UI (which I
adore, besides this particular issue) is a really bad offender for example. If
you go to your profile page, there's not a single not-round border. The
exceptions are the little squares showing activity, which makes it even worse
because in the caption underneath they _are_ rounded. What happened there?
------
seph-reed
Modals are a good way to show complex content without throwing away your
current render. Whatever you were doing before the modal came up, you can go
straight back to that. No re-draws, no scrolling back to the place you were
at.
Ofc we've all seen them done terribly.
------
justanotherc
Modals are good for when you need to show a new blocking state without losing
the current state. There are many valid reasons for that scenario. Even for
modal inception.
Use them for that purpose, no need to swear them off completely.
------
grishka
The one UX anti-pattern that needs to die for sure is what this site calls
"self-spawning modals". They're not only on websites — iOS is very annoying
with them for example.
------
qntmfred
I've been preaching against modals and carousels for years. I'm a big fan of
[http://shouldiuseacarousel.com/](http://shouldiuseacarousel.com/) and while I
had long ago intended to create a similarly implemented site that
demonstrates the pitfalls of modals with a series of modals, this site does a
pretty good job of highlighting the main issues and alternatives too
~~~
klyrs
Wow, that's almost useable when you don't have to wait for a new ad to load
with every click...
------
davedx
Hah! I often have the experience that indeed a modal should actually have been
a new page, but I was too lazy to build the thing so I went with the modal.
Then later the modal needs more options or fields, and I end up having to
convert it to a new page anyway... lesson mostly learned now. Agree with the
article.
------
saagarjha
One thing I like to do is preview a link by Force clicking on it in my browser
and scrolling through it–the preview window lets me scroll, but not interact
with the page. For many sites this is quite convenient, because I can skim the
page without being interrupted. But when modals pop up they cover the content
and I can't read the page because I can't dismiss them! So if you're looking
for another reason to drop modals, here's one.
Also, hilariously this website uncovers a bug in Safari where the cursor
doesn't reset once you leave the page bounds, so I have a huge pixellated
cursor hanging around until I click somewhere else :P
~~~
gindely
What does that mean? Force clicking? You apply so much pressure that it breaks
a touchscreen/touchpad so that it makes a click noise when you tap?
~~~
saagarjha
In case that wasn’t a joke: [https://support.apple.com/en-
us/HT204352](https://support.apple.com/en-us/HT204352)
~~~
gindely
Oh, another of Apple's hidden features. Once upon a time, one button mice were
a feature so that features weren't hidden and even beginner users could
quickly become power users.
~~~
saagarjha
This used to be three-finger tap on older Macs.
------
mlonkibjuyhv
There are so many bad modals in mainstream desktop software. Is there any good
reason that an open file dialog completely disables all other interaction?
What if I quickly have to task switch, or a piece of info relating to which
file I wanna open is just off screen?
A personal favourite is osx Preview and the rename and move file drawer you
can get from the status bar of an open file. That guy can't even be moved, and
perfectly covers up the portion of a document likely to contain info relevant
to a filename.
------
tbirdny
I'm reminded of a quote in Inside Macintosh Volume 1: "But, gentlemen, you
overdo the mode." From John Dryden, The Assignation, or Love in a Nunnery,
1672. That quote really stuck in my head.
------
jaequery
I must be in the minority as I love modals.
------
nameoda
The listed reasons why modals are bad are all very subjective. A little more
explanation would have helped make the case that modals are bad.
Not to mention that what should be a list is displayed as a table (-‸ლ)
~~~
diegof79
I agree, the site visuals and short messages don’t explain well why they are
bad.
I’ll try to fill that gap.
Modals switch the application’s normal mode to get your attention. So they
interrupt your “flow”.
Most of the reasons to use them could be resolved by other means that are
better in terms of UX but require more work in both design and
implementation:
- Confirmation modals can be substituted with auto-save, undo, and restore,
but that is more complex to implement.
- Modals to show more info can be resolved with progressive disclosure or by
improving the information architecture.
However, like with many design choices there are cases where modals are a good
option. For example, for certain confirmations and navigation the advantage of
the modal is that the backdrop makes it clear where you go after dismissing
the modal (bottom sheets on mobile are an example).
Another reason why modals get a bad rep is that they have all sorts of
implementation issues on the web, and on the desktop the mode change for a
window is not prominent (macOS solves that with sheets, but Windows still has
that problem)
~~~
Enginerrrd
They also break zoom and other features some people rely on a lot.
------
slezyr
The website is readable only at 30% zoom (couldn't zoom out any further) on a
4K 27" display. It also messed up my scroll wheel in some way and broke the
mouse cursor.
------
627467
Modals can be a lazy (and worse, wrong) solution. But I've found that
pavlovian bashing of modals is just as lazy and unproductive.
> And remember to always ask, kids: “Why does this have to be a modal?”
In my experience people don't need to be reminded to ask this. And generally
people can't justify why it shouldn't be a modal or why a non-modal solution
is better than a modal one.
------
ascotan
Back when js frameworks became a thing, modals were a shiny ball that everyone
wanted to put into their app (because they came with the framework). Designers
that worked with these modals became conditioned to think that this is how a
UI should look. Like an old hairstyle, they're now thankfully going out of
fashion.
------
discordance
I've been using Azure a lot, and they use blades (essentially horizontal
accordions) instead of modals.
I was playing with Alibaba Cloud's UI and it's all modals and feels much much
nicer to use. I wouldn't use Ali Cloud due to Chinese ownership issues, but I
do like their UX.
------
imtringued
Eclipse uses modals in the worst possible way. Usually you want to copy the
name of a type or variable into a modal but you are already into two layers of
modals. You will have to close all of them and lose progress. Using more than
one layer of modals is always bad design.
------
farouqaldori
The awful readability of that website must be satire, right?
------
Areading314
The reason modals are so popular is because they work really well when trying
to convert views to signups.
------
pablosca
Love this a lot! The author is right on most (well, all) of the statements.
------
renewiltord
Very easy to use a modal if I’m editing other parts of UI.
------
jevgeni
I have a rule: if I get an unnecessary modal, I leave.
------
jerry40
Oh, I thought the subject was about modal verbs
------
triyambakam
Is this website considered "brutalist"?
~~~
WorldMaker
In the web sense of "true to web form" brutalist? Not really, for instance the
tables are not made to look like TABLEs and are in fact ULs and such. This
design aesthetic might better be described as "(Mac) Classical". (Brutalist
isn't only about minimalism.)
------
Kagerjay
the worst thing is doing a modal that calls a modal, that's the worst offender
of them all
------
6510
also see: popups
------
supernova87a
My main gripe with modal windows is the name. The first couple times some
developer told me that's what she was using, I waited for her to explain what
modal meant. But she did not, as if it was standard self-explanatory
terminology. Way to name something a word that sounds like it has a standard
(even deeper) meaning, but does not.
With time, I have just accepted that this is what modal means (I'm going to
create a window of this type to display the data/filters), disconnected from
any meaning of the term "mode".
But I still don't like it.
~~~
aasasd
It's a term from the UI design world, and it goes way back. A mode is a state
where the app accepts certain inputs from you, the user. A new mode means
previous buttons and shortcuts are inaccessible, and the same input may
produce different results. Either coincidentally or directly related, Vim's
modes illustrate the idea well—in contrast to editors where input, motion and
shortcuts for extra functions are available together.
[https://en.wikipedia.org/wiki/Mode_(user_interface)](https://en.wikipedia.org/wiki/Mode_\(user_interface\))
and
[https://en.wikipedia.org/wiki/Modal_window](https://en.wikipedia.org/wiki/Modal_window)
The trouble with elaborate modes is that the user has to mentally keep track
of which mode they're in (which is what the term describes); and that other
functions are inaccessible even though they might be useful to resolve the
situation. However, modal dialogs are a streamlined and visually distinct
extreme of modes, with an established history, so some don't consider them
problematic. (Modes that are difficult to notice are still a no-no.) OTOH
dialogs are frequently used as cheap bail-out by programmers, especially in
desktop interfaces—since they're very easy to produce, completely synchronous,
and shift all problems onto the user.
~~~
supernova87a
Thanks for that explanation.
I guess I just have to live with it when I pop in and out of the UI coding
world. Just like in Python a .method isn't what the natural word typically
means, but just has gotten a domain-specific meaning attached to it in that
world.
Quiz: How's your maths? I got 10/10 - RiderOfGiraffes
http://news.bbc.co.uk/1/hi/magazine/3513187.stm
======
rsheridan6
That was too easy for HN. I was expecting something I didn't know in seventh
grade - and I'm no math genius, not by a long shot.
------
RiderOfGiraffes
Following up from the English quiz,
<http://news.ycombinator.com/item?id=463745>
here's an older quiz. It's mostly arithmetic, with some "general knowedge"
sort of stuff.
There's apparently another "maths" quiz coming next week.
Please don't post spoilers.
------
sharkbrainguy
I call cultural bias on 5 and 6 (aka the two I got wrong).
~~~
RiderOfGiraffes
Absolutely. And number 1 is knowledge, not mathematics.
There's an interesting discussion to have here over what is, and is not,
mathematics (versus, say, arithmetic), and just how much "real mathematics"
hackers should know.
Care about H1B quotas? Call your congressman - maalyex
http://www.contactingthecongress.org/
======
maalyex
The grass roots tech community has had success in the past shaping
legislation. Whatever your thoughts are on H1B quotas, let your elected
representatives know.
Add Undo and Redo to Your Web Application With Cappuccino - dawie
http://www.thinkvitamin.com/features/ajax/add-undo-and-redo-to-your-web-application-with-cappuccino
======
thomasmallen
Is anyone here using this framework?
F8 Facebook Developer Conference - djug
https://fbf8.com/
======
jpgvm
I don't know what it is but I really like the design of this site.
That out of the way it will be interesting to see what comes out of F8 this
year, specifically in relation to Parse - I would be expecting big
announcements, especially considering Google acquired FireBase earlier
this year.
Memory by the Slab: The Tale of Bonwick's Slab Allocator [video] - snw
http://paperswelove.org/2015/video/ryan-zezeski-memory-by-the-slab/
======
bcantrill
This paper brings back many great memories -- it's one of those that I can
remember the cafe I was in when I first read it -- and Ryan has done an
excellent job capturing its importance and influence. Speaking personally, the
influence Bonwick's work had is substantial: it was in part as a result of
being inspired by this paper that I ended up working with Jeff for the next
decade on systems software.
In terms of follow-on work, Ryan mentioned the later libumem work[1], but it's
also worth mentioning Robert Mustacchi's work in 2012 on per-thread caching in
libumem[2]. And speaking for myself, I am indebted to the slab allocator for
my work on postmortem object type identification[3] and on postmortem memory
leak detection; both techniques very much relied upon the implementation of
the slab allocator for their efficacy.
Thanks to Ryan for bringing broader attention to a terrific systems paper --
and truly one that I personally love!
[1]
[https://www.usenix.org/legacy/event/usenix01/full_papers/bon...](https://www.usenix.org/legacy/event/usenix01/full_papers/bonwick/bonwick_html/)
[2] [http://dtrace.org/blogs/rm/2012/07/16/per-thread-caching-
in-...](http://dtrace.org/blogs/rm/2012/07/16/per-thread-caching-in-libumem/)
[3]
[http://arxiv.org/pdf/cs/0309037v1.pdf](http://arxiv.org/pdf/cs/0309037v1.pdf)
------
tkinom
With C/C++, one can get A LOT OF performance by writing custom memory
allocator that fit certain usage patterns for large scale App.
I designed a custom allocator, before new/delete operator overloading, for a
C++ OO app. You can think of the app like MS Word: when you open/create a new
doc, you need a lot of malloc() calls. In my case it was usually between a few
million and a few tens/hundreds of billions of records.
There was a lot of overhead with standard new/delete. After profiling, I ended
up writing my own allocator with the following properties:
* It mallocs 1, 2, 4, 8, 16, 32, 64 MB at a time (progressively increasing, to optimize the app's RAM footprint for both small and large doc use cases).
* All the large block alloc()s are associated with the "Doc/DB".
* When the Doc closes, only freeing a few large blocks is needed. This change made the doc/db close operations go from 30+ seconds for a large Doc/DB to less than 1 second.
* I later modified the allocator to get the large block memory directly from an mmap() call. All the memory returned is automatically persistent. The save operation also went from 30+ seconds for a large multi-GB DB to < 1 second. (Just close the file and the OS handles all the flushing, etc.)
Without the ability to customize the memory allocator + pointer manipulation, I
can't figure out how to get similar performance for a similar type of large
scale app with Golang, Java, etc.
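To make the shape of that concrete, here is a minimal sketch (not the poster's
actual code; the class and member names are made up) of the kind of
per-document block allocator described above: grab progressively larger blocks,
hand out records by bumping a pointer, and free everything at once when the doc
closes. The mmap()-backed persistence part is omitted.

    // Sketch of a per-"doc" arena: a handful of big blocks instead of
    // millions of individual malloc()/free() calls.
    #include <cstddef>
    #include <cstdlib>
    #include <new>
    #include <vector>

    class DocArena {
    public:
        ~DocArena() { close(); }

        // Hand out 'size' bytes (8-byte aligned) from the current block,
        // grabbing a new, larger block from the system when it fills up.
        void* alloc(std::size_t size) {
            size = (size + 7) & ~std::size_t{7};
            if (blocks_.empty() || used_ + size > capacity_) grow(size);
            void* p = static_cast<char*>(blocks_.back()) + used_;
            used_ += size;
            return p;
        }

        // Closing the doc frees a few large blocks rather than every record;
        // this is where the "30s -> <1s" close time comes from.
        void close() {
            for (void* b : blocks_) std::free(b);
            blocks_.clear();
            used_ = capacity_ = 0;
            next_block_ = 1u << 20;  // start over at 1 MB for the next doc
        }

    private:
        void grow(std::size_t min_size) {
            std::size_t block = next_block_;
            while (block < min_size) block *= 2;
            void* p = std::malloc(block);
            if (!p) throw std::bad_alloc{};
            blocks_.push_back(p);  // any tail of the old block is simply wasted
            capacity_ = block;
            used_ = 0;
            if (next_block_ < (64u << 20)) next_block_ *= 2;  // 1, 2, 4, ..., 64 MB
        }

        std::vector<void*> blocks_;
        std::size_t used_ = 0, capacity_ = 0;
        std::size_t next_block_ = 1u << 20;
    };

In practice you'd wire something like this up through overloaded operator
new/delete on the record types (or a custom STL allocator) so the rest of the
code doesn't change; swapping std::malloc for a file-backed mmap() is what buys
the "save is just closing the file" behaviour described above.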
------
mwcampbell
Perhaps this is naive, but it seems to me that a general-purpose allocator, as
Ryan defines it, is a solution to a problem that doesn't have to exist in
principle. The allocator just needs to be able to move memory blocks around,
so it can compact allocated memory. Probably the most natural way to enable
this in C-ish languages is to use relocatable handles, as implemented by the
original Macintosh memory manager and the Windows GlobalAlloc function (when
using the GHND flag). In this scheme, the allocator doesn't return a pointer,
but a handle, which has to be locked to retrieve a pointer whenever that piece
of memory is being used. As long as most handles are unlocked most of the
time, the memory manager is free to move blocks of memory around when
necessary. In Ryan's parking-lot analogy, it's as if all parked cars could be
moved around the lot at any time to avoid fragmentation. So I wonder if the
inventor of libc's malloc was aware of relocatable handles and simply chose
not to use them. Are there good reasons why we don't use relocatable handles
in C/C++ today? Or is it just the weight of arbitrary history?
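For anyone who hasn't seen the pattern, here's a rough sketch of the idea,
modelled loosely on the classic Mac / GlobalAlloc scheme described above but
not their real APIs (all names here are invented): the allocator hands out
handles, lock() pins a block and gives you a pointer, and compact() is free to
slide any unlocked block around.

    #include <cassert>
    #include <cstddef>
    #include <cstring>
    #include <vector>

    // Toy handle-based heap: blocks live in one big buffer, a table maps
    // handles to offsets, and unlocked blocks can be moved during compaction.
    class HandleHeap {
    public:
        using Handle = std::size_t;

        explicit HandleHeap(std::size_t bytes) : heap_(bytes) {}

        Handle alloc(std::size_t size) {
            assert(top_ + size <= heap_.size());
            blocks_.push_back({top_, size, 0});
            top_ += size;              // bump-allocate; compact() reclaims holes
            return blocks_.size() - 1;
        }

        void free(Handle h) { blocks_[h].size = 0; }  // tombstone; no reuse in this toy

        // Lock pins the block and returns a raw pointer; unlock releases the pin.
        void* lock(Handle h)   { ++blocks_[h].locks; return heap_.data() + blocks_[h].off; }
        void  unlock(Handle h) { --blocks_[h].locks; }

        // Slide live, unlocked blocks toward the bottom of the heap. Locked
        // blocks stay put -- hence "most handles unlocked most of the time".
        void compact() {
            std::size_t dst = 0;
            for (auto& b : blocks_) {
                if (b.size == 0) continue;
                if (b.locks == 0 && b.off != dst) {
                    std::memmove(heap_.data() + dst, heap_.data() + b.off, b.size);
                    b.off = dst;
                }
                dst = b.off + b.size;  // blocks stay in address order in this sketch
            }
            top_ = dst;
        }

    private:
        struct Block { std::size_t off, size; int locks; };
        std::vector<char> heap_;
        std::vector<Block> blocks_;
        std::size_t top_ = 0;
    };

The cost that presumably kept malloc() pointer-based is visible right in the
sketch: every access goes through a lock/unlock pair (or at least a handle
dereference), and a stray pointer kept across a compact() is instantly
dangling.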
What are potential disadvantages of functional programming? - tosh
https://www.reddit.com/r/compsci/comments/6pnq7e/what_are_potential_disadvantages_of_functional/
======
surement
Biggest disadvantage for me comes when trying to implement a complex algorithm
where the paper uses pseudocode that's imperative (which is almost always). In
the best case you can implement parts of it functionally but when it comes
time to put these together you will likely use some kind of loop, but that's
not possible in "purely" functional languages.
Otherwise -- and in particular when coming up with algorithms -- I find it
preferable for languages to be efficient with functional idioms (recursion,
map, filter, etc.).
Edit: I understand that anything's possible in a functional language that's
possible in other languages. My point is that conversion from imperative to
functional can be complex and might not be worth the trouble, and that doing
it purely functionally might defeat the purpose: it will be nearly impossible
to read or incredibly inefficient.
~~~
eropple
If what you care about is "everything is a beautifully curried function that
actually doesn't do anything", the concern about loops is definitely a thing
(but any loop can be turned into a recursive function so long as your stack is
big enough, even if they're usually hideous). But pretty much everybody I've
ever worked with considers functional purity in the logic layer to be
"functional programming" at its most useful (Gary Berhardt's functional-core,
imperative-shell talk is a good one). Is it easier to write with a loop? _Then
use a loop!_ If that loop is contained in a pure function that has no side
effects and encapsulates something that isn't the most beautiful crystalline
code formation anybody ever did see, _nobody cares_.
It's the mindset, not the implementation, that actually makes functional
programming so valuable a tool.
~~~
surement
> Is it easier to write with a loop? Then use a loop!
Yes, but, say, in Haskell, that's not an option.
It's useful to note that Haskell is usually touted as efficient, and lack of
efficiency is a #1 concern around functional programming. That concern I
personally find revolting: Haskell is slower than C, yes, but orders of
magnitude faster than many widely used languages.
~~~
nilved
> Yes, but, say, in Haskell, that's not an option.
forM_ [1, 2, 3, 4] $ \n ->
print n
------
amalcon
One of my professors used to say that functional programming makes all
problems equally hard. There's some truth to that. For example, for I/O, most
functional languages fall back on imperative idioms. Those that don't
(Haskell, mainly) basically tell beginners to pretend that this weird thing
(I/O monad) is just an imperative code block. You can figure out how it works
once you're more familiar with the language.
Since doing I/O is a pretty common task, functional languages generally
include imperative constructs to fix this, but these constructs generally feel
tacked on.
The main drawback I see in practice is that functional programming can
encourage people to be more clever than they ought to be. In the worst case,
you can get some of the worst Java-like issues, where there's a function that
returns a function that returns another function that actually does what you
want, even if it's only used in two places anyway. In a more typical case,
you'll get someone who likes to unnecessarily write everything in
continuation-passing style, or jumps through hoops to move recursive calls
into tail position even though the recursive depth never exceeds 5.
If the team has the discipline to just avoid that kind of thing, the only real
downside I see is being farther from the hardware.
~~~
pdimitar
People can shoot themselves in the foot with practically any language and
framework. I'd be hard-pressed to call that a FP-specific problem.
When I am writing Elixir, all the lessons I learned from Martin Fowler and
Kent Beck on refactoring became 50x more useful; I don't go 5 levels deep into
making my code uber clever. I recognize these 5 levels but I only do 1 level
and stop there.
It's very important to get clever only when there is a valid reason to do so.
Many colleagues very easily forget that this is a job and you shouldn't get
carried away; do what you must, do it very well so that your future self
doesn't hate you -- but don't try to do it as if your code is gonna persist
for a millennia on a space probe.
------
banachtarski
Inability to practice hardware sympathy is the biggest drawback for me. With
imperative code, my intuition can be useful in estimating what instructions
the compiler is likely to generate (what loops get unrolled, how registers get
packed, etc). It's still not easy mind you, but with functional languages, the
generated code goes through another battery of passes and abstractions that
obscure this even more.
~~~
wyc
If you go deeper, then we start to see a lot of similarities. Functional
programming strongly resembles circuit logic. Maybe a strange loop?
See also [http://www.clash-lang.org/](http://www.clash-lang.org/)
~~~
banachtarski
I've coded in vhdl and verilog. Working at the gate level is definitely more
functional in nature and has the same problem (it's not obvious or intuitive how
the code you write maps to gate logic). Hence why novices introduce
accidental latches and metastability. I've actually used clash for one non-
trivial project, and there's something interesting there for sure.
------
benjaminjackman
I personally much prefer functional programming, however it has some
downsides.
The biggest practical hindrances I have found are:
1. Performance due to things like allocations due to immutability thrashing
cache, or a language that allocates on calling into closures, or fails to
inline them properly, can be significantly worse depending on the environment
you are writing your code in.
2. It's less common therefore there is a higher chance other programmers are
going to struggle with your code, or that libraries aren't going to be as well
supported.
Take lodash/fp for example. Vanilla lodash has really good @types support in
something like Typescript. For lodash/fp it's an open ticket (last I checked).
This type of stuff happens a lot; there just isn't the same number of folks
writing code in fp style so you don't have the same level of long tail
support.
3. This is maybe a less common view, but in an object oriented style with
code-complete, usually discoverability is better. Typically the IDE has a
better context of what you could be doing when chaining method calls with a .
rather than having the whole world of methods at your disposal when going for
a code complete. For example when writing "abc".r<TAB> compared to r<TAB> the
IDE is going to have significantly narrowed down scope to suggest in (methods
that are legal on string). This can be very useful and I suspect narrows the
breadth of the space the programmer must conceptually keep in their head from
token to token. It's a big reason why things like extension methods, which
really don't have to exist (they could easily just be static helper methods)
are frequently a highly requested addition to a language.
4. Somewhat related, and there are ways around this, I prefer a threaded /
piping style as opposed to composing methods together. e.g. `(1 + 1) / 2` or
`(->> 1, (+ 1), (/ 2))` as opposed to `(/ (+ 1 1) 2)` To me I feel the code
reads better from left to right as opposed to inside-out. But that is possibly
(probably?) a result of being an imperative programmer for so long before
switching to functional programming. Also I feel like I am playing around with
a lot more parens matching and jumping backwards for something like f(g(h(x)))
as opposed to h(x).g().f(). Again this could be my imperative programming
upbringing leaking through though.
~~~
guelo
Very much agree with your point 3. It's really a namespace issue with classes
providing a natural division of the namespace. Also OO best practices over
time have encouraged more modularization and encapsulation with heavier use of
restricted visibility. At least in my experience functional code tends to
pollute the global namespace with a bunch of not so obvious functions.
~~~
TheNoseReaper
It depends on the language but you often have the option of grouping functions
accepting a specific input type in a dedicated module, and using them
qualified.
------
moron4hire
It's surprising to me to see how few discussions of Functional and/or Object
Oriented programming end up even mentioning the Expression Problem:
[http://wiki.c2.com/?ExpressionProblem](http://wiki.c2.com/?ExpressionProblem)
The expression problem is a new name for an old problem.
The goal is to define a datatype by cases, where one can
add new cases to the datatype and new functions over the
datatype, without recompiling existing code, and while
retaining static type safety (e.g., no casts).
FP and OOP are two axes in a space of problems around building and extending
programs. To be "purely functional" or "strictly OO" is basically painting
yourself into a corner that you will eventually learn to regret.
Eli Bendersky wrote an excellent piece on it:
[http://eli.thegreenplace.net/2016/the-expression-problem-
and...](http://eli.thegreenplace.net/2016/the-expression-problem-and-its-
solutions/)
~~~
anaphor
I was planning on mentioning it as soon as I saw this post as well. I think
most people who try using a functional programming language that encourages
you to use sum types (at least when you're beginning) run into this in some
way, but most people probably aren't aware that it has an actual name.
Here's a post on solving it using type classes in Haskell, if people are
curious! [http://koerbitz.me/posts/Solving-the-Expression-Problem-
in-H...](http://koerbitz.me/posts/Solving-the-Expression-Problem-in-Haskell-
and-Java.html)
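For readers who don't want to click through, here is a minimal sketch of the
type-class flavour of the idea (my own, not copied from the linked post): each
case is its own type and each operation its own class, so new cases and new
operations can both be added without touching existing code.

    -- each constructor is a separate type
    data Lit = Lit Int
    data Add a b = Add a b

    -- one operation, open to new cases
    class Eval e where
      eval :: e -> Int

    instance Eval Lit where
      eval (Lit n) = n

    instance (Eval a, Eval b) => Eval (Add a b) where
      eval (Add x y) = eval x + eval y

    -- a second operation added later, without recompiling the types above
    class Pretty e where
      pretty :: e -> String

    instance Pretty Lit where
      pretty (Lit n) = show n

    instance (Pretty a, Pretty b) => Pretty (Add a b) where
      pretty (Add x y) = "(" ++ pretty x ++ " + " ++ pretty y ++ ")"

    main :: IO ()
    main = do
      let expr = Add (Lit 1) (Add (Lit 2) (Lit 3))
      print (eval expr)        -- 6
      putStrLn (pretty expr)   -- (1 + (2 + 3))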
~~~
jfoutz
I was going to link the same article!
I think the interesting part is the similarity between Haskell and Clojure.
Essentially programming to very small classes/protocols. The linked article is
cool, because I think the Java approach would generalize to the OP's C++
solution, and be able to eliminate that gross dynamic cast.
------
contingencies
_Immutability: The property of functional programmers that prevents them from
shutting up about pure functional programming._ \- @raganwald
_Computation is not just about functions. If computation were about functions
then quicksort and bubble-sort were the same because they're computing the
same function. As I said a computing device is something that goes through a
sequence of states and what an assignment statement is doing is it is telling
you here is a new state, and also there's the notion of it's non-determinism,
so the new state is not a function of the old state. So functional programming
in a sense - functions - don't solve the problem of programming._ \- Leslie
Lamport
Perhaps functional programming is best for S-Programs and P-Programs...
_S-Programs are programs whose function is formally defined by and derivable
from a specification. P-Programs [are problems] that can be precisely
formulated but whose solution must inevitably reflect an approximation of the
real world. E-Programs are inherently even more change prone. They are
programs that mechanize a human or societal activity._ \- Meir M. Lehman,
Proceedings of the IEEE, Vol. 68, No. 9, September 1980.
... from my fortune clone @
[http://github.com/globalcitizen/taoup](http://github.com/globalcitizen/taoup)
~~~
charlysl
In the same spirit:
_The nice thing about declarative programming is that you can write a
specification and run it as a program. The nasty thing about declarative
programming is that some clear specifications make incredibly bad programs.
The hope of declarative programming is that you can move a specification to a
reasonable program without leaving the language._ \- _The Craft of Prolog_ ,
Richard O'Keefe (1990) (copied from CTM)
------
0xbear
The main disadvantage is that the next guy/gal will have to rewrite it all in
Java - the language he/she actually can read.
~~~
blain_the_train
And then the next person will rewrite it in Python. It would be faster and
less costly to just learn the language...
~~~
0xbear
Rewriting a large codebase in Python (or any dynamically typed language) would
be a mistake.
~~~
blain_the_train
That's true, if you wanted a _large_ codebase you should stick with Java.
Only half joking.
I like to think the goal is to create a working system. I haven't seen
sufficient evidence that statically typed languages as a category are more
likely to produce working systems than dynamically typed ones. Feel free to
link anything you have on the subject. I would take the time to read meta
studies on the subject though, many of which seem to conclude there isn't much
evidence either way.
I do believe that "Discovered" languages are more capable than "Invented"
ones. Phillip talks a bit about the difference here:
[https://www.youtube.com/watch?v=V10hzjgoklA&t=2176s](https://www.youtube.com/watch?v=V10hzjgoklA&t=2176s)
------
warcher
Optimizing runtime performance can be a little tricky and nonintuitive.
Debugging can similarly be tricky-- inspecting internal return values is not
super fun.
------
lz400
For me one of the biggest hurdles is using immutable data structures all the
time. In the real world you want to add to a list or change a map's value.
Typically functional programming languages use linked lists to make h::t
efficient, but if you need the performance characteristics of arrays and good
performance/low GC you are going to have a lot of impedance mismatch with the
typical functional programming patterns based on built-in lists.
~~~
eyelidlessness
> In the real world you want to add to a list or change a map's value.
Having spent a good amount of time doing production FP, and then taking what I
learned into imperative work, I rarely find myself doing this. In the real
world, what we usually want is the _result_ of adding to a list or changing a
value in a map. But also in the real world, we rarely want to consider whether
we've affected other things that supplied that list or map (which they
generally would prefer we did not do).
> Typically functional programming languages use linked lists to make h::t
> efficient but if you need performance characteristics of arrays and good
> performance/low GC you are going to have a lot impedance mismatch with the
> typical functional programming patterns based on built in lists.
I can't speak for the wide variety of FP languages out there, but it sounds
like you've been exposed to languages with a poor set of data structures.
Clojure, for example, provides a wealth of great immutable/persistent data
structures that provide the semantics you'd want and perform really well.
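To make that concrete, a tiny sketch (mine, using Haskell's Data.Map as one
example of such a structure): "changing a map's value" just means getting a
new map back, while the old one stays valid and most structure is shared
between the two.

    import qualified Data.Map.Strict as Map

    main :: IO ()
    main = do
      let prices  = Map.fromList [("apple", 1 :: Int), ("pear", 2)]
          prices' = Map.insert "apple" 3 prices   -- returns a new map
      print (Map.lookup "apple" prices)   -- Just 1, original untouched
      print (Map.lookup "apple" prices')  -- Just 3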
~~~
lz400
I've done a bit of functional programming too and don't get me wrong, I love
the FP cultural victory of Java/C# etc. getting immensely useful functions
like map, reduce, filter, lambdas and higher-order functions. But if you are
going to use purely immutable data structures you're going to make trade-offs.
In the kind of work that I do, generating excess garbage or using extra memory
can be unacceptable, so I think I'd struggle with that, but I can still apply
a good % of FP principles with mutable structures of course.
If you read introductory functional programming material, you will agree it is
full of h::t style pattern matching, and the default list types in OCaml,
Haskell and Scala are simple linked lists (I sadly haven't tried Clojure yet).
I'm open to the possibility of advanced patterns with other data structures
overcoming some of these problems that I'm not very familiar with, but that's
what I mean by impedance mismatch: you will need to step away a bit from the
clean elegant patterns of "everything is recursion+pattern matching on lists"
of typical FP teaching.
~~~
gizmo686
And if you do any serious functional programming, you will quickly learn that
using these lists as a data structure is almost always a mistake. A mistake
that happens to be encouraged by the language, but that is largely a historic
wart of languages that are decades old at this point.
~~~
lz400
Sure, agreed, but even if you use more sophisticated immutable structures you
either accept the performance/memory trade offs or you have to fall back on
mutable structures.
~~~
tome
But that's quite a different claim from your original one!
~~~
lz400
Maybe it helps if I explain better. I find it difficult to use immutable
structures if I want certain performance characteristics. It's possible I need
more training in certain techniques, but I think that's why I find it
difficult: the default data structures and introductory literature seem to be
too naive. And in the end, no matter what I use, there will be certain trade-
offs for purely immutable structures, and some stuff I just find easier to
think about in mutable terms.
~~~
tome
OK, that's a reasonable position. It is easy to read your original claim as
"All real world programming requires mutable values", and that is certainly
not a reasonable position.
------
yen223
Functional languages give powerful expressions for proving code correctness.
Imperative languages provide good semantics for reasoning about performance. I
don't see them as rivals, rather as two paradigms operating on different
domains.
Object-oriented languages on the other hand....
------
throwaway7645
Immutability can have performance issues I believe...not sure you'd use it to
write a game engine for a modern FPS.
~~~
eyelidlessness
Persistent data structures can generally overcome a lot of the performance
issues with immutability, but generally at the cost of memory.
~~~
warcher
Ehhhhhh hard ass real time stuff like heavy dsp or a game engine, you're on
the metal and you're pitching your code to the CPU, not to humans. And the CPU
thinks (kind of) imperatively. And the GPU does its own thing which is
basically neither, haha.
Stuff like web apps, or honestly, the vast, vast majority of programming,
functional is just fine. The deep stuff is gonna stay imperative, but that's a
small fraction of the work that needs doing out there.
~~~
throwaway7645
That sums up my thoughts better.
------
whateveracct
I feel that Haskell's learning curve comes with the benefit that the curve
actually goes in a hyper-productive direction. I have ~3 years of professional
experience but with Haskell I've been able to create robust and extensible
systems in deterministic amounts of time on a consistent basis, often working
alone or with other "junior" devs. I think this is because I've spent the last
~6 years constantly learning FP/Haskell, and while it's been hard, every new
thing I learn pushes me to be a more self-sufficient, capable engineer. If I
weren't so lucky and didn't get the opportunity to _only_ work in pure FP in
Haskell/Scala from my first job out of college onwards, I don't think I would
be as capable as I am now.
------
mschuster91
Well I have one: next to no one can read/understand functional programming
stuff, much less write it.
Yeah it might be hyped in academia and in high performance/scaled systems but
the average non-academic coder will never interact with FP code.
~~~
seanwilson
That's because you've learned imperative coding first. If you learned
functional coding first you'd probably find imperative coding complex. I'd
certainly rather code in OCaml than C.
------
daxfohl
Thinking that any variant of "functional programming" stands for all variants?
For me, moving to F# from C#, and then trying to get my team to do likewise,
it's far more like "this encourages good CS101 practices" than anything else.
The only disadvantage is the occasional CPU burp (usually easily resolved).
To the reddit commenter, I'll agree a lot of OO ideals are baked into GUI, and are
actually quite good for it (decorator pattern, etc). Though a lot of GUI
(winforms binding, wpf mvvm (sorry)) conventions paint you into a bit of a
corner.
------
jack9
Is there any example of a usable Graphical UI written in a functional
language?
I've seen something very minimal in erlang, but it was so resource intensive
it barely functioned at all.
~~~
davidcox
How about xmonad? [http://xmonad.org](http://xmonad.org)
~~~
vacri
A colleague of mine who wants to play with haskell tried it out and threw it
in the too-hard basket - apparently to configure it you write haskell in the
config files; not where he wanted to be trying his hand at the language.
------
a-saleh
It depends how deep you want to go. I would say that 'functional' can mean
different things to different people, but usually I see it as the methodical
use of certain programming features over others.
1\. The language supports functions as values, anonymous functions, higher-
order functions, and the ability for functions to capture scope in a closure.
At this base level there are no disadvantages to preferring these, because
they allow some nice abstractions (as evidenced by the many functional libs in
JS).
2\. Using recursion over loops can be a good fit for expressing some
algorithms. It often requires some support for optimisation in the language
(i.e. TCO, so that your stack doesn't explode). In my experience you'd use
higher-order functions like map and fold/reduce most of the time anyway,
instead of dealing with loops/recursion directly.
3\. Pure functions. This is the first interesting trade-off in functional
programming: how much should you use functions that depend only on their
inputs to produce their output (and for the same input the output stays the
same)? There are clear advantages: you can test these well, understand them in
isolation, refactor them more easily, and memoization is almost for free (see
the sketch after this list).
On the other hand, only using pure functions would limit the usefulness of the
language (it is said that the first version of Haskell was only able to do a
transformation from stdin to stdout, because it had just pure functions).
Languages that are big on being 'pure' usually allow you to make impure
functions, but then track their usage in the type system. This is one of those
things that I often think is more trouble than it's worth.
4\. Immutable data structures, which come in handy when you want to create
pure functions that return a modified copy rather than modify in place
(because that might have unintended consequences elsewhere). It takes some
time to get used to these, but their performance is on par with regular data
structures nowadays.
5\. Laziness, where you compute only when necessary. It can make certain
programs more efficient and easier to comprehend, and others harder, IMO.
6\. no global state.
7\. A comprehensive type system is sometimes viewed as necessary for a real
functional language (how else would you check that you don't do side effects
in the wrong places).
For me, 1., 3., 4. and 6. are the ones I consider important, with few enough
drawbacks that I try to use them 80-90% of the time.
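A small sketch of the memoization point in 3. (my own example, in Haskell):
because the function is pure, caching its results can never change behaviour,
and the classic trick of indexing into a lazily built list does it in a few
lines.

    -- naive fib memoized by sharing a lazily built list of results
    fibs :: [Integer]
    fibs = map fib [0 ..]

    fib :: Int -> Integer
    fib 0 = 0
    fib 1 = 1
    fib n = fibs !! (n - 1) + fibs !! (n - 2)

    main :: IO ()
    main = print (fib 60)  -- each fib k is computed once and then shared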
------
seanwilson
The big one for me is a practical one: the current ecosystems around them. I'd
love to code mobile apps and web apps in OCaml but it would just be going
against the grain too much for it to be worth it in the vast majority of cases
I think. Can you easily deploy on most cloud services? Are there good
libraries you can use? Is there a community that can help you when you get
stuck?
It's not a fault of functional languages but that is the issue, it's that the
vast majority of coders and ecosystems are entrenched in imperative languages
still.
I'll use strong static typing where I can though e.g. TypeScript is decent in
terms of practicality.
~~~
warcher
I believe clojure's made a lot of progress porting to android's JVM. Last I
heard the clojure runtime took a couple seconds to boot, which is shitty if
you're real serious about your apps, but not a dealbreaker. And the last time
I messed with it was a couple years ago, I would hope to see improvements in
that amount of time.
~~~
fulafel
For Clojure it seems React Native + ClojureScript is the more popular option.
It has good startup performance and works also on iOS:
[http://cljsrn.org/](http://cljsrn.org/)
------
haskellandchill
Functional Programming is a big tent, too big to be very useful. All good
programming provides some proof of correctness, through test coverage, the
type system, or a bit of reasoning done to the side (worst case). Ease of
reasoning from FP in the large is not very valuable but does eliminate some
cognitive overhead. There are no disadvantages to higher order functions and
referential transparency except those we impose on ourselves with our choice
of hardware. Type systems are where the meat is at, and a good type system
clearly states the advantages and disadvantages in its design.
~~~
seanmcdirmid
Higher-order functions and indirection in general can lead to increased
cognitive overhead. Try reading or debugging a deep point-free functional
expression and it's obvious there are limits.
~~~
darkkindness
In my own (limited) experience, I'd much rather debug a nasty pointfree
expression than debug a deep nest of loops keeping state in mind.
Perhaps it is just different kinds of cognitive overhead.
~~~
seanmcdirmid
How would you even go about stepping through a point-free expression though?
Have Haskell/Scala debuggers improved significantly in this regard?
~~~
darkkindness
I wasn't aware these debuggers existed. I'll have to check them out.
The purpose of writing point-free functions is to encourage thinking about
chaining functions together instead of considering what really happens to low-
level values. It's like using pipelines. This means that for debugging,
"stepping through" wouldn't help you much IMO even if you could do it![0] (A
whiteboard might help, though.)
I'll try to give an example: if you're looking at an unnecessarily point-free
expression like "mapReduce = (. map) . (.) . foldr1", the logic is difficult
to read. I'm not an expert in point-free-fu yet, so in that case I'd recommend
pointful[1], which deobfuscates that expression to "mapReduce x x0 x2 = foldr1
x (map x0 x2)", a map followed by a fold. So the problem is again reduced to
looking how functions are being chained together. For larger examples of
point-free expressions I'd try to isolate individual functions as where
clauses.
[0] As mentioned by cgmg, the Writer monad is useful if you need to know the
values going in and out of functions. [1]
[http://github.com/23Skidoo/pointful](http://github.com/23Skidoo/pointful)
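For what it's worth, a quick sketch (mine) checking that the two forms quoted
above really are the same function; the definitions follow the comment, the
test values are arbitrary:

    mapReducePF :: (b -> b -> b) -> (a -> b) -> [a] -> b
    mapReducePF = (. map) . (.) . foldr1          -- the point-free form

    mapReduce :: (b -> b -> b) -> (a -> b) -> [a] -> b
    mapReduce f g xs = foldr1 f (map g xs)        -- the pointful form

    main :: IO ()
    main = do
      print (mapReducePF (+) (* 2) [1, 2, 3 :: Int])  -- 12
      print (mapReduce (+) (* 2) [1, 2, 3 :: Int])    -- 12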
------
slackingoff2017
Massive hype squeezing functional programming into all kinds of imperatively
shaped holes, code that's going to remind you of your dad's 80's pictures
someday.
------
visarga
Disadvantages: Sometimes it takes more work. Quick and dirty can be easier to
read, if the function is short.
Advantages: You can mix and match. Pure functions are more composable and
easier to test because you don't have to worry about side effects.
------
fooker
Difficult to write a good compiler.
Optimizing the code generated by the frontend for a functional language is a
nightmare.
There are several theoretical reasons why that is the case, so do not expect
significant breakthroughs for general purpose functional programming.
~~~
johncolanduoni
Care to go into any detail?
------
doug1001
Well, I love FP, but the reality is that FP is not even an afterthought in the
libraries that underlie numerical computation--most significantly BLAS and
LAPACK. Those libraries rely entirely on _mutable_ data structures.
It has very little to do with their age--the fact is these libraries are
continuously updated (the last major rewrite of LAPACK was in 2008). What's
more, FP is not new; it was a well-known paradigm when, say, MKL was conceived
(2003), but apparently even the most cleverly designed persistent data
structures (e.g. Scala's Vector) aren't enough.
------
smnplk
No disadvantage. Learn your FP, it's good for you. ;)
------
dboreham
It's fine as long as you use spaces not tabs.
------
youdontknowtho
one disadvantage is all of the people telling you to rewrite it in Rust.
Seriously, why are you such a bad person?
*just kidding. mean no offense.
------
charlysl
I am currently studying the excellent [https://www.amazon.com/Concepts-
Techniques-Models-Computer-P...](https://www.amazon.com/Concepts-Techniques-
Models-Computer-Programming/dp/0262220695), and I believe that this book
answers the questions in the OP very clearly, and although maybe you find it
too theoretical, it does in fact provide loads of practical advice, and is
very readable; not for the faint of heart though ;)
Anyway, just to practice what I've learned so far I will try to answer some of
your questions from the top of my head; apologies in advance for my verbosity:
First of all, let's define functional (in fact, to be strict, declarative;
more on this below):
An operation (i.e. a code fragment with a clearly defined input and output) is
functional if for a given input it always gives the same output, regardless of
all other execution state. It behaves just like a mathematical function, hence
the name.
This gives a declarative operation the following properties:
1) Independence: nothing going on in the rest of the world will ever affect
it.
2) Statelessness (same as immutability): there is no observable internal
state; the output is the same every single time it is invoked with the same
input.
3) Determinism: the output depends exclusively on the input and is always the
same for a given input.
So what is the difference between functional and declarative? Functional is
just a subset of declarative = declarative - dataflow variables
These properties give a functional program the following key benefits:
1) It is easier to design, implement and test. This is because of the above
properties. For instance, because the output will never vary between different
invocations, each input only needs to be tested once.
2) Easier to reason about (to prove correct). Algebraic reasoning (applying
referential transparency, for instance: if f(a)=a^2 then all occurrences of
f(a) can be replaced with a^2) and logical reasoning can be applied.
To further explore the practical implications of all this, let's say that,
given that all functional programs consist of a hierarchy of components
(clearly defined program fragments connected exclusively to other components
through their inputs and outputs), to understand a functional program it
suffices to understand each of its components in isolation.
Basically, despite other programming models having more mindshare (though, as
far as I can tell, they aren't really better understood, and this includes me
;), because of the above properties functional programming is fundamentally
simpler than more expressive models, like OO and other models with explicit
state.
Another very important point is that it is perfectly acceptable and feasible
to write functional programs in languages that are not strictly functional,
like Java or C++ (although not in C; I won't explain why, it's complicated,
but basically the core reason has to do with how memory management is done in
C).
This is because functional programming is not restricted to functional
languages (where the program will be functional by definition no matter how
much you mess up).
A program is functional if it is observably functional, if it behaves in the
way specified above.
This can be achieved in, say, Java, with some discipline and if you know what
you are doing; the Interpreter and Visitor design patterns are exactly for
this, and one of the key operations to implement higher order programming,
procedural abstraction, can easily be done using objects (see the excellent
MIT OCW course [https://ocw.mit.edu/courses/electrical-engineering-and-
compu...](https://ocw.mit.edu/courses/electrical-engineering-and-computer-
science/6-005-elements-of-software-construction-fall-2008/index.htm) for more
on this).
Because of its limitations, it is often impossible to write a purely
functional program. This is because the real world is stateful and concurrent.
For instance, it is impossible to write a purely functional client-server
application. How about IO or a GUI? Nope. I don't know Haskell yet; it seems
they somehow pull it off with monads, but this approach, although impressive,
is certainly not natural.
Garbage collection is a good thing. Its main benefit to functional languages
is that it totally avoids dangling references by design. This is key to making
determinism possible. Of course, automatically managing inactive memory to
avoid most leaks is nice too (but not all leaks, like, say, references to
unused variables inside a data structure, or any external resources).
However, functional programs can indeed result in higher memory consumption
(bytes allocated per second, as opposed to memory usage, which is the minimum
amount of memory for the program to run), which can be an issue in simulators,
in which case a good garbage collector is required.
Certain specialised domains, like hard real time where lives are at stake,
require specialised hardware and software anyway, never mind whether the
language is functional or not.
So, for me, for the reasons above, the take-home lesson so far is:
Program in the functional style wherever possible, since it is in fact easier
to get right due to its higher simplicity, and restrict and encapsulate
statefulness (and concurrency) in an abstraction wherever possible (this
common technique is called impedance matching; a small sketch follows below).
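A small sketch of that impedance-matching idea (mine, in Haskell rather than
the book's Oz): the function below uses a mutable reference internally, but
runST guarantees the mutation cannot leak, so callers see an ordinary pure
function.

    import Control.Monad.ST
    import Data.STRef

    -- stateful on the inside, pure from the outside
    sumST :: [Int] -> Int
    sumST xs = runST (do
        acc <- newSTRef 0
        mapM_ (\x -> modifySTRef' acc (+ x)) xs
        readSTRef acc)

    main :: IO ()
    main = print (sumST [1 .. 10])  -- 55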
Each programming problem, or component, etc, involves some degree of design
first, or modelling, or a description, whichever word you prefer, it is all
the same. There are some decisions you must make before coding, TDD or no TDD.
What paradigm you choose should depend first on the nature of the problem, not
on the language. Certain problems are more easily (same as naturally)
described in a functional way, as recursive functions on data structures. That
part of the program should be implemented in a functional way if your language
of choice allows that.
Other programs are more easily modelled as an object graph, or as a state
diagram (awesome for IO among other things), and this is the way they should
be designed and implemented if possible. But even in this case, some
components can be designed in a functional way, and they should be wherever
possible.
There is no one superior way, no silver bullet, it all depends on the context.
It is better to know multiple programming paradigms without preferring one
over the other, and apply the right one to the right problem.
------
ankurdhama
Higher order functions (taking/returning functions) in practice can lead to
code that is very convoluted.
------
suff
The article describes functional programming in its purist form: 0 side
effects. In practice that simply doesn't exist in any language except Haskell
and maybe variants of LISP. Next is the simple fact that a pattern or
functional idiom has never and will never save your project, so adopting it
for those reasons is simply not practical. Also it is not practical to reduce
your hiring pool by one or more orders of magnitude. No profitable company
you've worked with this week relies completely on functional programming. All
of the most profitable companies in the world have side effects in their code.
Deviate from the path at your own risk.
~~~
RangerScience
> The article describes functional programming in its purist form: 0 side
> effects.
Incorrect, reread the first two paragraphs:
> Of course one can define functional programming so that no local mutable
> state and no side effects are possible, and then point out the obvious
> disadvantages. But that's perhaps a "no true Scotsman" kind of argument. If
> you would define object-oriented programming with the same strictness,
> everything that is not an object and uses mutable values would be forbidden.
> Even something such as y = sin(x), copy-on-write or returning constant
> objects or containers from a function would be off-limits. That is, I am
> asking with a very pragmatic definition of FP in mind!
> What I am wondering is rather, do you observe pragmatic functional
> programming as John Carmack described it here as valuable? And of course
> this question goes also to people who have tried it to some extend, as a
> purely theoretical discussion would be boring. I am interested in good
> examples!
~~~
seanmcdirmid
Pragmatically speaking, most non trivial programs consist of functional and,
yes, even object-oriented styled code. Doing functional programming over those
nounish state-indexed things doesn't make them non objects even if you prefer
to call them "entities" instead, while most OOP programmers use lambdas freely
without any feelings of regret.
~~~
eyelidlessness
The thing about OOP that is avoided in FP isn't calling an entity an "object",
it's the fact that the same object changes over time. It's hard to reason
about "I let this so-called function change a thing and I can't know how or
why". It's much easier to reason about "I gave this function a piece of data
and it gave me the change back".
~~~
seanmcdirmid
Entities externalize state that can vary over time. So it isn't f(g) but
f(g(t)). Time (and hence mutation) really does exist, whether it is implicit
or explicit.
It is only easier to reason about when the data you give the function is
limited. You can always pass the world into the function and take it as an
output, and then it no longer matters. Effect systems are possible with
objects as well; they just aren't a feature of functional languages.
------
IvanK_net
I am genuinely convinced that all such topics have nothing to do with
science. Such a question is more philosophical or religious. Many people will
try to explain their ideologies and beliefs about how things should work.
Each side may try to say what is correct, while nobody has a proof.
Discussing such topics has no scientific meaning; it is a waste of time and it
should be considered clickbait :)
~~~
RangerScience
OP didn't mention science at all. They were asking for real-world experiences,
actually.
So, you're meaninglessly correct. Such questions do not have to do with
science, which is why science was not featured in the question.
~~~
IvanK_net
The subject is scientific, so it sounds like e.g. "what are the disadvantages
of complex numbers". I am not talking just about science. I mean the whole
purpose.
Such a discussion will have no effect on people who truly know what functional
programming is. It also may have incorrect or misleading effects on people who
don't fully understand what functional programming is.
~~~
pharrington
To expand on Ranger's point, the OP's question is specifically requesting
people's anecdotal experiences and opinions concerning applied knowledge.
Science is the repeatable process of discovering and accurately modeling
nature (for varying definitions of "nature").
People reject info on existence of a prob if they object to possible solutions - shawndumas
http://arstechnica.com/science/news/2011/04/politics-self-confidence-trump-education-for-climate-change.ars
======
xbryanx
Yale's Six America's study is a great companion read to this article.
<http://environment.yale.edu/climate/>
It demonstrates six different types of attitudes towards Global Warming and
helps to explain the social dynamics associated with each opinion with a bit
more nuance than the Republican/Democrat divide in the decidedly fascinating
polling research in the arstechnica link.
~~~
shawndumas
link to the report pdf:
[http://environment.yale.edu/climate/files/Knowledge_Across_S...](http://environment.yale.edu/climate/files/Knowledge_Across_Six_Americas.pdf)
------
hugh3
Alternatively, people are more likely to believe information on the existence
of a problem if they like the possible solutions?
e.g. "You have a severe dietary deficiency. It can only be resolved by eating
more chocolate"
------
iterationx
I'll bet it would have gained a lot more traction if Al Gore hadn't
popularized it. You lose 50% of the country right there.
~~~
hugh3
I don't think it's really fair to say that Al Gore popularized it. People were
certainly widely talking about global warming (though usually under the name
"the greenhouse effect") as long ago as I can remember, which is about 1990.
Gore didn't hitch onto the bandwagon until sixteen years later.
The real reason for the left-right divide is that the most widely proposed
solutions always seem to involve more taxes and more government intervention.
They are also, in my opinion, _not_ the most effective way to spend that money
on CO2 emissions reduction.
Ultimately it's a technological problem and demands technological solutions --
we need to find ways to cheaply and efficiently produce the same amount of
energy without burning fossil fuels. The solution is therefore throwing more
money at R&D, not attempting to coerce people into using less energy.
The "cash for clunkers" program, for instance, cost $3 billion. The budget of
the National Renewable Energy Laboratory in Golden, Colorado, which is doing
some really good work, is about $130 million a year.
~~~
kenjackson
While there is a right-left divide, I think it is actually a different divide
than what you suggest, because even R&D solutions proposed by the left are
shunned by the right.
The issue that doesn't come up nearly as much, but that you'll hear if you
talk to hardcore social conservatives, is that this planet is for humans to do
with as we please. And I've been in discussions where they will quote specific
bible verses. Their view is that the secular left doesn't believe in God and
therefore thinks they need to fix a problem that doesn't exist.
------
stretchwithme
I think that's true of many people (responding to the headline).
Many people also accept existence of a problem more easily if they relish the
possible solutions.
For example, our currently developing budget disaster. I find it easy to talk
about because it finally means some finite end to dumb government spending.
But we don't exactly know how far the dumbness could continue or if sanity
will kick in soon or if the die is cast.
And, on the other side, there are people who love government spending who
claim it's not a problem at all.
~~~
orangecat
_Many people also accept existence of a problem more easily if they relish the
possible solutions._
Indeed. Testable prediction: as solar and wind power become more economically
viable, harder-core environmentalists will discover many reasons why they're
unacceptable.
~~~
cydonian_monk
You can test that prediction now. There are rumblings about the damage wind
power does to local wildlife, and deeper questions about how large arrays of
wind turbines change weather patterns by drawing too much energy / increasing
airflow resistance.
Some (random, via quick Google search) sources:
[http://www.livescience.com/7626-wind-farms-change-
weather.ht...](http://www.livescience.com/7626-wind-farms-change-weather.html)
[http://www.gardenridge.net/wind-turbines-changing-
weather.ht...](http://www.gardenridge.net/wind-turbines-changing-weather.htm)
~~~
kenjackson
They note climate change, but to quote:
""The message here is climate change, but that doesn't equal global warming,"
Keith said. "It's possible this would have benefits," by working against the
atmospheric effects of fossil fuel consumption on global climate, he said. "
------
zem
Faced with the choice between changing one's mind and proving that there is no
need to do so, almost everyone gets busy on the proof. -- John Kenneth
Galbraith
How verizon failed to figure out a migration strategy for contacts - smoothgrammer
https://twitter.com/smoothgrammer/status/642026817956769792
======
smoothgrammer
Basically, their developers couldn't write a SQL statement to get data from
the database holding contacts in their 'old' backup assistant application
backend, and move it to their 'cloud' backend. They failed to notify customers
about needing to use the cloud app.
This is a sleeping time bomb, and their customer service has an IDGAF attitude.
This is a great example of how to lose customers, and is a good case study for
people on HN.
As a long term lurker I think people would appreciate this.
Paypal hits 100m accounts, predicts end of wallet by 2015 - danboarder
http://www.thepaypalblog.com/2011/06/paypal-crosses-first-100-million-active-accounts-4/
======
michaelpinto
I think I'd trust anybody but Paypal with my wallet — I once tried to
delete an account and it seemed to take forever. Owning that space is going to
require a company that understands customer service as much as technology.
~~~
danboarder
But who else has the reach? I consider Facebook and Google to be advertising
companies, and would rather have a dedicated financial services company focus
on my transactions. Perhaps Square, as it grows? Or will a decentralized
solution like Bitcoin win at the end of the day?
~~~
michaelpinto
Any company that knows online ecommerce has that potential: The two that come
to mind are Amazon and Apple with iTunes. Also I wouldn't write Facebook and
Google off, for example Facebook started by selling virtual goods and is great
at getting your "real identity".
Memoro: A Detailed Heap Profiler - matt_d
https://epfl-vlsc.github.io/memoro/
======
cjhanks
Thanks for building this, I'll consider it in the future. Though I wish the
fracture in C++ compilation wasn't so frustrating.
LLVM has really helped push the boundary of compiler feedback. But still, GCC
wins consistently in my performance experiments.
These days it seems you have to realistically target both and their sanitizer
capabilities. Of which, both have drastically improved since Clang said 'Hello
world'.
~~~
ryanpetrich
It could still be better, but C++ compilers have never been as compatible or
interchangeable as they are now.
------
brendangregg
Nice! Page should state the overhead clearly -- don't let people discover that
the hard way. The paper says 2-30x, and 3.3-5.7x is typical. I'm not
surprised. And it's preferable to state it and not pull any punches -- it sets
expectations and lets us take precautions when running it.
Beloved Pets Everlasting? - ksvs
http://www.nytimes.com/2009/01/01/garden/01clones.html?_r=1
======
geuis
Hmm, it's requiring me to register and log in to read an article? I thought
the major news sites were past this childish mentality.
~~~
kirubakaran
bugmenot.com
Remembering Chris Kraft - selimthegrim
https://www.nasa.gov/chris-kraft
======
scj
During John Glenn's first orbital flight, an indicator went off, causing
concern over the heat shield.
Kraft and his team correctly determined the heat shield was fine. His
judgement was ignored and he was forced to use an untested and unnecessarily
risky procedure. Decades later, both Glenn and Kraft went on the record
denouncing what happened. As a result, Kraft would define NASA policy as "the
flight director may, after analysis of the flight, choose to take any
necessary action required for the successful completion of the mission."
I can't imagine how Apollo 13 would have unfolded if this policy was not in
place.
------
sizzzzlerz
Kraft literally wrote the book on how space missions were to be run. He wrote
all the procedures that you see being executed in Houston's Mission Control,
devising the roles and responsibilities of each individual from "Flight" to
CapCom. If there is a giant in the American space program, it is Christopher
Kraft. RIP.
------
AstroJetson
He is sort of the father of any type of complex procedure system out there
today. In prior jobs, references back to, and setups that came from, the space
program were pretty common.
Lots of us have worked in or around "mission control's" at companies (think
any of the Telcoms, power distribution etc), so thanks Chris for your
guidance!
------
aqme28
I highly recommend his memoir, "Flight." It's a fantastic story about
engineering and the birth of the space age.
------
jlampa
There's a documentary featuring interviews with him - and many of the people
who worked with him - on Netflix:
[https://www.netflix.com/watch/80175483](https://www.netflix.com/watch/80175483)
------
sswaner
With Kraft's passing and the 50th anniversary of the moon landing, it is
valuable to reflect on the many ways the space programs of several countries
have substantially contributed to our work now. There is an elegance and
structure to the work of mission control that is such a useful guide for how
to deliver and operate technology.
It is sad to lose such talent and vision, but good to know that what he and
his team created is so readily available to us now.
------
__sy__
Oh, this makes me sad. +1 on people recommending his book "Flight". I very
much enjoyed it too. I'd also recommend watching his lecture at MIT a decade
or so ago:
[https://www.youtube.com/watch?v=ZPpq1YNVKQY](https://www.youtube.com/watch?v=ZPpq1YNVKQY)
------
dang
Related:
[https://news.ycombinator.com/item?id=20504164](https://news.ycombinator.com/item?id=20504164)
Import Delicious To Google Bookmarks - darkhelmetlive
http://blog.darkhax.com/2010/12/16/import-delicious-to-google-bookmarks
======
rsanders
This is much faster and doesn't ask for any auth info:
<https://gist.github.com/744312>
~~~
darkhelmetlive
That's definitely cool. Are there any docs for that "upload" API?
It's good to see people find solutions to the problem Yahoo! is causing.
~~~
darkhelmetlive
Ahh I see according to <http://lnkr.mobi/bm/google_bookmarks_api/> that is
about all the docs on that API there are, and it doesn't support
annotations/notes. Mine doesn't either, but it's a one line change to make it
do so.
FBI in talks with Apple, Google over device encryption policies - therealmarv
http://9to5mac.com/2014/09/25/fbi-apple-privacy-encryption/
======
swasheck
what's (not) shocking is the double-speak. "i believe that nobody is above the
law" when that's not been the case for quite a while. additionally, using that
as a case against apple and google touting encryption is an insult to the
collective intelligence of the populace. it's because there is demonstrable
evidence that the fbi/cia/other organizations are acting outside the
jurisdiction of the law that citizens are "protecting themselves." not to
start a flame war here (or a red herring, and if it is then let me know), but
this sort of self-protection seems to have been the impetus behind the bill of
rights, in general. it's the right to protect oneself from the overreach of
government.
------
higherpurpose
I was afraid this would happen. Either they will back down on these policies,
or worse, they will make the moves to make the FBI happy behind the curtain,
while promising everyone they still made the changes to protect your privacy.
Ask HN: Is python NLTK library still used for word tokenization? - pandeykartikey
======
mlthoughts2018
Yes, I work on a team that uses NLTK for lots of word canonicalization tasks
in an NLP-heavy search engine. There are other options that work well too, but
we have found NLTK to be very good, even at a large scale.
Our pipeline uses NLTK to take in a string of text, do word tokenization,
lemmatization and stemming, and construct bigrams and trigrams, as part of a
large map-reduce job for building text search indices.
Pirate Bay to allow real-object downloads - reverend_gonzo
http://www.zdnet.com.au/pirate-bay-to-allow-real-object-downloads-339330303.htm
======
kiloaper
So this is thingiverse.com but presumably without the prevention of copyright
infringement[1]? As a 3D printer owner Thingiverse is awesome. I encourage
people to check it out, if anything just to see how far DIY 3d printers have
come.
[1] [http://blog.thingiverse.com/2011/02/18/copyright-and-
intelle...](http://blog.thingiverse.com/2011/02/18/copyright-and-intellectual-
property-policy/)
How 4 Mexican Immigrant Kids and Their Cheap Robot Beat MIT (2005) - sriram_sun
http://www.wired.com/2014/12/4-mexican-immigrant-kids-cheap-robot-beat-mit/
======
sriram_sun
OK I skimmed through the article. Not much technical content here. The
original was published in 2005. Here is what they are doing right now.
[http://www.wired.com/2014/12/spare-
parts/](http://www.wired.com/2014/12/spare-parts/) pretty depressing when
compared to what they managed to accomplish 9 years back. Hopefully they are
happy.
~~~
joezydeco
Looks like it's mostly a marketing push for the movie.
------
lettercarrier
I had tears when I read the entire thing. I thought it was well written.
The absolute best article (which I have a photocopy of) is this one [1] about
a 14-year-old boy at a baseball camp. It is by Ira Berkow of the Times.
[http://www.nytimes.com/1986/12/25/sports/sports-of-the-
times...](http://www.nytimes.com/1986/12/25/sports/sports-of-the-times-vince-
s-story-with-an-assist-by-mickey.html)
Async, embeddable HN share button - igrigorik
https://github.com/igrigorik/hackernews-button
======
huhtenberg
There's nothing more sad and somewhat pathetic than a "share on... " button
with a single digit counter. Perhaps hide the counter if it's under
configurable threshold?
~~~
igrigorik
That's a good idea. Ideally a combination of time and points actually: if it's
a new article then show the counter, but if it's older and has low points, then
hide the button.
------
igrigorik
Working on auto-instrumenting the button with Google Analytics trackSocial
stuff - stay tuned.
~~~
igrigorik
GA instrumentation should be in place now. Go into GA > Traffic Sources >
Social Plugins, and you should see it there. (don't forget to change the date
range)
------
stbullard
cf. last year's discussion of <http://hnlike.com>:
<http://news.ycombinator.com/item?id=2934178>
------
nilved
For something unrelated, what's the point of code like this?
<script>
(function() {
var hn = document.createElement('script'); hn.type = 'text/javascript';
hn.async = true; hn.src = 'http://hnbutton.appspot.com/static/hn.js';
var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(hn, s);
})();
</script>
Why not just put in the generated script tag? Moreover, why does it matter
where the tag is placed if it's loaded with async as opposed to defer?
~~~
igrigorik
For the HN audience you probably could, since we're all on modern browsers.
The pattern above is what GA uses to guarantee most consistent behavior across
all browsers. It comes down to browser support for the async keyword:
[http://stackoverflow.com/questions/1834077/which-browsers-
su...](http://stackoverflow.com/questions/1834077/which-browsers-support-
script-async-async/6766189#6766189)
As far as async vs. defer, here's a good summary:
<http://www.sitepoint.com/non-blocking-async-defer/>
------
DanielRibeiro
Great! It would be awesome if it can be made to work with Wordpress (the
vanilla hosted site). Or maybe I should just start blogging like a hacker[1,
2]...
[1] [http://tom.preston-werner.com/2008/11/17/blogging-like-a-
hac...](http://tom.preston-werner.com/2008/11/17/blogging-like-a-hacker.html)
[2] [http://www.allthingsdistributed.com/2011/08/Jekyll-
amazon-s3...](http://www.allthingsdistributed.com/2011/08/Jekyll-
amazon-s3.html)
------
nodesocket
Great work, but are you serving `hn.js` from a CDN like CloudFront, so in the
event that your website goes down, it doesn't affect others? Also, the url the
iframe calls: [http://hnbutton.appspot.com/button?title=some-
title&url=...](http://hnbutton.appspot.com/button?title=some-title&url=some-
url), how can we be sure that is always going to be available?
~~~
fraserharris
*.appspot.com is Google App Engine. It has built-in redundancy.
------
zobzu
It's a nice button, but, nothing like having 2 HN, FB, Twitter, G+ buttons.
I don't think the "social sharing" idea behind all those buttons is smart at
all. It allows for an eventual single winner only, and crap in between (and
crap after the winner is "elected", since it won't be cool anymore and
everyone will have to have a new button side by side).
------
alpb
I have just installed it on my blog (example:
<http://ahmetalpbalkan.com/blog/getting-things-done-for-devel...>) however the
submit button never disappears. I would expect it to show 1 point. Can you
show me a page where the vote count actually works?
~~~
igrigorik
The data is coming from the thrifdb API (like all other buttons), so there is
a small delay between the "true count" and what gets displayed in the button.
From what I can tell, it's within ~minute(s) accuracy.
~~~
fraserharris
Can you expand on thrifdb API? Can't find any relevant search results.
~~~
riffraff
I believe he means <http://www.hnsearch.com>, I used it to fetch story data
too in the past, and I believe hnlike used to work like that too.
------
eps
> <http://hnbutton.appspot.com/static/hn.js>
Why is this .js not on Github?
~~~
igrigorik
As in, served through Github's CDN (akamai)? Google's App Engine has a few
GFE's around the world as well.. :)
------
overshard
This has been done a few times before but still, well done. Works very well.
Ask HN: Can anyone recommend a db for managing connected devices in a LAN? - ng-user
I'm looking for an intelligent approach to storing multiple MAC addresses (devices), their status in realtime ie. online vs offline, as well as other generic fields (frequency, IPV4, IPV6, etc.) in presumably a database. I've got a node application fetching data from an API about once every minute, I suppose I could populate a mongo collection for simplicity and keep track of online/offline status to create a realtime-ish update of the connected devices. I feel like there's probably a better way that I'm unaware of. Can anyone recommend something for a problem such as this?
======
stargrazer
manually: netbox
[https://github.com/digitalocean/netbox](https://github.com/digitalocean/netbox)
automatically: netdisco
[https://metacpan.org/pod/App::Netdisco](https://metacpan.org/pod/App::Netdisco)
A realtime, platform-agnostic error logging and aggregation platform - wspeirs
https://github.com/getsentry/sentry
======
mindcrime
FYI, there is another fairly well-known open source project called Sentry:
[http://sentry.incubator.apache.org/](http://sentry.incubator.apache.org/)
Are they close enough to be considered a proper name clash? Not sure, but it's
worth thinking about.
~~~
wspeirs
Well they're spelled exactly the same, but don't come close to doing the same
thing:
[https://getsentry.com/welcome/](https://getsentry.com/welcome/) \- An error
logging & aggregation platform
[http://sentry.incubator.apache.org/](http://sentry.incubator.apache.org/) \-
A system for enforcing fine grained role based authorization to data and
metadata stored on a Hadoop cluster
Picking a unique open source project name is getting to be about as hard as
picking a not-yet-reserved domain name :-\
~~~
mindcrime
_Picking a unique open source project name is getting to be about as hard as
picking a not-yet-reserved domain name :-\_
Very true. Too bad projects don't come with tld's.
Sentry.hadoop
Sentry.logging
or something like that. :-)
Sharovipteryx - ag8
https://en.wikipedia.org/wiki/Sharovipteryx
======
bediger4000
The first "restoration" is obviously wrong. The center of lift of a subsonic
wing is roughly at the quarter-chord point, which is in front of the beast's
hips. The CG of the beast is well forward of that. That restoration is going
to cause pitch-down, a nose dive.
How Radioactive Poison Became the Assassin’s Weapon of Choice - NN88
https://medium.com/matter/how-radioactive-poison-became-the-assassins-weapon-of-choice-6cfeae2f4b53#.hax0s8sm4
======
UnoriginalGuy
Holy heck this article is all over the place: history of nuclear physics? How
radiation works? A full history of everyone involved? Uhh... I am patient and
love a good story but this got old because it couldn't decide what it wanted
to be and jumped around...
According to the read time this article takes 25-60 minutes to read. I made it
half way down before I got fed up.
People with multi monitor setups with different DPI, how is your experience? - Secretmapper
I'm thinking of getting a 43" 4k monitor. The real estate is going to be AMAZING.<p>The only thing I'm worried about is the DPI. I use a 2017 MBPr w/ a Dell P2415Q, both of which are ~180+ PPI<p>I used to have a 24" 1080p and side by side, it became unbearably blurry. For those that have the same rough setup, do you think I would find the LG monitor blurry? or do you find putting it further back helps?<p>On the other hand, anyone here who have experience 32 inch 4k with 24 inch 4k? That would put the PPI at ~150PPI - ~180 PPI.<p>I'd love to get the 43", but I'm not sure if I'll find it 'blurry'. No retailer in my area has it, and I have to ship it internationally so I really want to know if it's going to be feasible before I jump the gun.
======
bsenftner
I have an odd monitor setup: main monitor is a 23" 4K, above that is our old
HD TV - a 36" screen with 1080p resolution. To the right is an old 19" Mac
monitor with the odd resolution of 1680x1050, and on the left is an ultra-wide
2K monitor with 2560x1080 resolution.
The old HD TV is certainly not as crisp as the 3 computer monitors - so I have
it located behind and about 3 feet back and up. At that distance the fuzzy
pixels are lost, and I can use it as a reference monitor quite well.
I have the funky setup so I can ensure the software I write handles layout
correctly with the different aspect ratios and pixel scaling factors the OSes
throw in.
Will Node.js forever be the sluggish Golang? - alexhultman
https://medium.com/@alexhultman/will-node-js-forever-be-the-sluggish-golang-f632130e5c7a
======
coldtea
> _It is productive and elegant, sure, but lacks in performance. Emerging are
> projects like Fastify, and hundreds alike. They all aim to provide what
> Express does, at a lower performance penalty. But that’s exactly what they
> are; a penalty. Not an improvement. They’re still strictly limited to what
> Node.js can provide, and that’s not much as compared to the competition_
The whole article is badly written.
First it assumes that the lower performance is something insufferable -- when
in most cases, and for most projects, it doesn't matter at all.
Then it fails to understand the importance of the developer pool, convenience,
ecosystem, etc., as if JS and Node could be willy-nilly replaced by Golang for
every project.
Third, it pisses (as above) on Node web framework projects, just because
Node.js has a performance ceiling (as a single process) lower than Golang's.
Also, the importance of the overall architecture for performance is not
accounted for at all -- or the fact that as long as you add some database
queries, the speed benefit over Node diminishes...
~~~
alexhultman
I'm sorry the article wasn't up to your standards.
~~~
coldtea
It's mostly the overall tone hammered on, e.g. starting with: "Will Node.js
forever be the sluggish Golang?" which is click-baity.
In fact I gave up around the insults at various Express-successors, and missed
that the main point of the article is the µWebSockets.js.
Why not go into that directly and skip the rest? That's what people would want
to know, not that "Node is worse than Golang in raw performance".
~~~
alexhultman
I thought you said you had assessed the _whole_ article as bad? You didn't
read past the ingress, the motivation of the work done.
~~~
craftinator
He obviously went and reviewed the rest, or how would he know about the
uWebsockets portion? His point is that the article is poorly written enough
that he had no desire to finish it, and I fully agree. More information and
less disparaging fluff would make this piece much more attractive to the
engineers of the world.
------
philwelch
For Node to be a “sluggish Golang” in the first place, it would have to be
otherwise functionally equivalent to Golang, which it is not. Writing your own
TCP and SSL layers is also kind of nuts if your motivation for doing so is to
optimize your Node apps.
~~~
alexhultman
Node.js is very popular and some companies with lots of I/O still prefer
JavaScript, they have their business logic written in it.
~~~
philwelch
If it was just, "I wrote a custom network stack to make your Node apps run
faster", that would be something, but Golang is completely irrelevant to the
question.
~~~
alexhultman
That's a good title. Golang is referenced many times in the text though, it's
even plotted in the two graphs.
------
makkesk8
uWebsockets is an awesome project! Been following it since the start pretty
much, and I'm really glad more people out there care more about performance
than throwing more servers at the problem.
Keep it up alex!
------
yayr
Thanks to OP and contributors for this progress on performance. What are the
remaining feature gaps, and what is the roadmap?
Intel to Develop Discrete GPUs, Hires AMD's Raja Koduri as Chief Architect - namlem
https://www.anandtech.com/show/12017/intel-to-develop-discrete-gpus-hires-raja-koduri-as-chief-architect
======
unsigner
The connection between "AI" and "GPU" in everyone's mind is a testament to the
PR chops of NVIDIA. You don't need a GPU to run ML/DL/neural networks, but
NVIDIA have GPU tech so they're selling GPUs. What you need is the massive ALU
power and, to lesser extent, the huge internal bandwidth of GPUs. There are
huge chunks of GPU die area that are of no use when running NN-type of code:
the increasingly complex rasterizers, the texture units, the
framebuffer/zbuffer compression stuff, and on the software side, the huge pile
of junk in the drivers that allows you not only to run games from a decade
ago, but also run them better than last year's GPU. If you can afford to start
from scratch, you can lose a lot of this baggage.
~~~
mtgx
And yet Intel seems to want to make GPUs for machine learning now... so I
guess Nvidia's PR worked against Intel, too?
But as I said in another comment, the truth is Intel doesn't seem to know
what it's doing, which is why it's pushing in 5 or 6 different directions
with many-core accelerators, FPGAs, custom ASICs, neuromorphic CPUs, quantum
computers, graphcores, and so on.
By the time Intel figures out which one of these is "ideal" for machine
learning, and behind which arrows to "put more wood," Nvidia will have an
insurmountable advantage in the machine learning chip market, backed by an
even stronger software ecosystem that Intel can't build because it doesn't yet
know "which ML chips will win out".
If I were to describe Intel in a sentence these days, it would be "Intel
doesn't have a vision." It's mostly re-iterating on its chips and rent-seeking
these days by rebranding weak chips with strong chip brands, and adding names
like "Silver" and "Gold" to Xeons (and charging more for them, because come
on - it says _Gold_ on them!), as well as essentially bringing the DLC
nickel-and-diming strategy from games to its chips and motherboards.
Meanwhile, it's wasting billions every year on failed R&D projects and
acquisitions _because_ it lacks that vision on what it really needs to do to
be successful. Steve Jobs didn't need to build 5 different smartphones to see
which one would "win out" in the market.
~~~
newlyretired
Non-incremental advances require a lot of wasted-path R&D. If any of Intel's
projects creates a generational leap, it will pay off handsomely. When the way
forward isn't clear, I like to use the concepts from path finding algorithms
to drive strategy. Assuming you can afford multiple parallel efforts.
It's not clear if doing this in-house, or closely monitoring the state of the
art and then buying a company that develops a winner, is superior.
------
eganist
Can't fight them on price? Fight them on talent.
Whoever at AMD refused to match the offer probably made a terrible decision.
This is about the worst time to lose that talent, right after inking a GPU
die deal which, in light of this news, will only be temporary. AMD just got
played.
If I were AMD, I would review Mark Papermaster's comp and incentives to ensure
he doesn't leave.
(I'm long AMD)
~~~
eksu
I don’t think this was all about money. Raja had been trying to run Radeon
Technologies Group like an independent company and pushing for separation from
AMD for a while. HardOCP did a good piece on this ->
[https://hardocp.com/article/2016/05/27/from_ati_to_amd_back_...](https://hardocp.com/article/2016/05/27/from_ati_to_amd_back_journey_in_futility)
I think the recent Intel + AMD custom chip was probably the last thing Raja
did before RTG got the reins put back on, and now he's jumping ship to pursue
what he's wanted all along: to work with more independence.
~~~
Terribledactyl
What leads you to think Intel will give him more autonomy? Or is their
integrated graphics team set up differently than Radeon's? I would suspect the
old ATI boundaries after 11 years would still be stronger than something Intel
has homegrown over the past 20.
~~~
0xbear
Well, he wouldn’t go there if Intel wouldn’t give him what he wanted, whatever
that might be.
~~~
MengerSponge
And if it's not corporate structure, maybe it's a boat.
More power to him.
------
ActsJuvenile
Raja, if you are reading this make sure your Intel GPU has two things that
competition doesn't:
1) FP16 half-precision training: NVidia is artificially disabling this feature
in consumer GPUs to charge more for Tesla / Volta.
2) A licensed or cloned version of AMD SSG technology to give massive on-GPU
memory: NVidia's 12 GB memory is not sufficient for anything beyond thumbnail
or VGA sized images.
My experience with Intel Phi KNL has been miserable so far, I hope Raja has
better luck with GPU line.
~~~
rbanffy
> My experience with Intel Phi KNL has been miserable so far, I hope Raja has
> better luck with GPU line.
I'd love to see the Phi approach taken further. I'm not a huge fan of having
different ISAs, one for my CPU, one for the compute engines of the GPU (to say
nothing about the blobs on my GPU, network controller, ME). I'd prefer a more
general approach where I could easily spread the various workloads running on
my CPU to other, perhaps more specialized but still binary-compatible, cores.
Heck... Even my phone has 8 cores (4 fast, 4 power-efficient, running the same
ISA).
~~~
Symmetry
When you've got a huge out of order engine the extra effort it takes to decode
x86 instructions is lost in the noise. When you're going with the flock of
chickens approach and you have a huge number of very small cores then the
overhead is killer. Intel tried to solve this by using medium cores with big
SIMD units but SIMD is just less flexible than a GPU's SIMT is.
Power and area generally scale as the square of the single threaded
performance of a core. The huge number of "cores"/lanes in a GPU are much
smaller and more efficient individually than even your phone's smaller cores.
And the x86 tax gets worse and worse the smaller you try to make a core with
the same ISA. Intel wasn't even able to succeed in competing with the Atom
against medium sized cellphone chips.
~~~
rbanffy
> but SIMD is just less flexible than a GPU's SIMT is
There is nothing preventing the x86 ISA from being extended in that direction.
As long as all cores (oxen and chickens, as Seymour Cray would say) can shift
binaries around according to desired performance/power, I don't care.
Binary compatibility is awesome for software that has already been written and
for which we don't have the source code. Pretty much everything on my machines
has source available.
The OS may need to be more aware of performance characteristics of software
it's running on the slightly different cores so it can be better allocated,
but, apart from that, it's a more or less solved problem.
Atoms didn't perform that much worse than ARMs on phones. What killed them is
that they didn't run our desktop software all that well (even though one of my
favorite laptops was an Atom netbook).
------
bhouston
I am not convinced that Intel can win here. They seem to not succeed with home
grown GPU tech and other big bang approaches. Now if they were to acquire
decent GPU tech then I would bet on them. Just the homegrown route seems to
not work out for them.
I suspect part of the reason is the long time frames for dev of this tech. I
suspect it is at least 2 years for this to see the light of day. That is
forever in this space.
Intel failed with Larrabee and Itanium. Maybe this will go better?
~~~
mtgx
It looks like Raja will lead the development of machine learning-focused GPUs.
Isn't this Intel basically admitting that their Xeon Phi, Nervana, and Altera
efforts to win the machine learning market are all dead ends?
How many machine learning strategies is Intel going to try? Does it even know
what it's doing? Spending billions of dollars left and right on totally
different machine learning technologies kind of looks like it doesn't, and
it's just hoping it will get lucky with one of them.
And even if you think that's not a terrible strategy to "see what works",
there's still the issue that they need to have great software support for
_all_ of these platforms if they want developer adoption. The more different
machine learning strategies it adopts, the harder that's going to be for Intel
to achieve.
~~~
ironchef253
Intel needs a CEO that skates to where the puck is going to be, not where it
was three goals ago.
~~~
keganunderwood
That's scary if that's what is required of a CEO. You would either need to be
an oracle and predict where the industry will go, or you'd need to _make_ the
industry go in the direction you're taking the company.
------
diab0lic
The link is incredibly light on actual content but this seems to be good news
for AI enthusiasts as perhaps now we'll get a reasonable competitor to
CUDA/CUDNN and their associated hardware for running GPU accelerated machine
learning. Intel seems to be taking the ML/AI space seriously and this move
seems very likely to be related. Yes, I'm aware of OpenCL, as I am also aware
of its level of support with libraries such as PyTorch, Tensorflow, Theano --
it isn't the first-class citizen that CUDA is. While those libraries aren't
perfect they offer the experience of writing the experiment on your laptop
without a GPU, validating, then running the full experiment on larger
hardware.
In my ideal world, competition from Intel would force NVidia to play nice with
OpenCL or something similar, and encourage competition in the hardware space
instead of driver support space. Unfortunately the worst-case looks something
more like CUDA, OpenCL and a third option from Intel with OpenCL like
adoption. :(
~~~
moonbug22
Only armchair spectators are still talking about OpenCL. It's as dead as
disco.
~~~
joefourier
I still use OpenCL daily; it allows me to program for both NVidia and AMD GPUs
simultaneously with a minimum amount of pain. And you can still use it for
mobile GPUs and specialized chips like the Myriad.
Do you have another open, cross-platform, widely compatible GPU programming
framework to recommend?
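For anyone who hasn't touched it, the classic vector-add demo with pyopencl
(one of the Python bindings -- just an illustration of the portability, not a
claim about anyone's production code) is only a handful of lines:

    # rough pyopencl sketch: add two vectors on whatever device OpenCL finds
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(50000).astype(np.float32)
    b = np.random.rand(50000).astype(np.float32)

    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags

    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    print(np.allclose(result, a + b))  # True on any vendor's hardware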
~~~
gcp
Same here. What else can you do to ship GPU acceleration to AMD and NVIDIA
people?
The alternatives recommended here aren't even serious IMHO. I'd rather switch
to CUDA and wait till Intel/AMD sort out a REAL compatibility layer than deal
with those.
Unless I'm mistaken, HIP still requires a separate compile for either platform
and what runtime do they expect end users to have exactly?! At least CUDA and
OpenCL are integrated in the vendor drivers.
Vulkan compute with SPIR-V seems to be the only real solution, but even that
is still very early. Still waiting for proper OpenCL 2.0 support in NVIDIA
drivers :P
~~~
freeone3000
You simply don't ship. Enterprise deep learning doesn't ship their training
code - large models are trained on purpose-designed, dedicated hardware.
Hardware compatibility doesn't matter, software does. Even that's flexible if
it's significantly faster.
(The models can be executed on low-powered, commodity CPUs. No need for any
GPU there.)
~~~
gcp
_You simply don 't ship_
That's totally an option for our product, great idea! Why did I never think of
this!
No seriously, we _are_ shipping, using _OpenCL_, and it gives about a 20 times
performance advantage for most users regardless of whether they have AMD or
NVIDIA hardware. If something that's actually better than OpenCL comes along (or if
AMD RTG goes out of business) I'll switch to it no heart broken.
But that hasn't happened yet.
------
JosephLark
Interesting, given that just 2 days ago it was announced [0] that Intel was
going to start to use AMD for some of their integrated graphics. Now they're
going to compete against them in the discrete graphics space.
Also, Koduri recently left AMD after what many felt was a disappointing
discrete graphics release in Vega.
[0] [https://www.anandtech.com/show/12003/intel-to-create-
new-8th...](https://www.anandtech.com/show/12003/intel-to-create-new-8th-
generation-cpus-with-amd-radeon-graphics-with-hbm2-using-emib)
------
loeg
Wowza. If I moved to a direct competitor like that, my employee contract "non-
compete" clause would be brought out immediately. And I'm no C-level
executive, just an individual contributor. I wish Washington had California's
non-compete law.
~~~
cakebrewery
Haven't read much on it, but this happening right after the integrated GPU
deal with AMD just strengthens the "teaming up against NVIDIA" theme going on.
~~~
Rarebox
It's like an Age of Empires FFA where the losing players always team up
against the leader.
------
chisleu
I'm excited. I don't care if Intel wins. I just want a video card that doesn't
suck and works perfectly with linux. Even if I unplug my monitor sometimes...
Even if it's a laptop and it switches GPU for different outputs... Even if I
want to use the standard xrandr and normal ass linux tools for configuring my
monitor.
~~~
orbifold
Maybe that would happen if kernel developers were not such divas who think it
is appropriate to use coarse language in public discourse. Nvidia's graphics
drivers work perfectly on Windows, and they have the only OpenGL
implementation that is not a total joke on Linux.
~~~
madez
Why do you say the Mesa OpenGL is a total joke?
~~~
orbifold
This is a bit outdated [http://richg42.blogspot.de/2014/05/the-truth-on-
opengl-drive...](http://richg42.blogspot.de/2014/05/the-truth-on-opengl-
driver-quality.html?m=1) overview of driver status by a game developer; vendor
A is NVIDIA, and as the article points out, they are the only one with a
performant, relatively bug-free implementation. Also notice how he mocks Intel
for having two driver teams: that Linux expects to get special treatment by
integrating the driver into its graphics abstractions demonstrably leads to
worse performance and fewer features than if you bypass all those abstractions
and use essentially the same driver for all kernels.
~~~
madez
Thanks for that link. I must say I deem it maybe not completely outdated, but
at least worthy of an update.
But I'm in awe of what one can read there.
"This vendor[Nvidia] is extremely savvy and strategic about embedding its devs
directly into key game teams to make things happen. (...). These embedded devs
will purposely do things that they know are performant on their driver, with
no idea how these things impact other drivers.
(...)
Vendor A[Nvidia] is also jokingly known as the "Graphics Mafia". Be very
careful if a dev from Vendor A gets embedded into your team. These guys are
serious business."
So, basically Nvidia is sabotaging OpenGL to fuck up the specs and then
implement other working variations and make the game developers use their
version? If that is true, fuck Nvidia.
"On the bright side, Vendor C[Intel] feeds this driver team[Windows Driver
Team] more internal information about their hardware than the other team[Linux
Driver team]. So it tends to be a few percent faster than driver #1 on the
same title/hardware - when it works at all."
What the fuck is going on in this industry? Intel is sabotaging its own Linux
driver team? Why?
"I don't have any real experience or hard data with these drivers, because
I've been fearful that working with these open source/reverse engineered
drivers would have pissed off each vendor's closed source teams so much that
they wouldn't help.
Vendor A[Nvidia] hates these drivers because they are deeply entrenched in the
current way things are done."
That, now finally, makes sense. Nvidia is strong-arming developers to not
support Mesa because they are afraid of it. Nvidia is afraid of Mesa. I think
this should be more widely known.
~~~
orbifold
The way I read this was a bit different: NVidia actually is the only vendor
that offers a performant, complete and relatively bug free implementation. For
example if you consider this
[http://gdcvault.com/play/1020791/](http://gdcvault.com/play/1020791/)
presentation then it is relatively clear that most major innovations were
first available as OpenGL extensions by NVidia. The playing field might have
levelled somewhat with the introduction of vulkan, which eliminates a lot of
code that had to reside in the driver before. The main reason why Mesa is
unlikely to catch up is because the backend compiler code is platform
specific, so unless NVidia decides to publish their platform specification, it
is unlikely that they will achieve meaningful success. Even if NVidia did
publish a specification and left driver development to the community it is
unclear to me who would be willing to do the free work for them.
~~~
Nullabillity
Keep in mind that the blog post is from 2014. Since then AMD has rewritten
their Linux driver (fglrx -> AMDGPU) which didn't really pay off before their
4xx series (released 2016).
------
chx
> GT4-class iGPUs, which are, roughly speaking, on par with $150 or so
> discrete GPUs.
Erm. Nope. No Intel iGPU is on par with the 1050 much less the 1050 Ti.
[http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobi...](http://gpu.userbenchmark.com/Compare/Intel-Iris-
Pro-580-Mobile-Skylake-vs-Nvidia-GTX-1050-Mobile/m132950vsm211022)
(I compared mobile chips since the most powerful GT4 can only be found in the
mobile chips.)
It's only slightly behind the 1030 which costs $73.
~~~
unsigner
Look at it another way: no Intel iGPU is on par with any discrete GPU, because
in price segments where iGPUs appear, discrete GPUs tend to vanish in a matter
of 1-2 years. There used to be a significant number of NVIDIA Geforce
MX420/440s, 5200s and 6200s. Then much fewer 730s. Now 1030s are practically
only in laptops. Intel has been nibbling away at this market slowly, but
steadily, for a decade.
~~~
llukas
If driving an FHD display is all that you want, then an integrated GPU is
fine. But we're starting to get 4K/UHD nowadays...
------
payne92
Finally! Intel has a lot of catching up to do.
As GPUs continue to evolve into general purpose vector supercomputers, and as
ML/deep learning applications emerge, it seems clear that more and more future
chip real estate (and power) will go to those compute units, not the x86 core
in the corner orchestrating things.
~~~
rodgerd
> Finally
Why on earth would you think Intel extending their near-monopoly is a thing to
celebrate?
~~~
dsr_
They don't have a monopoly in add-on GPUs. There are two strong competitors,
and Intel's current on-chip GPUs are comparatively pitiful.
------
shmerl
_> With his hire, Intel will be developing their own high-end discrete GPUs._
With Intel and AMD backing Mesa, things on Linux will get very interesting.
~~~
madez
I am sceptical about the consequences for user-controlled computing. AMD's
GPUs have developed in a positive direction in the past, while Intel is
unfriendly to users' control over the hardware they buy.
~~~
shmerl
Intel GPU drivers are open on Linux. How is that worse than what AMD are
doing?
~~~
madez
Intel does more than integrated GPUs.
~~~
shmerl
AMD too, but we are talking about GPUs in this case.
~~~
madez
Yes, and because of the above, I am sceptical about the consequences. I don't
consider it likely that the new GPU will work without proprietary firmware,
nor that the documentation will be better than AMD's is now.
~~~
shmerl
AMD GPUs also need firmware unfortunately.
------
artellectual
Damn, this is a major loss for AMD, losing Raja is definitely not the right
move. It would have been interesting to see the next iteration of AMD graphics
with Raja on board.
Threadripper, and the Zen architecture, put them back on the map, that’s some
serious hardware for the price. I wish they had just kept iterating on the
CPUs and GPUs.
Vega is not a bad product; it just doesn't beat nvidia's offering in the bar
charts. That doesn't mean it's bad, it just means it's in second place, which
is fine since it's cheaper as well. Technology needs to be iterated on.
Something must be going on at AMD at the moment.
------
cameronhowe
Can someone explain this to me: Isn't the GPU industry all about patents and
trade secrets (enforced by NDAs)? Won't all of Raja's expertise be tied up in
that?
~~~
ChuckMcM
Intel has both a large patent portfolio and a lot of legal firepower in that
space, so no, I don't think folks like Nvidia will be able to "threaten" Intel
with patents. Nvidia might be able to threaten them with the monopoly card
(clearly Intel is using its dominance in desktops and laptops to move into an
adjacent market) but they have been doing that for many years with the
integrated GPUs so I would expect it to be a weak play.
~~~
cameronhowe
I must have expressed myself poorly. He's coming from AMD/RTG, everything he
knows is presumably what they use/own. Nothing to do with NVIDIA
~~~
ChuckMcM
No, I understood, I just called out Nvidia because they are so often on the
other side of a patent dispute with Intel. AMD and Intel have broad cross
patent licensing deals in place because of previous fights over patents on the
frontside bus, the instruction set, etc.
From a strategic markets point of view I see it this way;
Discrete GPUs gives Intel a shot of owning both pieces of high margin silicon
in a laptop / tablet design win (GPU & CPU) and potentially it gives Intel
additional ammunition to go after Nvidia or to mitigate their encroaching.
------
40acres
Intel is all in on becoming a "data company", with the recent design wins in
self driving cars & the AMD deal I'm confident that they will come out of the
AI HW race in strong shape. This move just reaffirms that.
------
ohyes
It isn't clear that AMD's GPU architecture has been really competing with
nvidia. We'll have to see how big a deal this is when AMD's APUs come out. I
expect them to be quite a bit better than intels integrated product.
This seems to be more of a direct competitive attack on AMD's integrated
product than it is competition with nvidia. It feels to me like building
discrete GPUs is almost a misdirection.
------
zachruss92
An interesting counterpoint here. I have a friend who works for Intel as an
algorithms engineer for their self-driving vehicle acquisition (Mobileye).
Currently, he's using 2 1080TIs w/ TensorFlow to perform deep learning. It is
possible that Intel could be looking to develop a chip used specifically for
this purpose (a bet on self driving cars) and not for mass-production/sale
outside of that tech. Either way, all of the GPU/CPU updates in the past year
are just going to create more competition, which is better for the consumer in
most cases.
~~~
p1esk
Well, the whole point of Mobileye acquisition was for Intel to have a
competing chip for autonomous cars. But it is possible that they are also
looking to compete on 1080Ti level. Which would be very hard.
------
farhanhubble
Nvidia is light-years ahead in the GPU market. Besides, if this GPU push is
aimed at the deep learning market, Intel will have competition from the likes
of Xilinx too. IMHO they need to provide great software to go with their GPUs.
Traditionally, hardware manufacturers have shipped barely usable software.
They should perhaps try to use OpenCL and keep the rest of the tools and
libraries open source.
~~~
deepGem
This is what many people outside the AI world don't seem to understand. Nvidia
has a stranglehold in the form of CUDA and cuDNN. There isn't any open source
equivalent to cuDNN. AMD is trying to push OpenCL in this direction, but it
will be a long time before DL libraries start migrating to OpenCL. If, by some
miracle, an alternative GPU as good as the 1080 Ti popped up tomorrow, it
would be useless in the AI market.
~~~
SJetKaran
No it won't. Especially if the price is competitive. Say, for the price of one
1080 Ti, if I can buy 1.5 units of a graphics card with comparable
performance, I'll surely buy it. There are already resources being spent on
OpenCL-based ML/DL platforms
([https://github.com/plaidml/plaidml](https://github.com/plaidml/plaidml)).
The architectures keep getting bigger and training time keeps getting longer.
I think you underestimate this factor. I need as much GPU computing power as I
can buy within the budget.
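For what it's worth, wiring PlaidML up as the Keras backend is (going from
memory of the project README, so double-check the docs) just a backend swap
before Keras gets imported; the tiny model below is only there to make the
sketch self-contained:

    # from memory of the PlaidML README; treat as a sketch, not gospel
    import plaidml.keras
    plaidml.keras.install_backend()   # must run before importing keras

    from keras.models import Sequential
    from keras.layers import Dense

    model = Sequential([Dense(10, activation="softmax", input_shape=(784,))])
    model.compile(optimizer="sgd", loss="categorical_crossentropy")
    # model.fit(...) would now run on whatever OpenCL device PlaidML picked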
~~~
deepGem
True, I would love some alternatives such as PlaidML. However, I can't seem to
fathom that PlaidML will be a worthy alternative to, let's say, Tensorflow or
Pytorch or Caffe. I hope I am proven wrong.
~~~
SJetKaran
I think support for OpenCL will eventually come to other frameworks. But the
main problem is the fact that AMD is still far from NVidia in terms of
performance. Vega couldn't reach the performance of the 1080 Ti, and with
Volta next year, the gap is going to drastically increase. If only AMD can
fill the gap, the support, I'm sure, will soon come after that.
------
gbrown_
It will be interesting to see how "discrete" these GPUs will be. I'm assuming
they will only be "discrete" in the sense that they are not on the same chip,
but rather the same package (via EMIB).
Either way surely this is a move by Intel to take away from Nvidia's consumer
share (which makes up the vast majority of their income) as Nvidia make
inroads into the data center market?
~~~
wyldfire
The big win that discrete GPUs provide to the cloud/backend marketplace (that
Intel sorta plays in via Xeon Phi) is from large banks of VERY fast memory
coupled with fast-clocked vector processors. But without a bunch of HBM or
something similar, the discrete GPU won't be able to do training at the scale
that NVIDIA and AMD do.
~~~
gbrown_
One would assume that in the data center for discrete cards Intel would do
something with their Nervana acquisition and HBM, or possibly (but less
likely) MCDRAM.
------
chucky_z
I don't know too much about Raja Koduri, but is leaving AMD and immediately
joining Intel not... really shady?
~~~
Maskawanian
How is it shady to change jobs? This notion that you should be jobless for a
period of time is ridiculous.
~~~
mathperson
They are companies directly competing... it is very possible an AMD engineer
could bring proprietary information to Intel. This literally happened recently
and received massive press coverage with Uber and Waymo - it resulted in
intense legal action.
~~~
awalton
The fact is that people have to have job mobility, and need to be trusted that
when they leave a company, they leave behind that company's secrets. Many
companies make you sign a document that attests this: if you have any company
data, you destroy it, if you have any company equipment, you return it, if you
have any company knowledge, you forget or neglect to discuss it.
Most people, honest people, have no problems understanding these obligations
and abiding.
Dishonest people, who lie about destroying documents, are why we have Uber and
Waymo battling it out.
~~~
userbinator
_if you have any company knowledge, you forget or neglect to discuss it._
Given how the human brain works, that's very much impossible to do...
"standing on the shoulders of giants" and all that, as the saying goes.
I'm sure some companies would love to be able to "reformat" employee's brains
when they leave, but (fortunately) that's not the reality.
~~~
ams6110
> standing on the shoulders of giants
Of course. No question that you take the sum of your education and experience
with you to each new job. The "company knowledge" limitations are around
specific trade secret inventions or verbatim recreation of such.
------
LoSboccacc
Remember when project Larrabee could raytrace Quake in real time? Hope this
will be another stab at a hybrid GPGPU.
------
andreiw
Fool me once, shame on you. Fool me twice, shame on me.
Unclear what AMD thought they stood to gain with the Monday announcement - and
it didn't take long to have it play out in their disfavor.
I'm guessing Intel's GPU will never support OpenPower and Arm servers, and
will never ship on a CCIX-enabled adapter.
------
throwaway613834
Can someone explain how I'm supposed to interpret this along with the other
recent article on Intel & AMD creating a joint chip of some sort? Are they
competing or cooperating?
~~~
djrogers
They’re not creating ‘a joint chip of some sort’ - AMD will be selling GPUs to
Intel who will package them with their CPUs via EMIB.
[1]
[https://www.intel.com/content/www/us/en/foundry/emib.html](https://www.intel.com/content/www/us/en/foundry/emib.html)
------
wyldfire
Also here:
[https://news.ycombinator.com/item?id=15651848](https://news.ycombinator.com/item?id=15651848)
------
mc32
Since the days of Chips and Technologies, Intel has vacillated between going
the whole hog and retreating from GPUs.
Wonder if this time they will stick with it for the long haul.
------
perseusprime11
I am not sure everyone here knows about Raja. He is a talent at a totally
different level. Big loss for AMD. AMD should have done all it could to keep
him.
------
alextooter
Intel picked the wrong guy and the wrong path. It just won't work.
I think Intel should acquire Nvidia, and let Jen-Hsun Huang lead the new
company.
------
krisives
Brings back memories of Larrabee, their last attempt at making a GPU before
they scrapped the project and wasted everyone's time.
~~~
pjmlp
I was in the room at GDCE 2009, where they were praising the vector
instructions while presenting a session on Larrabee.
------
abiox
intel once tried to do this with larrabee[0] some years back. hopefully they
learn from what went wrong there.
[0]
[https://en.wikipedia.org/wiki/Larrabee_(microarchitecture)](https://en.wikipedia.org/wiki/Larrabee_\(microarchitecture\))
~~~
awalton
Larrabee was an attempt to see if the x86 architecture could power a GPU. The
answer was "not very likely", but it got turned into a product of its own
anyways because it turned out to be very interesting for other compute-heavy
use-cases. Larrabee's descendants became "Knights", which became the Xeon Phi
product line.
Keep in mind Intel currently builds GPUs - just of the integrated variety.
What's new here is that Intel is deciding to build discrete (standalone, like
those you'd plug into a PCIe port) GPUs.
~~~
craftyguy
It was definitely being targeted to compete with other discrete graphics
products, and at some point in the program they figured out that they would
never meet the performance necessary to compete effectively. So in order to
not have completely wasted several years of development, it was re-purposed as
a product targeting HPC (the first generation Knights/Xeon Phi product)
~~~
awalton
Intel really doesn't mind "wasting" time on innovation - they make tens of
billions of dollars a year and they're on top of the market. They can afford
to go down blind avenues, especially when the research spills out so well, as
it did in this case.
It definitely wasn't a "saving throw" that Larrabee's architecture got
repurposed. There were several teams at Intel working in similar directions -
one team worked on a "cloud on a chip", one team worked on high bandwidth
chip-to-chip interconnects, one team worked on on-chip networking... they all
came together and formed the Knights Ferry research project, which then got
turned into the Xeon Phi.
The "core" of Larrabee, its quick little Pentium-derivatives, went on to be
repurposed in the Quark product line and its lineage (e.g. the Intel PCH has a
"Quark" inside). The 512-bit instruction set got parted out and became AVX512
in is various incarnations. They definitely got their money's worth out of
Larrabee.
Nobody is disagreeing with the fact that Larrabee didn't turn into a discrete
GPU despite their attempts to make it so. (It's also not surprising, seeing
the carriage turn back into a pumpkin as Cell and other many-core
architectures failed to pan out as good at graphics workloads.) But that's a
separate issue
from Intel building GPUs, since they have a _completely other team_ that works
on building productized and shipped GPUs.
------
microcolonel
Wow, this might save Intel. They are floundering in the server market right
now because they won't put enough PCIe lanes on their platforms, because that
means lower sales. If they can grow up in the GPU market, then that means they
basically win over AMD's latest maneuvers.
------
dis-sys
This again shows AMD is not ready for the battle with Intel.
Ryzen's chief architect left in 2015; now the mastermind behind its GPU is
leaving. You need to be really religious to believe that AMD is going to get
any better in the coming competitions with NVIDIA and Intel.
------
reiichiroh
Ah, they sort of half-assed it in trying with Real3D. Was that Larrabee?
------
nilsocket
Never mind; to me it seems Raja has expectations that are too high, which
makes him a bad hire. Hope AMD finds someone committed and a real enthusiast
to do the job.
------
CrunchGo
Nice! This will be a great win for Intel.
------
cpatil
Interesting timing for this announcement, given Nvidia earnings tomorrow.
Looks like Intel is back to its underhanded shenanigans.
------
m3kw9
Trojan horse 101 lesson for AMD
------
yalogin
What is a discrete GPU?
~~~
jaas
A discrete GPU is a GPU that's not on-die with the CPU. A discrete GPU is
usually something you stick in a PCIe slot.
A GPU on the CPU die, non-discrete, is often referred to as an "integrated
GPU" or "integrated graphics." They're typically not very powerful, though
they run common non-gaming applications just fine.
------
lonk
Nvidia and ATI should develop x86 CPUs for balance's sake.
------
mariusmg
Larrabee reloaded.
------
user982
I can't wait for the twist in tomorrow's episode.
------
rurban
What? Is there no noncompete clause? Strange
------
gigatexal
what a coup this was.
Raja: "I'm...um...going on sabbatical." Lisa (CEO): "OK." Intel: "We're hiring
Raja!!..." Lisa: "WTF".
~~~
fermienrico
At the risk of turning HN into Reddit, I'd like to politely suggest to keep
jokes, puns, and other shenanigans off of Hacker News. If you have nothing
constructive to add to the discussion, please refrain from commenting. Thank
you :)
~~~
gigatexal
why do we all have to be so stoic and serious? can't we have a bit of fun?
~~~
fermienrico
Most people on HN are not looking for fun. They're here for information,
intelligent discussion, and constructive criticism.
Use Reddit for fun! There is plenty of fun on the internet. Unlike Reddit,
hackers here don't want a [Serious] tag.
A CQRS Implementation with ASP.Net MVC and Entity Framework - harisb2012
http://web-matters.blogspot.com/2014/08/cqrs-with-aspnet-mvc-entity-framework.html
======
emp_
Dapper was a great fit when we had a pretty ugly Entity Framework 1.1 (eek!)
build a few years ago, so all the heavy lifting of writing and handling entity
trees was left to the ORM, and pure SQL queries materialized by Dapper were
used for consumption.
~~~
teh_klev
Dapper is excellent, I use it everywhere now. I've tried everything over the
years from .NET 1.0 ADO constructs to LINQ (which I quite liked) to EF to
NHibernate, I could go on. But nothing comes close to Dapper for speed and
simplicity.
Ask HN: I want to learn how to code. Can anyone tell me how to start learning? - theprotagonist
Greetings,

I am a recent college graduate (as of this past May) and I studied chemistry and physics. I have plans to go to graduate school within a year for theoretical and computational chemical physics - that's basically fancy talk for predicting physical interactions between molecules using high performance computing. I have some experience in the field from working as a research assistant as an undergrad and while I never had any problems with the physics, I struggled a little bit with the coding aspect because of lack of previous exposure.

I also read this post "To founders who can't code"

http://news.ycombinator.com/item?id=1761530

It really hit home. I have had two failed ventures thus far which likely would have taken off better if I had the coding abilities to make my own demo. Instead I took the seed funding I received and hired people to do it for me - long story short things didn't work out; I lost money and disappointed people.

The author of the aforementioned post recommends that noncoder founders should: "Take 6 months off and go learn how to code (day and night, weekends including)." This is what I would like to do but I am unsure as to how to begin. Sure, there is a wealth of sources but I am unsure of which ones are quality sources and also what languages to learn first. I am not trying to be a coder or hacker overnight and my approach is methodical: I will devote 8-10 hours per week to learning. I generally pick up things fast; the key, though, is having interesting problems to solve which increase in order of difficulty.

If anyone can give me a few pointers on how I can start learning (what resources are good, what language to begin with, a good program of studies) that would be much appreciated as it would help me develop my future ideas on my own and would probably also help with my research. I thought a cool initial project would be an applet which queries each line of a word document with book titles (I keep a running list of all the books I've read, one per line in a doc) and searches the net for a picture of a book cover and imports it into the app. Apologies for my ignorance but would this be a realistic project within 6 months?

Lastly, I did run a search on this but didn't find any related threads; I apologize if this Q has come up before.

Thanks for your advice,
AKD
======
zedshaw
I wrote this:
<http://learnpythonthehardway.org/book/>
It's free. Do this:
1\. Use your current computer. It doesn't matter if you have Linux, OSX, or
Windows. What matters is that, right now, you want to learn to code, so you
should go learn to code, not learn to setup a new OS.
2\. Just use gedit. Don't use vim, vi, emacs, or any "hardcore" editor. On a
Mac if you're using a non-English keyboard, use Textwrangler. Learning a new
editor is _not_ learning to code.
3\. Start now, do what I tell you in the book. Type code in, do _not_ copy-
paste, make it run, fix it until it does, do the extra credit, then go on to
the next one.
4\. Other programmers will tell you to use their favorite tools, just ignore
them. Just use gedit, Terminal (cmd on Windows), and python. That is all.
Nothing else. Everything else is a distraction.
5\. Finally, do it every night, for 2 hours a night, and take a break on one
day. You'll be surprised how quick you can get through the book, and you'll
get stuck sometimes, but keep doing it.
After that, move on to other more advanced topics and try to learn more stuff,
but for now, just do this.
~~~
creativeone
Zed, thanks for the book, I just got started with it.
I have heard that when done with the book, I still won't be able to program in
Python. Can you elaborate on that?
Also, what do you recommend after completing your book? I'm interested in
building web apps.
~~~
shareme
Zed's right: start with a non-IDE solution such as a basic editor, as it
forces you to think and get at the object-oriented abstraction of the
programming language you are attempting to learn.
~~~
theprotagonist
Zed thank you, someone posted a link to your site last night and I think it is
very right for me. I will use it.
As per the Windows vs Linux debate - I have found most physics labs use either
various Linux distros (Red Hat is super popular) or OSX. I myself have a
MacBook Pro running OSX, and so I just downloaded and will be using
TextWrangler as an editor to complete the lessons.
------
emilepetrone
I was in your shoes 11 months ago, so let me give you some real-world advice.
If you have a friend that knows any language (Python, Ruby, PHP, etc) start
there. They will be there when you have questions, and having a person to turn
to is the most important thing. However, let's say you don't have a friend to
turn to - start with Python.
Try Google App Engine to get started so you don't have to worry about dealing
with a server. Start here <http://learnpythonthehardway.org/>
Find a copy of <http://oreilly.com/catalog/9780596801601#toc> for
understanding basic ideas with HTML, CSS, JS and Python/GAE.
In the meantime, start a blog and write about your struggles and the things
you learn. Initially it won't really be about coding - but the main thing is
showing momentum to the outside world. (Trolls will hate; just ignore them.) By
getting your name out there, people will be more interested in helping you.
Start there. If you have questions, my email is emile (at)
------
tjr
I think it best to learn how to code by working on an actual project. You seem
to already have a project in mind, and I think that would be a fine project.
Break it up into steps. First write a program that opens a file, reads the
book titles and displays them on the terminal. Then write a program that makes
some sort of network connection; maybe download the HTML contents of a
website. Etc., etc., until you've learned how to do the various subtasks
involved with your project, and then assemble it all together.
I'd suggest that Python would be a good language to start with. I personally
like the O'Reilly book _Learning Python_ , though there are many options.
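To give a feel for how small those first steps are, here is a rough Python 3
sketch of the book-cover idea - assuming the list lives in a plain text file
(one title per line) rather than a Word document, and using Open Library's
public search and cover endpoints purely as an example of a service you could
query (any book API would do):

    # sketch only: read titles from a text file, look up a cover image URL
    import json
    import urllib.parse
    import urllib.request

    def read_titles(path):
        with open(path) as f:
            return [line.strip() for line in f if line.strip()]

    def cover_url(title):
        query = urllib.parse.urlencode({"title": title, "limit": 1})
        url = "https://openlibrary.org/search.json?" + query
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        docs = data.get("docs", [])
        if docs and "cover_i" in docs[0]:
            return "https://covers.openlibrary.org/b/id/%s-M.jpg" % docs[0]["cover_i"]
        return None

    for title in read_titles("books.txt"):
        print(title, "->", cover_url(title) or "no cover found")

Each piece (reading a file, building a URL, parsing JSON) is a self-contained
exercise, which is exactly the point of breaking the project into steps.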
~~~
stevenp
Agreed. I'm self-taught and have been working in the industry for 12 years
now. The way I got started was that I had a project I needed to do, so I
figured out what I needed to know to get it done. The same is true with my
iPhone app -- I knew nothing about Objective-C when I started it, and it's 3
years old now. Learning for the sake of learning is fine, but learning how to
build something you want to build and really seeing it come to life is a
pretty phenomenal way to learn.
~~~
conradev
I also agree. I tried reading a book on Objective-C, read the first two
chapters, and ditched it for the iPhone SDK. I messed around, and tried making
some ideas come to life. It took me a couple months, but eventually I was
fluent in Objective-C.
------
wvoq
Hi. Most of the advice in the comments already posted is sound, but none of
them seemed to address HPC. Working on a cluster invites an entirely different
bundle of conceptual and practical hurdles (e.g. parallelism, working
remotely, industrial-strength shell scripting &c.) Even though I had been
programming since I was a kid, I found my crash-course in HPC to be quite
challenging; confronted with a new programming model in a new low-level
language, it was the first time that I really appreciated what it must be like
not to know how to program at all.
So: code as much as possible. As soon as you can possibly stand it, look into
MPI4py and start parallelizing your code. Chances are you won't be working
with python in HPC contexts, but learning parallel programming at the same
time as C or fortran would be needlessly difficult. I would also try to get
time on a cluster as soon as you're in a position to use it respectably. Most
universities with HPC facilities have an online application for an account,
and some sysadmin might take pity on you :) Otherwise, maybe Amazon has some
kind of deal?
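A first mpi4py program really is tiny. A sketch, assuming an MPI
implementation and mpi4py are installed, run with something like
"mpiexec -n 4 python psum.py":

    # each rank sums its own slice; rank 0 collects the grand total
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    n = 10000000
    chunk = n // size
    start = rank * chunk
    stop = n if rank == size - 1 else start + chunk

    local_sum = sum(range(start, stop))
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)

    if rank == 0:
        print("total:", total)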
In the interim, become _very_ comfortable with bash and general command line
fu, and a serious text editor. Good luck!
~~~
wylie
Do you think that learning about parallelism would help a beginner to learn
how to program, or to understand code better? I consider myself an
intermediate programmer, with about five years of experience, but have never
explored HPC. Not sure it would have helped me more to have done so as a
beginner.
~~~
wvoq
I'm not sure, really. In Saeed Dehnadi's article "The camel has two humps",
it's mentioned that there are three great filters in programming pedagogy:
* assignment/sequence
* recursion
* concurrency.
Most students never master the first, and most of those never master the
second, and so on. In that spirit, I would recommend that the OP firmly grasp
the first two with both hands before reaching for the third. Of the languages
the OP could study that treat concurrency or parallelism as a kind of
conceptual primitive (e.g. Scala or Erlang), none are likely to appear as
working languages in an HPC milieu. Almost always, the libraries are bolted on
as an afterthought to traditionally popular languages for scientific
computing.
I think this is a regrettable pattern, but a pattern which will shape the OP's
daily work if they wish to begin (and remain in) a program in computational
chemistry.
All I meant to suggest is that when the OP begins to learn parallel computing,
they want to be thinking about parallel algorithms, not segfaults or pointer
purgatory or the finer points of scp or vi. The sooner the former can be
mentally sublimed, the better the OP will feel about HPC.
------
SIK
I am in the process of doing exactly this.
If your ventures are web apps, my recommendation would be to learn Ruby on
Rails. You will be able to build demo apps within a few months of 8-10 hours
per week.
I started with RailsTutorial.org, which is a free book that will take you from
installing Ruby to building a twitter clone. From there, get a few books, I
recommend Agile development with rails and The Rails 3 Way, and continue to
work on some smaller apps. There are also great screencasts you can find by
searching for "railscasts."
If you are using Windows computers, install Ubuntu Linux, which is really easy
with Wubi. I have found it makes things easier. If you have a Mac, stick with
it.
For a text editor, I use Sublime Text 2, and if you're on a Mac, just go with
Textmate.
Sign up for Github and learn about version control. Also, go through projects
on Github and learn by reading other people's code.
After you have a semi-grasp of the basics, start building something
substantial.
Search Stack Overflow when you have questions, and if you've been trying to
figure something out for over six hours, ask a question on Stack Overflow.
For HTML and CSS questions, I generally just google any issues I have and fool
around in Firebug, which is a Firefox extension that lets you edit HTML or CSS
and see the changes on your screen.
Best of luck!
------
JacobIrwin
You need to take a bottom up approach. Fill your time studying the very basics
(i.e., programming languages, components, specifications, history, etc.).
Online video lectures [beginning with] Programming Languages were where I
ended up learning the greatest amount in the first weeks.
I started with MIT OpenCourseWare. I was very fortunate to find this lecture
series:
[http://www.youtube.com/edu?edu_search_query=intro+computer+p...](http://www.youtube.com/edu?edu_search_query=intro+computer+programming&action_search=1)
because it is a class taught at MIT for students entering the CS or
engineering programs who have little or no background in programming
languages.
Hint: Pay attention in the first and second videos close enough and you'll
learn how to locate online resources that are provided to the enrolled
students.
------
cipherpunk
Grab yourself a copy of Racket (<http://racket-lang.org>) [also, the
quickstart guide at <http://docs.racket-lang.org/quick/index.html> might be
useful], the How to Design Programs textbook
(<http://htdp.org/2003-09-26/Book/>), and most importantly, set aside some
solid blocks of time to dive in. You will learn most by doing, and through
doing you will gain understanding.
------
akulbe
By coding.
That may come across like a smart-ass answer, but it's not. I'm in the same
boat, and I'm learning the same way. I've got a Mac dev environment, and Linux
dev environment. I'm using a book by Stephen Kochan, and another by Dave Mark.
We're _surrounded_ by a plethora of materials to help us learn.
Just do it! Pick one up and read, and write some code. Stumble through. You
_will_ make mistakes.
See emilepetrone and @housefed for a good example of this. He posts on here
all the time. He's only been coding for a year, and has a functional website.
------
waynecolvin
Python should be a reasonable first language but you might have need for
others later. Use a simpler editor at first so you can concentrate on coding,
not working the editor itself. The editor should be able to show line numbers
so when your program reports errors on a line number you can find it. Syntax
highlighting is a plus. Code most every day (take some breaks), but please
think your problem through before committing to a solution! Be sure to read code
from others to pick up tips/style. Work through some books or something. You
don't have to be a top-notch expert all at once. I think the Perl community
says it's okay to just use the parts of the language you understand until you
learn more! (However becoming fluent in gritty details will make things
smoother when you don't need to look up mechanics as often. A musician needs
to know their scales.) Try rewriting old projects when you get better. That
implies saving your work somehow. Be sure to learn how to create modular
libraries and to use libraries written by others! You can't put everything
into one source file unless it's something simple and there's useful
functionality to be found.
------
bfung
I've been in the process of teaching/pointing in a general direction a couple
of friends on how to program. "Learn Python the Hard Way" was far too boring,
and wasn't practical enough for my friends.
If you're used to and good at learning in a school format, MIT OpenCourseWare
is excellent. My friends also liked the videos better than learning by book.
[http://ocw.mit.edu/courses/electrical-engineering-and-
comput...](http://ocw.mit.edu/courses/electrical-engineering-and-computer-
science/)
I would go in this general order for beginners. Do all the assignments, and
don't cheat. Ask for help on explaining the solution. These courses help in
building good fundamentals, then apply what you learn to a personal project.
1. 6.00 - Introduction to CS and Programming (Python)
2. One of: 6.087, 6.092, or 6.096&6.088, (C, Java, C++) respectively.
For scientific computing, pick C and/or C++.
3. 6.046J - Introduction to Algorithms
From there, 6.001, and perhaps the database course (I think experiencing
databases hands-on is much easier than a taught course).
Doing #1 already goes a long way to your proposed project.
------
kaptain
Look for a mentor who is already working on a real-world project. The problem
with learning things on your own is that the examples from which you are self-
learning are (by design) too simple and often unrelated to a result that you
want.
That's not to say the tutorial/source would be completely unrelated, but it
sounds like from your post, that coding isn't something that comes easily
which means that in order to make the leap to something that you actually want
to do, you'll need more than a book to guide your way.
The great thing about a mentor is that they (should) will have a real life
problem for you to solve and they can help you work through some of the
subtleties of the problem.
I used to think that anyone can learn to code (this was when I was 18). I
find, now, that there are people that are more naturally inclined to it and
some people that will never be able to, because their brains don't work that
way. I would gently encourage you to be open to discovering where you lie on
that spectrum and not to be too disappointed if you've tried but still find
yourself swimming in molasses.
------
podopie
The best advice I can give you is to write as much as you can before you start
coding. Putting a program or app together is 60% critical thinking, 20%
writing, and 20% error management. If you've already got some ideas, try to
break them down as much as possible.
As for specific languages to learn: startups in particular seem to love Ruby,
but honestly, they all work the same way, fundamentally. I crashed through the
basics of Ruby in a week, started learning JavaScript, and stopped shortly
after realizing that the majority of it was the same code just written
differently.
++ to keeping a journal. It keeps you in check, because even though you say
you will devote so much time a week to it, you won't. I had to start setting
personal goals on a daily basis. That drive alone is helping. Keeping a blog
is great for peer support too. I don't get many comments on mine, but it
definitely feels good when someone stops by to say, "Hey, this is cool stuff."
Hope that helps, and good luck!
------
ISeemToBeAVerb
I'm new to coding as well, but so far what I've gathered is that WHAT language
you learn is less important than actually sitting down and getting dirty with
code.
When I was trying to decide what to learn I narrowed the search down by just
heading to the book store and flipping through some books on various
languages. Ultimately, I ended up with a choice between Ruby and Python. I
couldn't tell what the major differences were, so I just decided to pick Ruby.
I figured that there was no real way of recognizing the nuances of ANY
language until I actually had one under my belt and could better understand
what makes each one tick.
I'm fully aware that this was a somewhat cavalier method of choosing the pal
I'd be spending the majority of my waking hours with, but I think that
starting anywhere is better than stalling because you can't decide.
So far I'm satisfied with my choice. I think I would have been satisfied with
Python too.
------
chubs
If you end up choosing python, a popular first book is:
<http://learnpythonthehardway.org/>
A lot depends on which language is going to be helpful in your computational
physics class: you should find out which language they'll expect you to work
in.
------
coryl
I'm actually 2 months into learning to program Javascript for the Unity3D
engine. I've learned a lot in 8 weeks, and I'm happy I took the time to commit
to it. I could actually start prototyping my own basic mobile games with
Unity, something I never thought I'd be able to do.
So I guess from my experience: \- Find a single good source or book for
tutorials and learning. Its best for the coding style and teaching style to be
consistent. I used a fantastic series of free video tutorials produced by the
Walker Brothers, which included 3 entry exams, and a series of lab assignments
after each tutorial set. I had to submit the work in order to get access to
the next set.
\- Find a good Q&A source like Stackoverflow, forums, or a site more specific
to what you're coding in. When you get stuck or don't understand something, go
and ask (search first). +1 if you have programming friends to ask too.
\- Keep a journal (really!) on Google docs. At the end of the night, you can
quickly re-hash what you learned (cs concepts, or cool functions you learned),
or often write out the things you don't understand. That way, when you start
up again, you can do a quick review on where you left off and get back to
figuring out things you previously were stuck on.
\- Try not to skip past things you don't understand. If you don't understand
them, take the time to practice out the code, or look up documentation. A big
key point: its always worth it to invest the time to figure little problems
out. I once spent 3 hours trying to get some timer controllers working just
perfectly the way I wanted. They were actually OKAY to begin with and I was
considering skipping past it for the sake of productivity, but in hammering
out the problem, I gained confidence and had the satisfaction of solving a
problem.
\- Get your things WORKING! As beginner programmers, our first concern is
making what we want happen. Not pretty code, not computer science theory. Just
results. Getting results fast gives you the confidence to try harder things,
which will naturally take you into the world of organized code and computer
science.
Good luck!
------
theprotagonist
Lots of informative advice, I am very grateful to all of you. Thanks again. I
think I will go with Python or Ruby after I check out the resources in the
links that were posted. I plan to build my knowledge but as a person with a
science background, I'm happiest when I'm solving a problem, so I definitely
see the merit in working on the project whilst learning.
I don't think I have any further questions - I got loads more helpful advice
than I thought I would ever get and again, I'm very grateful. The only couple
of things I feel are worth mentioning are that I am using a Mac, and that I
have actually written a couple of subroutine packages for HPC in FORTRAN, but
they are nothing too special as FORTRAN syntax is very simple.
~~~
cipherpunk
I would also suggest one more thing: <http://mitpress.mit.edu/SICM/>
Since you say you have dabbled with physics, you may find this sort of method
useful.
------
espeed
Set up a Linux computer (Ubuntu Linux is the easiest to set up), and spend the
summer learning to program in Python.
Python is easy to learn (not much syntax), easy to read (explicit vs
implicit), has a big ecosystem (more packages/libraries), is taught at
universities so it's easy to find good programmers to help, and is used by
many large websites/companies so it's a good language to know.
Here are some of the best online Python tutorials, including a link to videos
and course material for MIT's introductory computer science course, which uses
Python: [http://www.quora.com/How-can-I-learn-to-program-in-
Python/an...](http://www.quora.com/How-can-I-learn-to-program-in-
Python/answer/James-Thornton)
Build something that you want to use so it will be meaningful to you. Do you
have a blog? That's usually a good first exercise. It's easy to do using Flask
-- follow the tutorial (<http://flask.pocoo.org/docs/>).
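To give a sense of how little code that takes, here is a minimal Flask app (a
sketch only to show the shape of it, not the tutorial's code; save it as
app.py and run it with "python app.py"):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        # A real blog would render templates and read posts from a database.
        return "Hello, blog!"

    if __name__ == "__main__":
        app.run(debug=True)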
Here are some tips to get you started:
Use Emacs as the text editor to write your code -- it usually comes pre-
installed on Ubuntu, and it has a Python mode. Here are some Emacs tutorials
(there are some good videos on YouTube too):
[http://philip.greenspun.com/teaching/manuals/usermanual/emac...](http://philip.greenspun.com/teaching/manuals/usermanual/emacs.html)
[http://www2.lib.uchicago.edu/keith/tcl-course/emacs-
tutorial...](http://www2.lib.uchicago.edu/keith/tcl-course/emacs-
tutorial.html) <http://www.gnu.org/software/emacs/tour/>
<http://cmgm.stanford.edu/classes/unix/emacs.html>
Use PostgreSQL as your database. To install it on Ubuntu, use this command:
$ sudo apt-get install postgresql
Use SQLAlchemy (<http://www.sqlalchemy.org/>) to connect your Python website
to PostgreSQL.
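As a rough sketch of what that looks like (the connection string below is made
up -- substitute your own user, password and database name, and you'll need the
psycopg2 driver installed for the postgresql:// dialect):

    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://user:password@localhost/blog")

    with engine.connect() as conn:
        # Simple sanity check that the connection works.
        print(conn.execute(text("SELECT 1")).scalar())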
Here's a good SQL tutorial: <http://philip.greenspun.com/sql/>
When you build a blog, you don't have to worry about building a public
authentication and comment system if you use something like Disqus
(<http://disqus.com/>) -- you just include the Disqus JavaScript tag at the
bottom of the blog's entry page.
Here are some good JavaScript tutorials: [http://www.quora.com/What-are-good-
books-preferably-found-on...](http://www.quora.com/What-are-good-books-
preferably-found-online-for-free-like-eloquent-javascript-for-learning-
javascript)
Use StackOverflow to ask programming questions: <http://stackoverflow.com/>
UPDATE: Here are links to some commonly-used scientific Python packages
(<http://www.drewconway.com/zia/?p=204>).
~~~
wisty
The Python Tutorial (<http://docs.python.org/tutorial/>) is really pretty
good.
Now, if you are doing more researchy work, then a lot of the web stuff is
peripheral.
My advice would be to use scipy (the swiss army knife of scientific
programming with python), matplotlib (for 2D plots), _something_ for 3D (maybe
Mayavi2?), NetworkX for networks, PyTables for storing read-only data, the
inbuilt csv library, ctypes or weave for performance ... and domain specific
libraries here: <http://www.scipy.org/Topical_Software>
But don't worry about all that yet. You can hack together a good demo with
nothing but scipy and matplotlib.
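For example, here is a tiny demo in that spirit (a sketch only): integrate a
damped oscillator with scipy and plot it with matplotlib.

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy import integrate

    def rhs(y, t):
        x, v = y
        return [v, -0.1 * v - x]   # damped harmonic oscillator

    t = np.linspace(0, 50, 2000)
    sol = integrate.odeint(rhs, [1.0, 0.0], t)

    plt.plot(t, sol[:, 0])
    plt.xlabel("t")
    plt.ylabel("x(t)")
    plt.show()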
------
brudgers
My advice, start with javascript:
1\. Learning C-like syntax provides a basis for reading a lot of other code.
2\. Javascript examples can be seen on any webpage simply by switching to
developer mode.
3\. The javascript console allows experimenting with code while you read about
programming on a webpage without leaving your browser.
4\. It is perhaps the most widely used programming language currently.
5\. Even if you are not interested in the DOM and webpages, there are still
interesting exercises you can do in Javascript - I recommend Project Euler.
------
dfrankow
Whatever language you choose, consider trying a few problems from
<http://projecteuler.net>. The problems are small enough that you can feel the
reward of getting the right answer. I used Python.
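For instance, the very first problem (sum of the multiples of 3 or 5 below
1000) fits in one line of Python:

    print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))  # 233168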
As an experienced programmer with math background using a language I knew, a
problem took me 15-60 minutes. Not knowing a language or how to program, it
might take several hours, but not weeks.
~~~
AlexMcP
I'd second this notion, but with the added caveat that without a 'mathy' way
to solve the problem, you can't get much beyond the first 15 or so (if you're
good at math, great, if you just want to put down some code to solve a simple
problem... I got frustrated :) )
------
aangjie
I really won't add too many links to all the comments here. But given that you
mention physics and some RA experience, I will say Haskell may not be as hard
as a few others think (i.e., judging by my idea of what it takes to complete a
physics and/or chemistry degree). I think in the end, you may have to
spend an hour or two on each of the links before taking a call on which you
prefer _.
_ \- Eeks, that looks eerily like common sense :-)
------
known
You need to learn about
https://secure.wikimedia.org/wikipedia/en/wiki/Apache_Subversion
https://secure.wikimedia.org/wikipedia/en/wiki/Debugger
https://secure.wikimedia.org/wikipedia/en/wiki/Profiling_%28computer_programming%29
and I believe <http://perldoc.perl.org/index-tutorials.html> is easy to learn
for a novice
------
trbecker
I was about to suggest "Learn You A Haskell For Great Good" to you :) Way too
complicated. Listen to the other guys that don't suggest Haskell. If you then
discover your calling in programming, go back and learn Haskell at your own
peril. Also try to stay away from LISP in your first lessons, and its ugliest
kid, emacs. Use a simpler text editor that won't twist your fingers. gedit
would be my suggestion.
------
mikeburrelljr
Rails for Zombies, of course!
<http://www.codeschool.com/courses/rails-for-zombies>
Nom nom nom...
------
matsimo
I don't mean to hijack the thread, but I'm curious; how did you go about
receiving seed funding (and enough to hire people) for a tech/web startup
while not having strong coding abilities yourself?
Did you hire people with the intent of having them build a demo for you?
And for other readers... Do either of these things happen much? Does it work?
------
franze
yesterday i showed my 12 year old kid (who, to my disappointment, is more
interested in sport, hip hop dance lessons, music (tuba) and girls than in
code) <http://processing.org/> \- together we managed to write a drawing
program with a game component in about 2 hours (he typed....) - even though the
language is in no way perfect or even pretty, he finally "got" it.
so before starting with a small project, try to implement a one person tron /
snake game in processing - it helps you to start "codethink".
after that i would recommend reading a shitload of books. if you are into
mobile apps, try <http://ofps.oreilly.com/titles/9780596805784/> it's
basically a tutorial for making a simple web app in HTML, css, js with no
prior knowledge required.
------
DomainNoob
I would also mention the Lynda.com videos. I'm running one of their Perl
tutorials now working in Eclipse and finally seem to be getting somewhere. And
I really wish someone would put together an intro to programming series using
Yahoo Pipes as a platform.
------
creativeone
I am also in your shoes, and have started with
<http://learnpythonthehardway.org/> It seems to be one of the best ways to
learn Python.
------
ZaneClaes
I wrote an article on this subject a few weeks back. It is not a step-by-step
"how to write code," but rather a good way to understand the approach to self-
teaching yourself computer programming.
------
ryanbigg
I second the advice about setting up an Ubuntu machine. Windows is
unnecessarily painful for development in comparison. Although there are
"workarounds" for its problems, you'll find development much easier
on an Ubuntu or Mac computer.
Now for a language recommendation. I am a Ruby programmer, so I've got a
pretty heavy lean towards that.
Ruby is an exceptionally easy language to learn. There's a book called Learn
to Program written by Chris Pine (<http://pine.fm/LearnToProgram/>) which is
an amazing beginning to getting into Ruby.
Past that, there's the Well-Grounded Rubyist by David A. Black
(<http://manning.com/black2>), which covers everything from basic Ruby up
to medium-advanced levels. There's also Programming Ruby 1.9 by the
Pragmatic Programmers (<http://pragprog.com/titles/ruby3/programming-
ruby-1-9>).
If you want to brush up your Ruby skills, the Ruby Koans
(<http://rubykoans.com/>) are also pretty good.
If you're looking to get into web development (well, you ARE on the internet!)
then I would recommend learning HTML and CSS with a book such as HTML 5 and
CSS 3 by Brian Hogan (<http://pragprog.com/titles/bhh5/html5-and-css3>). Then
a good JavaScript book, perhaps something like JavaScript: The good parts
(<http://oreilly.com/catalog/9780596517748>).
After learning as much of those as you can, familiarize yourself with Git by
reading the Pro Git book by Scott Chacon (<http://progit.org/>), or with
whichever other piece of version control software you choose (Mercurial and
Bazaar are good, SVN isn't, and CVS is (I'm pretty sure) the work of demons).
Ah and before I forget, I've got The Developers Code
(<http://www.thedeveloperscode.com/>) bookmarked for late-night reading and
I'm quite enjoying it so far. Quite a lot of lessons in there that I have
learned over my brief career, but ones I wish I had known from the beginning.
One more final thing: you are new here and people will treat you like that. Be
nice to them and they will be plenty nice back. Respect the fact that they
have limited patience and may not wish to answer your questions eternally.
They may also have other people asking them questions at the same time you
are, or have other things they would like to be doing.
You will get better with practice. You show a keen want to learn, which is a
great start. Never give it up. Nothing is "too hard" forever. Persist, and for
the love of god, practice.
~~~
phamilton
While Ruby is a fantastic language and a joy to program in, if you are
planning on heavy computation, python is a much better route to go. While
everything can be tweaked and optimized in both languages, python is generally
favored for performance and ruby for flexibility. Numpy and Scipy are both
quite efficient and provide lots of functionality needed in scientific
computation. Down the road, if you have a script that takes days, weeks, or
even months to complete, python allows you to refactor the bottlenecks in your
code into C without too much difficulty.
I spent a year working in high performance computing at a university and so
much ugly and inefficient code was written in matlab. Had they written it in
python, it would have been much easier to optimize and their research would
have been much more productive.
~~~
troels
I don't think the raw computing power is that important in this case. And if
it was, there are better alternatives than Python.
However, I would suggest that Python is probably a better beginner/learning
language than Ruby. There are many odd ends in the Ruby language that might
easily trip an inexperienced coder up.
------
ignifero
Dude, coding is a lot easier than physics. Stop procrastinating on HN, get a
Linux machine and start your first project. If you want to do web stuff, learn
PHP; otherwise use Python. For your project you would need a library to
extract text from Word documents and something like curl to query Google.
------
bigwally
There is plenty of good material on where to start with programming. Most of
the links other people have provided here are very good.
A fairly good resource is Google Code University;
<http://code.google.com/edu/>
In particular you may want to start with Python basics;
[http://code.google.com/edu/languages/google-python-
class/ind...](http://code.google.com/edu/languages/google-python-
class/index.html)
------
georgieporgie
_interesting problems to solve which increase in order of difficulty._
Project Euler. Though the difficulty tends to sort of spike all over.
Touch Enabled Business Card - ari_elle
http://chasingtrons.com/main/2012/3/2/touch-enabled-business-card.html
======
huhtenberg
I like the idea and I want to steal it, but are there any options for super-
thin batteries? Something that would be literally paper-thin.
~~~
cyanoacry
Powerstream ( <http://www.powerstream.com/thin-primary-lithium.htm> ) makes a
number of paper-thin primary and rechargeable cells. I haven't personally
worked with them, but the specs look incredible.
~~~
huhtenberg
Woah... 0.45mm!
~~~
maggit
That's amazingly thin!
However, it is still thicker than a regular business card [1], and it is
probably unrealistic to make the card exactly the same thickness as the
thickest part on it :(
[1]: Assuming 350g/m^2 paper (<http://en.wikipedia.org/wiki/Business_card>)
with a thickness just shy of 0.45mm
(<http://www.answerbag.com/q_view/2457344>)
~~~
thaumaturgy
FWIW I use plastic business cards which are quite a bit thicker, and pretty
effective. With 3D printers being what they are now, you might even be able to
print your own business cards around one of these thin batteries.
------
jayeshsalvi
Here is another good idea for a touch-enabled business card:
<http://www.youtube.com/watch?v=v3QVdMkg1cs> (More relevant to the purpose of
the card)
Some Chinese movie theaters are covering their screens in text messages - mimighost
http://www.theguardian.com/film/filmblog/2014/aug/20/chinese-cinemas-show-audiences-texts-alongside-film-wrst-idea-eva
======
mimighost
I posted this article because I just want to know why so-called "bullet screen" video is
only popular in East Asia, like Japan and China. Any theories about why this is
not a big thing in the West?
------
venkatesh1017
They can do anything, and in a different way.
What's wrong with cheap food - bensummers
http://smartpei.typepad.com/robert_patersons_weblog/2010/04/whats-wrong-with-cheap-food.html
======
jswinghammer
It'd be nice if the government just stayed out of all this so that people
could decide for themselves what food they wanted to support. Right now the
real costs are all hidden and incentives directed in all the wrong ways. The
food that is cheap isn't actually cheap it just looks that way because the
price signal has been so badly distorted.
I don't want a "smarter" farm policy. I'd like no farm policy. It's clear that
both parties don't actually care about this issue besides making sure that
agribusiness is happy.
~~~
senthil_rajasek
Food is tied to a nation's security. If you haven't read Omnivore's Dilemma
([http://www.worldcat.org/title/omnivores-dilemma-a-natural-
hi...](http://www.worldcat.org/title/omnivores-dilemma-a-natural-history-of-
four-meals/oclc/62290639)) I highly recommend it. Michael Pollan, the author
makes a much better and convincing argument.
Imagine having no Farm Policy and we end up having food shortages, there would
be a revolution.
Personally, I try to buy/eat as much local organic food as I can and
experiment by growing food indoors.
~~~
byrneseyeview
_Imagine having no Farm Policy and we end up having food shortages, there
would be a revolution._
I can imagine. After all, this country has no Plate Policy, and I routinely
end up eating off the floor. And I really wish we had a Consumer Electronics
and Computers policy; I'm really jealous of the country that produced the
iPhone and the Macbook because of those attributes.
Fortunately, the country I live in _does_ have a Real Estate Policy and a
Banking System policy, and they're willing to put my money where their mouths
are! It's working out about as well as one would expect.
~~~
MartinCron
Pure capitalist libertarianism like that works really well if you live in a
society willing to stand back and let people die as a result of different
decisions or even circumstances outside their control.
History has shown that's not really the society that civilized humans want to
live in.
~~~
Dove
Pure capitalist libertarianism is a society where _government_ is willing to
stand back and let people die. Not society. There's a difference.
In time of trouble, you can turn for help to family, friends, neighbors,
private charity, or government. In roughly descending order of efficiency and
humanity.
Including government at the end of that chain is a noble attempt to make sure
no one falls through the cracks--like the federal reserve as a "lender of last
resort". The problem is that, like the federal reserve, the guarantee at the
end of the chain changes the dynamics at the front.
We respond to a government guarantee to care for the old and infirm by not
doing it ourselves. Why plan to care for your mother when the government will
do it? Why support private orphanages when you pay taxes to support
institutionalizing them?
The result is emaciated expectations of family, friends and neighbors, and
sickly private charity. Government may be inefficient and inhumane, but it's
cheap--spending someone else's money--and it's guaranteed. So it dominates the
space. And when the other options die away through disuse, it looks like there
_are_ no other options. And all that's left is dependence on the state.
The idea that it's government care or no care is an illusion created by a
couple generations of government dependence. I am convinced that our way --
institutions, bureaucrats -- is really the _inhumane_ way to care for the
poor. I want to get rid of it, not because I don't care, but because I do.
~~~
MartinCron
I like this concept, I really do, but I'm wondering if it's a sort of
idealized utopia. Are there any societies that take care of their most
desperate members without that being a government function?
Almost all of the industrialized nations have some sort of safety net and/or
universal health care. On the other side of the spectrum you get countries
like Somalia, where they, you know, _do_ let people die.
~~~
Dove
I don't honestly know the proper role for government in charity. I reject it
rhetorically to illustrate that rejection of government charity is not
rejection of charity overall, but I do not know if the extreme case is
practical. You do seem to be right that it is not practiced.
I had thought I would find counterexamples in places like the Antebellum
south, or ancient Rome, and while I found a reduced emphasis on the state, I
did not find it completely gone. And while my off the cuff memory is that
there are places in the world, even today, where the obligation for children
to care for their parents in old age is so complete that people have children
for that very purpose, I don't have a citation for it. And I really don't have
the experience to know whether it's preferable.
On a personal level, I see very vividly the evils of institution and
bureaucracy: the people it misses, the way it mistreats even the people it
helps. I have helped homeless friends with transitions from shelter to
shelter, even opened my home to them when it was appropriate; I don't think
highly of the treatment they get, well out of the public eye. And I'm pretty
sure if the city hadn't been there as an option, I'd have done more, and
they'd have been better off. Is it good the city was there? I guess. But I
hate the false dilemma that says it is the only way, and my instinct is that
family, friends, and community are far better.
I do not know what the correct role is for the state, but I cannot but hate
anything that weakens the responsibility of a man to care for himself, or that
seeks to replace the hospitality that family, friends and neighbors owe to
each other.
~~~
MartinCron
Thanks for the thoughtful response. A lot of people don't seem to like to look
to other countries as examples of how individuals, societies, and governments
can interact. Either as examples or as counter-examples.
NPR's excellent Planet Money podcast did a few shows on Denmark, which is like
an economic bizarro-world. Employment law is such that it's really easy to
fire people, but they have a very strong social safety net (2 years
unemployment, or something like that) so the workforce is very fluid.
Employers aren't as afraid to hire people (as they are in countries with
strong labor laws like France) because they don't have the same fear of legal
or interpersonal repercussions. Firing in Denmark is no big deal.
Taxes are progressive, crazy high, support fully subsidized health care, and
the people there are the happiest in the world.
Note: I'm not trying to say every country should be just like Denmark, but
just that the way we've always done things isn't the only way that can work.
On the other hand, you see people who complain about how "government can't do
anything right" and "all regulation is bad" yet they don't realize the only
reason that their houses survive earthquakes without killing them is that
their houses comply with government designed and enforced building codes.
There's a reason the Haitian earthquake killed more people than stronger
earthquakes in other places. Haiti has had "limited government" for decades.
It's easy to paint limited government as a moral ideal, and it's easy to find
prominent examples of government failures. I personally hate bureaucracy as
much as anyone else. The fact remains that a lot of people (often quietly) get
meaningful help from our overall society via democratically elected
governments.
~~~
lionhearted
This has been a really interesting back and forth, just a quick point:
> There's a reason the Haitian earthquake killed more people than stronger
> earthquakes in other places. Haiti has had "limited government" for decades.
That's actually false. Haiti ranks as the 141st economically free country in
the world out of around 200, and 24th out of 29 in the Caribbean. Government
spending is around 20% of GDP, the income tax rates are comparable to the USA,
but the property tax rate is sky high - up to as much as 15%. So no, Haiti
hasn't had "limited government" the way Singapore or Hong Kong has limited
government. They've got quite a bit of government going on; it just happens to
be both inept and corrupt.
~~~
MartinCron
Good point. I was thinking of limited in terms of what the government does for
people, but I see that it was a terrible example.
I still like building codes, though :)
~~~
Dove
The earthquake in Haiti was so destructive, in general, because Haiti has been
a land under the heel of grinding poverty for many years. It was the poorest
country in the western hemisphere, a place where children were sold into
slavery to make ends meet, where eating dirt to avoid hunger pains was a
practical enough approach to spawn an industry. And then natural disaster hit.
Building codes have got nothing to do with it. My sister went there a few
weeks ago. I've seen pictures. Some of those houses look to me like they'd
fall over in a stiff rainstorm.
[Edit: The hospital my sister helped build is the subject of the video
featured on this page: <http://heartlineministries.org/EarthquakeNeeds.aspx> .
If you haven't seen Haiti, it's worth looking at the community they served and
the problems they solved.]
------
ajscherer
I agree with maybe half the things the author is saying, but I hate the way he
tries to group them all together into some larger narrative of "the way we
live has become synthetic and corrupted and we need to go back to a more
natural way of life." Appealing to this story is only going to convince people
who already believe it. If he has specific improvements in mind, he should
describe them and make his case for how they will help.
Also, he opens up referring to 1 billion people going hungry, but never
returns to this point. If we embrace techniques that produce less food, what
happens to these people?
~~~
teach
That's probably because the blog linked to is mostly just a cut-and-paste from
the linked Time article
([http://www.time.com/time/health/article/0,8599,1917458-1,00....](http://www.time.com/time/health/article/0,8599,1917458-1,00.html)).
This blog post only quotes a part of page 2 of a five-page article, all of
which is worth reading.
~~~
ajscherer
Thanks for pointing that out, I totally missed the attribution to time.com. I
thought the article was just formatted weird.
I agree the full Time article is better, but it still doesn't address the
billion people it mentions going hungry. Ultimately the agricultural system
that is sustainable determines the population that is sustainable, and I am
concerned that the people who aren't going to be sustained won't have much say
in the matter.
------
senthil_rajasek
If you don't have the time to read the whole article
Quick summary :
\- subsidized cheap food has adverse health effects and health care is
expensive
\- affects the ecological system and is not sustainable
\- animals raised in meat factories have to be fed a lot of antibiotics that
have consequences on human health.
------
goatforce5
Why does a salad cost more than a Big Mac?
<http://www.pcrm.org/magazine/gm07autumn/health_pork.html>
~~~
lotharbot
I first saw the article you linked on a day when I'd made hamburgers and salad
for dinner, and it didn't pass the smell test. I calculated the cost per
person of a whole-dinner-plate salad (made from fresh veggies, including bell
peppers, cucumbers, and tomatoes, with dressing) to be about half the cost per
person of a quarter pound burger. This was right after the Chile quake, which
had doubled the price of some of those veggies.
A McDonalds salad costs more than a Big Mac, not because veggies cost more
than meat, but because _McDonalds salads typically come with meat_
(chicken/bacon) that's as expensive as ground beef, and the salad veggies,
cheese, and dressing are more expensive than a hamburger bun and condiments.
(The prices of non-meat salads are anchored to the prices of the meat-
containing salads.)
I'm all for criticizing food subsidies, but let's do it honestly. Even with
subsidies, ground beef costs about twice as much per pound as the veggies I
commonly buy, and ground beef is at the bottom end of the meat spectrum.
Subsidies don't make a Big Mac cheaper than a salad; price anchoring to
chicken-bacon-ranch salads make McDonalds salads expensive.
~~~
infinite8s
You are forgetting about storage costs. Frozen ground beef, buns and
condiments have a much longer shelf life than the typical salad ingredients.
------
davidedicillo
Do you want cheap and healthy food? Eat organic (it costs more) but cook for yourself
and grow your own condiments.
Cooking for yourself is a no-brainer, and you know exactly what you are eating.
Also, I have all my herbs and a few vegetables on my patio (I live in an urban
area in Miami; don't think I have fields behind my condo). When you go to the
store you pay $3 for a couple of sprigs of sage, when you could buy a sage
plant for $3 and keep using it all year long. And the money you save you can
reinvest in more healthy (and more expensive) food.
~~~
ErrantX
> Eat organic (cost more)
For the most part there is no real extra benefit for Organic food - assuming
you are not opposed to the idea of pesticides and other treatments (which are
mostly benign anyway).
Personally all I require from my food is that it has been farmed ethically and
preferably produced on smaller scales (i.e. no Bernard Matthews Turkey)
That can save you a lot.
(seconded on the cook for yourself part - it doesn't have to take long and
it's a great way to unwind :D)
~~~
bpyne
"For the most part there is no real extra benefit for Organic food - assuming
you are not opposed to the idea of pesticides and other treatments (which are
mostly benign anyway)."
From a nutritional viewpoint, what you said agrees with what I've read. I
think there's real worth, though, in experimenting with different farming
techniques that don't rely on mass produced chemicals. Reliance on these
chemicals makes me think of the "putting all your eggs in one basket" adage.
Recently I heard an NPR piece about nutrition studies done across decades. One
line that struck me is that an apple grown in the early 20th century in the US had
the nutritional value of 3 apples grown today. I'm not sure how the
researchers came to that conclusion. But, if it's true and if a similar ratio
applies to other produce, we have the potential for a serious health issue
coming up.
EDIT: Enclosed the quoted text in actual quotes for readability.
~~~
ErrantX
Yes, you're right there.
As I said in my main post I think the real important thing to work on is
ethical farming - regardless of organic or not.
The other thing is that from a sustainability point of view (in terms of the
farming infrastructure) Organic is very cost/labour intensive compared to
other techniques.
There is probably a good balance; as you say we can reach it by experimenting.
~~~
imajes
The problem with saying organic has no benefit is that I can always say
"DDT", and refute your argument.
~~~
MichaelGG
DDT has done a ton of good, too, saving millions of lives. From the wiki
article:
"For example, in Sri Lanka, the program reduced cases from about 3 million per
year before spraying to just 29 in 1964. Thereafter the program was halted to
save money, and malaria rebounded to 600,000 cases in 1968 and the first
quarter of 1969"
~~~
imajes
True.
it's pretty awesome at retarding growth of some of the most virulent disease
vectors - but at the same time, it's also a fairly awful carcinogen and I
wouldn't want it sprayed on my lettuce to keep away some pesky caterpillars.
:)
------
istari
Hated this. Distorted statistics mixed with cliched sound bites and fermented
in rhetoric.
For starters, sugar and starch have more calories than lettuce and fruit not
just for each dollar, but also for every pound.
The author took a couple of well-known symptoms (run-off! animal cruelty!
resistant bacteria!), tossed it together in a salad bowl, and chucked it in
the general direction of the nebulous "food industry".
He doesn't offer deeper reasons of why the system is how it is, doesn't
suggest methods of fixing the situation, and goes no further than
regurgitating one sided talking points.
------
stcredzero
A lot of it isn't food, but is instead a food-like substance containing some
macronutrients.
------
joubert
Just because something is edible doesn't mean it is food.
~~~
patterned
Especially when it makes you hungry 2 hours later.
------
dhyasama
If unhealthy food is cheaper than healthy food on a per-calorie basis and we
eat too many calories, then why don't we just eat fewer calories in total,
composed of healthy food, for the same amount of money?
~~~
roc
One, because the math doesn't work. You can live on a few dollars a day if you
eat unhealthy food. If you buy healthy food with those same dollars, you're
going to see people going hungry again.
Two, because the choices aren't equally available. At least in Detroit (the
only city I'm qualified to share anecdotes about) it's not as if healthy and
unhealthy food choices are across the aisle, or even across the street from
one another.
You simply can't make healthy choices without special scheduling, trips and
thus even higher cost. Can people do it? Sure. Do they? Absolutely. The local
gardens and the Eastern market are the absolute brightest spots in this city.
The problem is that they simply can't afford (in time, energy or money) to
make that choice nearly often enough.
So, to me, the question isn't "why don't people make better choices?" nearly
as much as "why do we, as a society, make it so much harder to make better
choices?".
There's certainly a willpower and personal responsibility component to this
problem, but the larger issue is why our society has stacked the deck against
it.
~~~
starkfist
Another issue which is often ignored in discussions about food and health is
that healthy food doesn't keep well. By that factor alone it's going to be
more limited and more expensive, because it's harder to produce, transport and
store.
------
frankus
It's funny that there are people who won't put ARCO gas in their car, but will
gladly put AM-PM hot dogs in their stomach.
~~~
jrockway
The government pays to fix your stomach problems, but not your car problems.
~~~
mediaman
I don't think that fully explains it. Government paying for your health care
is not the same as if government were to pay for your car repair. The former
will always carry repercussions for you--diabetes is a very difficult
condition to have, even if the bills are paid by Medicare--whereas there is no
real other pain caused by a car repair, other than financial.
Many of those without health care insurance still eat food that is linked to
detrimental health.
------
bajul
The worst is that cheap food usually tastes great... :)
~~~
imajes
Not really. That's the MSG, which is there to trick you into thinking it tastes good.
~~~
yan
How is being tricked into thinking something tastes good and tasting good
different?
~~~
imajes
It's a shame I've gotten downvoted here, and you got upvoted as much. I guess
hackers like the idea of hacking food too.
Here's the deal, though:
You could conceivably add the right chemicals to a pile of offal cooked in a
skin of some kind and it'd taste like a gourmet sausage. You might have in
fact eaten such a thing. It's also common to add flavorings such as "grandma's
special recipe" or that from a dude in a coat from Kentucky. All these things
make us think that something tastes good, which, as also commented in this
thread, translates in your brain to mean: "hey, this is good for you, have
more!".
But the reality is, what you're eating is almost certainly mechanically
reclaimed meat (where carcasses with so little meat it's almost pointless to
continue cutting are put through a machine which crushes, juices, minces and
turns out a slushy type 'meat' goo). and MRM is the place where salmonella and
other friends live. It's also not really meat: there's plenty of cartilage
(aka 'gristle') and such that is cut so small, you actually ingest it. It
doesn't cook properly, and it's not really good for you to eat- you just pass
it. But if it's carrying salmonella or e-coli... I hope you didn't have
something important to do for the next couple weeks.
Also, the nutrition recovery from such a meal is incredibly limited. You feel
bright and happy for a short period (the msg et al) and then you have a come
down, a craving, and want some more- for that high again. So in fact, what
you're eating is a bunch of mind-altering chemicals on top of a bed of
processed waste.
Food shouldn't be like that.
Food should melt in your mouth and be an explosion of flavor and something you
genuinely feel excited about. It should be, you have a meal, which provides
you with some instant energy, and also some which is "slow-burning". Brown
grains (rice, some wheats, etc) are great for this. You will feel full and
energised for longer periods.
Cheap base foods are cheap because they are often subsidized (e.g.: corn) or
because their most common consumer is the cattle, pigs, chickens etc that you
want to eat. This has given a false sense of price - people expect all food to
be as cheap. So supermarkets etc will look to find ways to make everything
else like it. So you'll see battery-farmed chickens, who are so close together
they often aren't able to stand up; you'll find chickens who are injected with
water to make extra weight. (yeah, that chicken you just bought? probably a
pound of it is water. You just paid chicken price for water).
Good food is good because it provides great nutrition, is often respectful of
the environment and is sustainable.
Everything else is just yet another sign of man's dominance over everything
else.
(BTW, no, i'm not an enviro-hippy, it's just that i refuse to eat crap if i
can help it- why put yourself through that kind of thing when there's so much
better choice out there?)
~~~
chc
This is all irrelevant to your claim that food tasting good is different from
thinking it tastes good. Taste is entirely a mental phenomenon. Maybe food
should melt in your mouth and be an explosion of flavor and what-not — I can't
tell you what food should be. But if something tastes good to you, it tastes
good to you, whether or not it should.
~~~
imajes
OK, so if we want to get technical...
Taste cells on the tongue contain chemoreceptors. Those turn chemical signals
into potentials - i.e. nervous system signals.
Normal food has normal chemicals, whereas manufactured, MSG-type
food has make-it-taste-nice chemicals. Those are two different sets of
chemicals, and so, yes, food which tastes good is different from food you merely
think tastes good - they activate different chemical pathways and trigger different
receptors.
~~~
infinite8s
There is no difference between "normal type chemicals" and "make-it-taste-
nice" chemicals. Salt is NaCl, whether it's found naturally or added to your
food. Same with sugar (although it's true that fructose doesn't occur
naturally in the same proportions as in processed foods), and fats.
------
Tichy
Just wondering, if vegetables were cheaper than corn, would McDonalds invent a
vegetable bun with a small token pancake slipped in with the meat?
~~~
chipsy
They would probably have to find a way to preserve the vegetable buns as well
as the existing bread buns. They're currently loaded with a generous amount of
sugar and miscellaneous preservatives.
------
johnrob
I think we have as much a knowledge problem as a food problem. If cheaper food
has more calories, then for heaven's sake eat less of it! Split a supersized
big mac meal into breakfast/lunch/dinner and you'll hit your daily caloric
target right on the button.
Not saying this is healthy, but there's no reason it should cause obesity if
you're smart about it.
------
neonscribe
For those who are wondering, PEI is Prince Edward Island, a small province in
Eastern Canada.
------
setori88
"We have given up control of our food to people from far away" no most of the
grain grown goes to feeding the animals to be slaughtered - not to third world
countries.
------
c00p3r
The mantra is very simple: Fresh. Organic. Seasonal.
~~~
sipior
Technically, I think that's a dogma, not a mantra. In a way, of course, one of
the greatest triumphs of modern civilisation is that we now have so many folks
who can afford to be picky about where their vegetables come from. I think
that's great! More or less. However, I rather like the notion of getting
certain produce all year-round...there's no talking me out of it, frankly.
~~~
c00p3r
In India or Nepal people are consuming exactly this kind of food, because it
is local, which means cheap. Locally grown vegetables are the cheapest food
possible.
The main source of food is local markets, which operate in the morning and
evening. Most of the goods get sold the same day, and delivery
agents adapt to these cycles. So, each morning it is fresh.
If you go into some local tea shop or kitchen and take a look at which kinds of
dishes they serve and how they were made and from which ingredients, you will
catch the idea.
Of course, most fast foods use sources that are not fresh or organic, simply to
get more profit, but if you can cook yourself you can choose whatever you want
at the nearest market.
~~~
sipior
No, thank you.
When Parallel: Pull, Don't Push - signa11
https://nullprogram.com/blog/2020/04/30/
======
fluffything
> If we’re using concurrency, that means multiple threads of execution may be
> mutating the same cell, which requires synchronization — locks, atomics,
> etc. That much synchronization is the death knell of performance.
You can do this without synchronization by exploiting the fact that all
"lanes" of a SIMD vector execute simultaneously, e.g. by doing vector scatters
"to the left" and then "to the right" those writes will not race with each
other, and don't require locks or atomics.
Whether pull or push is faster heavily depends on the algorithm - there is no
silver bullet here. To write efficient parallel algorithms you need to know
how to do both.
The author only appears to know how to do pull (if that's wrong, they didn't
really do push any justice in the article), so from their POV push just
doesn't work.
The article does show something very important though: pull does give you
reasonable performance and is quite easy to reason about and get right. In
contrast, push is quite hard due to the write-write data-races the author
mentions, so I don't blame the author from arriving at their conclusion that
one should always do pull. It might not always be true, but it is definitely a
safe place to start, particularly if you aren't super familiar with the
algorithm yet and don't know how or if push would work well.
For example, many of the convolutions that neural networks use are actually
implemented using the "push" approach because that's more efficient, and they
are quite similar to what the blog post author is doing.
~~~
sitkack
What does push gain you over pull? Why would someone want to use it? Does it
help with inplace updates vs having to duplicate storage?
~~~
fluffything
The problem it solves is temporal cache locality, and whether pull or push is
better often only depends on what is it that you want to keep in cache for the
next operation.
For example, suppose you are doing a 3x3 convolution over a matrix, and you
want to keep the result of the convolution in cache because the next operation
consumes it.
So your cache is pretty much filled with the output matrix, and you only have
space in cache to pull in one row from global memory for reading.
Now consider what happens if you process your input matrix row-by-row, first
with the pull approach and then with the push approach.
With the pull approach you need to load 3 rows from memory into cache before
you can produce an output row. No matter how you do this, as you advance
through the input, you
are going to be loading some rows twice. For example, suppose that from the
3x3 window, we load the rows in this order: top, bottom, mid (and then push
the convolution to the mid row of the output). In one iteration we load rows
0, 2 (evicts row 0) and 1 (evicts row 2). The next iteration loads 1 (hit) and
3 (evicts 1), and then 2 (reloads 2!). That's bad!
With the push approach, you load one row, and push partial contributions to
the 3 rows of the output matrix. Then you load the next row, and do the same.
The output matrix is always in cache, so those writes always hit, and you
never need to load a row from memory twice.
So in a sense, pull optimizes for bringing the rows of the output into cache
once, which is good if you want to continue working with the input afterwards,
while push optimizes for keeping the output in cache, which is good if you
want to continue working with the output afterwards.
One isn't really better than the other. Which one to choose depends on what is
it that you want to do. Maybe you have one input matrix, and multiple
convolution windows, and want to compute multiple outputs that you don't need
right away. In that case, you want to keep the input matrix in cache, and
compute the rows of the output one by one, evicting them from the cache after
each computation.
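A loop-level sketch of the two access orders (NumPy, written for clarity
rather than speed, ignoring SIMD and threads; the input inp and 3x3 kernel k
are placeholders):

    import numpy as np

    def conv3x3_pull(inp, k):
        # Pull: each output cell gathers the 3x3 input window it needs.
        H, W = inp.shape
        out = np.zeros((H - 2, W - 2))
        for i in range(H - 2):              # one output row at a time
            for j in range(W - 2):
                out[i, j] = np.sum(inp[i:i+3, j:j+3] * k)
        return out

    def conv3x3_push(inp, k):
        # Push: each input row scatters partial sums into the (up to) 3
        # output rows it contributes to, so each input row is read once.
        H, W = inp.shape
        out = np.zeros((H - 2, W - 2))
        for r in range(H):                  # one input row at a time
            for di in range(3):
                i = r - di                  # output row receiving this part
                if 0 <= i < H - 2:
                    for j in range(W - 2):
                        out[i, j] += np.dot(inp[r, j:j+3], k[di])
        return out

Both compute the same output; the difference is only which array's rows stay
hot in cache while you sweep over the other.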
~~~
grogers
If you can't keep 3 rows of the input matrix in cache, why would you be able
to keep 3 rows of the output matrix in cache? If the matrices are large enough
that they don't fit in cache, shouldn't they be processed in squareish blocks
recursively to maximize cache reuse (not in rows)?
~~~
fluffything
> If you can't keep 3 rows of the input matrix in cache, why would you be able
> to keep 3 rows of the output matrix in cache? If the matrices are large
> enough that they don't fit in cache, shouldn't they be processed in
> squareish blocks recursively to maximize cache reuse (not in rows)?
Your cache size is fixed by the computer and you can't change it. If you are
doing a convolution, you want "(N+1)xN = cache size" to determine the size of
the "blocks" (NxN matrix, and an 1xN row) of the convolution you process.
Those blocks are also matrices, so by decomposing your matrix into smaller
matrices you don't avoid this problem.
So sure, you could also decompose your problem by doing "(N+3)xN = cache-
size", but that means that the size of the "square blocks" that you can now
properly fit in cache is smaller, and you need to decompose your original problem
into more sub-problems to get it done. You could also do: "2Nx2N = cache
size", and just load a whole input block and an output block into cache.
In both cases, you are bringing memory into cache that you only touch once,
yet continue occupying cache space after you don't need them anymore. You
could have used that space to process larger matrices instead, and have to
decompose your problem into fewer blocks.
------
saagarjha
A surprisingly simple way to be able to remove synchronization in certain
cases. It’s mentioned halfway through, but it’s still worth noting: you can
only use this if your operations are commutative, or you don’t mind the lack
of ordering. So it’s good for things like cellular automata when you have
double buffering but less good when you’re touching the input directly.
~~~
pvillano
an important note
------
chrisbennet
Shaders are a great place to use "pull".
Take an image that is 1024x1024 that you want to display as a 256x256 image.
The simple way to do this is to visit every pixel in the source and write
it to the display buffer [x'=x/4, y'=y/4]. 15 out of 16 of those writes will just
get written over in the display buffer.
A more efficient way is to step through display-buffer space and calculate where
that pixel came from (reverse transform) [x'=4x, y'=4y].
~~~
heavenlyblue
No, they’re equivalent. Nobody stops you from from stepping through pixels in
multiples of 4 and then writing that to the display buffer.
~~~
ajuc
You can write code both ways but it's much easier to ensure every destination
pixel is handled exactly once when you iterate over the destination pixels,
than if you do it iterating over the source pixels. It becomes especially
visible when you add perspective distortion and other complications.
Which is why every graphics API out there does it pull-style.
~~~
heavenlyblue
Drawing triangles on a screen is a push-style operation, isn’t it?
~~~
ajuc
Yes, from the POV of lists of polygons. But from the point of view of textures it
is pull (you go through destination pixels and fill them with source pixels
fetched through transformation).
~~~
heavenlyblue
In the same way you could apply the reverse transform to the polygon and get
the coordinates of display pixels in texture coordinates and then do a push
from there.
They are equivalent, as I am trying to say. The only difference is semantics
or whatever makes it easier to think about.
~~~
ajuc
You would have to skip a variable number of pixels in the source texture according
to the transformation, then calculate the reverse transformation to know where
to push.
Notice how you not only change x * 4 into x/4, you also change x++ into x+=4
for (int x=0; x<width; x++)
for (int y=0; y<height; y++)
destination[x][y] = source[x*4][y*4];
vs
for (int x=0; x<width; x+=4)
for (int y=0; y<height; y+=4)
destination[x/4][y/4] = source[x][y];
If it's as simple as scaling: x+=4 isn't much harder than x++. But for complex
transformations the calculation of how many pixels to skip in any particular row
will be slow and complicated. Meanwhile when doing pull you don't need to do
it.
This is the reason every graphical API I know of does these transformations
by iterating over destination pixels.
------
slashvar2701
That's a somewhat misleading wording for a classic rule of parallel algorithms:
a thread is the owner of its computations.
So, each actor owns a group of cells and is the only one able to modify them.
This highly simplifies most algorithms and removes a lot of synchronization
issues ...
This is also why in parallel algorithms you should split the outputs rather
than the inputs.
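A small illustration of that ownership rule (a Python sketch only, using
Conway's Game of Life as the example rule; in CPython the threads mostly
demonstrate the pattern rather than a real speedup because of the GIL):

    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def step_rows(src, dst, r0, r1):
        # This worker owns output rows [r0, r1): it only reads the shared
        # source grid and only writes rows no other worker writes.
        H, W = src.shape
        for i in range(r0, r1):
            for j in range(W):
                live = sum(src[(i + di) % H, (j + dj) % W]
                           for di in (-1, 0, 1) for dj in (-1, 0, 1)
                           if (di, dj) != (0, 0))
                dst[i, j] = 1 if live == 3 or (src[i, j] and live == 2) else 0

    def step(src, workers=4):
        dst = np.empty_like(src)            # double buffer: input never mutated
        bounds = np.linspace(0, src.shape[0], workers + 1, dtype=int)
        with ThreadPoolExecutor(workers) as ex:
            futs = [ex.submit(step_rows, src, dst, r0, r1)
                    for r0, r1 in zip(bounds[:-1], bounds[1:])]
            for f in futs:
                f.result()                  # propagate any worker errors
        return dst

Because the outputs are split and each worker pulls whatever inputs it needs,
no locks or atomics are required.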
~~~
cogman10
Often, I find that "if you are synchronizing, you are wrong". Doesn't mean you
can't or shouldn't use synchronization in cases where message passing is
impractical. However, it does mean that it should be a last choice rather than
a first choice. It is simply far too easy to get locks wrong.
------
jillesvangurp
It's also an argument against using queues in some cases. Instead of
publishing data on a queue, you can make consumers of the data pull the data.
Especially if you have a 1-to-many relationship between the producer and consumers
of the data, that moves the problem of ensuring data is received out of the
infrastructure and onto the consumer side. Staying up to date is now their problem
and they can ensure they are up to date by simply pulling regularly.
A good example would be the twitter firehose and consuming that. Other
examples would be websites publishing RSS feeds. You can actually combine the
two and push the information that "something changed" which would cause the
receiver to pull. Caching and CDNs make publishing like this cheap and
scalable.
~~~
kqr
Well, what do the consumers pull _from_ , if not a queue? What is an RSS feed
if not a FIFO queue of published pages?
Queues inherently lead to a pull-based module. The producer does not publish
stuff directly to the consumer, but to the queue. The consumers then pull
stuff at their own pace/terms from this queue.
~~~
jillesvangurp
A REST API. An RSS feed is a simple HTTP GET that returns a document. You can
cache it, put it in a CDN. There are no queues involved here.
~~~
elteto
Can you expand on this? So each worker gets its own REST endpoint to pull
from?
Because if all workers pull from a single endpoint then that endpoint is a
queue, no?
------
w0utert
This is pretty much the analogue of gather vs scatter in GPGPU programming.
It's a well known fact that in GPU programming, in almost any case a gather
approach (threads map to outputs) works better than a scatter approach
(threads map to inputs). The challenge is to transform the algorithm to still
have some data locality for the reads to allow for caching, coalesced reads
into local memory etc, which can be very hard or infeasible.
------
jedberg
Sysadmins have been using this principle for decades.
When running a fleet of machines, it was always better to have them pull their
config rather than push to them. It was far more reliable because you didn’t
have to worry about whether a machine missed a push.
It would just grab its config whenever it came back online.
~~~
jabl
Well yes, but like another comment here on HN points out, it's not universal,
and there are trade-offs, and different systems have chosen differently. E.g.
Monitoring: Mostly push, except prometheus which is pull.
Config management: Mostly pull, except ansible which is push (yes, there is
ansible-pull, but most ansible code out there assumes push).
~~~
eeZah7Ux
In both monitoring and config management, the majority is right.
Monitoring: you don't want to have all your applications and even scripts run
a thread to listen on a dedicated port to allow pulling. It's bad for
security, code complexity, configuration complexity (all those ports) and
reliability (when an application stops the datapoints are lost).
Config management: with a push model you can easily end up having 2 developers
push to the same system (or to systems that should be updated in sequence) in
an unmanaged manner.
~~~
longcommonname
Push-based monitoring has its own flaws. Mainly, service discovery is coupled
with service health.
Pull-based monitoring makes it so you can remove the service and independently
recognize failures.
~~~
eeZah7Ux
You can do the same easily with push.
------
lmeyerov
EDIT: Totally forgot we are giving a webinar tutorial on this on Friday as
part of the Security (Analytics) Jupyterthon. See the Python GPU one-liners
talk @
[https://infosecjupyterthon.com/agenda.html](https://infosecjupyterthon.com/agenda.html)
!
The hardware optimizations desired here are vector data parallelism, both in bulk data
movement and execution. When you realize that those are the instructions the
device is running, yet are using languages or libraries that hide that, you
end up easily going against the grain.
In contrast, I've been doing a lot of gpu dataframe stuff for the last 6
years, which is basically a specialized and embeddable form of functional
nested data parallel libs from the early 90's . (See NESL and later Haskell
generalizations. In the top 5 for most beautiful papers I've read!)
Writing code this way basically comes down to writing closed form / point-free
solutions as pure maps & scans (~= materialized views) that in turn have
easily automatically optimized multicore + SIMD / GPU implementations
underneath. Put differently again, columnar analytics where you try to only do
pure functions over columns (tbl.col_x + tbl.col_y; tbl.col_z.max()) yet get
all your rich functional language features.
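Concretely, in pandas-style syntax (which cudf largely mirrors; the table and
column names here are made up):

    import pandas as pd   # swap in cudf for the GPU version

    tbl = pd.DataFrame({"col_x": [1.0, 2.0, 3.0],
                        "col_y": [10.0, 20.0, 30.0],
                        "col_z": [5.0, 1.0, 7.0]})

    # Pure, whole-column operations -- no row loops, no in-place mutation.
    total = tbl["col_x"] + tbl["col_y"]   # elementwise map over two columns
    peak = tbl["col_z"].max()             # reduction over one column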
Another poster correctly said to focus on vector gather over vector scatter.
That is the assembly code primitive starting point for this journey. NESL's
nested functional data abstractions were invented precisely for automatically
compiling into this kind of extreme vectorization -- and before GPUs were even
a thing! By sticking to libraries built entirely for that, it's easy to write
data parallel code that can run on say 8 GPUs, with way less brain damage, and
lots of automatic invisible optimizations missed here.
The fun part to me is that parallelizing this kind of cellular automata
simulation maps to a class of parallel compute called 'grid codes' or 'stencil
codes'. There are DSLs with specializing compilers precisely for all sorts of
weird tile shapes - imagine writing a weather simulator. Think autotuning,
synthesis, etc. It's a dangerously niche space. I suspect GPU dataframe libs
can help put them to rest, would be cool to see what extensions it'd take!
~~~
fluffything
Which languages are you using?
In my experience, the main problem with expressing your algorithms in that way
is "What do you do when the primitives you are suing (e.g. maps and scans) do
not produce optimal code?".
For example, suppose that you are doing two scans with two maps, whose
efficient implementation does a single memory traversal with the two scans and
maps fused into a single kernel. If whatever abstraction you are using isn't
able to do this type of kernel fusion, then you easily end up using 4x more
memory bandwidth, which might translate into 4x slower performance.
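As a toy illustration of what that fusion means (a sketch, not how any
particular runtime actually does it):

    import numpy as np

    def unfused(a):
        b = np.cumsum(a * 2.0)   # map (*2), then scan: separate passes over memory
        c = np.cumsum(b + 1.0)   # map (+1), then scan: two more passes
        return c

    def fused(a):
        # Same result in a single traversal: both maps and both scans
        # are applied per element as we stream through the array once.
        out = np.empty(len(a))
        acc1 = acc2 = 0.0
        for i, x in enumerate(a):
            acc1 += 2.0 * x
            acc2 += acc1 + 1.0
            out[i] = acc2
        return out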
I don't know of any language or optimizing runtime that's able to do these
transformations and match the performance of the CUDA code one would have
written by hand.
~~~
lmeyerov
I agree in the small and disagree in the large.
Something like single-gpu cudf (think pandas on GPUs) will have that issue
because it is eager/concrete. When going to multi-GPU with dask-cudf or say
BlazingSQL, the deferred symbolic compute means they can add in fusion,
deforestation, etc. That is exactly what happened with Spark's CPU-era
compiler over the years.
In the large.. we have a giant code base for GPU visual graph analytics. We
hand write maybe 1% of the frontend and 1% of the backend. The libs underneath
likewise have handwritten code where it matters, but are also built on the
abstraction stack. So yes, we absolutely handwrite... in the tiny minority of
cases :)
In practice, I care less about the 4x thing here -- rather get more of our
code vectorized and more devs productive on it. In the small, what ends up
mattering (for us) is stuff like making sure they do not do excessive copying,
which is more of an artifact of trying to match Python Pandas's API design
bug-for-bug. Likewise, stuff like testing for performance regressions across
upgrades. We have been building a style guide and tools to ensure this, and
are able to use practices similar to any big non-GPU software project (lint,
CI, etc).
In contrast, a codebase on handwritten CUDA would be incredibly small and
quickly out-dated. We have enough stress in our lives as is :)
~~~
fluffything
That makes perfect sense, and it appears that you always have the option to
handroll something if you discover a bottleneck worth fixing.
------
qubex
This reminds me of the oddball comment I read somewhere to the effect that “
_come-from_ , despite all the humour, turns out to be a very sensible
parallel-safe alternative to _go-to_ ”. Can’t remember the exact wording nor
the source, though (unfortunately, because I’ve taken it to heart as very deep
architectural wisdom).
~~~
dehrmann
If it helps, I've heard "return-from" used to describe Aspect-oriented
programming.
------
defenestration
This seems counterintuitive as it is the opposite of the programming ‘rule’:
Tell, don’t ask.
~~~
ajuc
Yes. And it's one of the most important reasons OOP is so bad at
multithreading.
You want "pretty" OOP? Tell Don't Ask.
You want performance and reliable multithreading? Command-Query Separation.
Which is basically the same as functional programming :) Make all the
functions pure except for a few clearly marked ones with side effects.
It's especially visible in gamedev. After a decade of trying to make OOP work
with multithreading people moved on to Entity Component Systems (which is
basically imperative programming with some relational and functional aspects).
It's also why functional programming is getting more popular - because it is
pull by default. It's also why almost all graphical shaders are functional
even if they look like C :) Many inputs one output and no side effects :)
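A minimal sketch of that split (Python for brevity, made-up component data, not any particular engine's API):

    # Query: pure function over component arrays, safe to run per-chunk
    # on as many threads as you like.
    def integrate(positions, velocities, dt):
        return [(x + vx * dt, y + vy * dt)
                for (x, y), (vx, vy) in zip(positions, velocities)]

    # Command: the one clearly marked place where state actually changes.
    def apply_positions(world, new_positions):
        world["positions"] = new_positions

    world = {"positions": [(0.0, 0.0), (1.0, 2.0)],
             "velocities": [(1.0, 0.0), (0.0, -1.0)]}
    apply_positions(world, integrate(world["positions"], world["velocities"], 0.016))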
~~~
MaxBarraclough
> graphical shaders are functional even if they look like C :) Many inputs one
> output and no side effects :)
That's not functional programming, it's just pure functions. Shaders make use
of sequential execution of statements, and they don't support recursion.
~~~
alecmg
you are splitting hairs on definitions.
It is the pure-function nature of functional programming that makes it a good fit for concurrent and parallel tasks. Recursion is neither here nor there.
~~~
MaxBarraclough
It's not hair-splitting. As I understand it, functional shader languages are
an active area of research. Mainstream shader languages such as HLSL are most
certainly not functional.
~~~
ajuc
Languages are functional if they support higher order functions, recursion,
closures, etc.
Code is functional if it separates side-effects to small, clearly marked parts
of the codebase.
------
ball_of_lint
The author claims that their code is optimized, yet it still took 10 days to run. Unfortunately, they missed a key observation that would have cut their runtime to roughly a quarter, or less.
When stabilizing a sandpile that begins in a symmetric configuration, it can never enter a non-symmetric configuration. Therefore instead of computing the whole 10000x10000 pile, you can compute just one 5000x5000 quarter of it, and add a special case for the internal edges. Cells on those internal edges should not subtract 4 when toppling, but instead subtract 3 (or 2 for the corner cell), because they will always trade grains with their mirror image.
Although it's a bit more complex, you can also improve upon that by tracking
not a quarter of the grid, but instead 1/8th by using a triangle. When
tracking a triangle the cells on the main diagonal of the square are not
mirrored, but instead their neighbors are. This may not work quite as well
with their parallelism setup though.
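Roughly, one vectorized toppling sweep over the quarter would look like this (untested numpy sketch with the mirror axes along the quarter's last row and column; grains falling off the outer edges are simply dropped; this just illustrates the mirror-edge rule, not the author's actual code):

    import numpy as np

    def topple_once(q):
        # q holds one quarter of the symmetric pile; its last row and
        # last column sit against the two mirror axes.
        spill = (q >= 4).astype(q.dtype)
        q = q - 4 * spill
        # each toppling cell gives one grain to each in-quarter neighbour
        q[1:, :] += spill[:-1, :]
        q[:-1, :] += spill[1:, :]
        q[:, 1:] += spill[:, :-1]
        q[:, :-1] += spill[:, 1:]
        # cells on the mirror edges get one grain back from their mirror
        # image, so their net loss is 3 (2 for the shared corner cell)
        q[-1, :] += spill[-1, :]
        q[:, -1] += spill[:, -1]
        return q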
~~~
allover
Very interesting optimisations, but I would not say the author "missed" these
points, since this wasn't the point of their post.
------
loeg
This is kind of silly. The author has reduced the number of mutations per
entry by a factor of up to 4. That's why contention is reduced (less M(O)ESI
eviction of shared lines). It happens to be that this problem is amenable to
that kind of computation but it isn't always the case.
------
polskibus
Seems a bit similar to the volcano model in databases, where iterators pull the next tuple.
------
lmilcin
It is much easier to build a reliable distributed system when each component consumes (pulls) its input and then publishes results.
This is mainly because the consumer knows when it is ready to consume.
------
code-faster
Designs based on pulling are also faster to code than their push equivalents,
especially when parallel or in a client server model.
A map of America’s migrations using genetic data from 770K saliva samples - freedomben
https://blogs.ancestry.com/cm/what-770000-tubes-of-saliva-reveal-about-america/
======
notspanishflu
That map makes no sense. Hawaii ruled by Caribbeans? California is mostly
Portuguese? No sign of Spaniards? I call it BS.
This is a really poor piece of PR for Ancestry.
------
mirimir
And people worry about Google having too much of their data?
[https://www.wired.com/2015/10/familial-dna-evidence-turns-
in...](https://www.wired.com/2015/10/familial-dna-evidence-turns-innocent-
people-into-crime-suspects/)
[https://www.eff.org/deeplinks/2015/05/how-private-dna-
data-l...](https://www.eff.org/deeplinks/2015/05/how-private-dna-data-led-
idaho-cops-wild-goose-chase-and-linked-innocent-man-20)
~~~
bigbugbag
And rightly so; on top of everything else, Google has a copy of the whole human genome and has been in this DNA business for almost 10 years.
23andme, Google Health (discontinued), Calico, Google Genomics, Baseline Study
to name a few out of the long list of why google is scary.
~~~
mirimir
Damn. It's worse.
It was disconcerting to learn that the Sorenson DNA data was public. Just
"anonymized".
~~~
r0muald
Dumb question: what are the possibilities for obtaining the same service
offered by these data-hungry companies without giving up genetic privacy?
------
bigbugbag
That the United States is a scary place and that you can't trust ancestry.com and ancestryDNA. They will take your DNA samples and put them in a big database to be crunched and analyzed and used for their own marketing. They may be used by a current or future government for whatever nefarious purpose, or hacked and stolen or sold for profit.
~~~
saiya-jin
The East German secret service, the Stasi, used this technique on up to 5 million of their own citizens, in the form of scent samples of clothing sealed in jars so that trained hounds could track you.
~~~
oblio
(rant)
I'm not saying you're wrong and I agree that these companies will use the info
for their personal gain.
But I do have a question for anyone watching: why are Americans so paranoid about falling under an authoritarian regime? Europeans aren't as afraid of it, and every European country has at one point or another been ruled by a dictator. The US never has been, yet the vast majority of comments by Americans seem to indicate a deep distrust of their own government, to the point where half the country would almost get rid of it (Republicans).
There's a very thick line between a slightly abusive and malfunctioning government and a full-blown dictatorship. And that line is not as easy to cross as it may seem if the country's citizens don't want it. Heck, my country, Romania, which had a very fragile democracy over the years, only succumbed to dictatorship under massive external pressure - including military pressure - from the Nazis and the Communists.
If little ol' Romania, with a very uneducated populace yet one willing to fight for its democracy, sort of held on for 100 years out of the 160 since its formation, surely the US would be more resilient, without half its citizens becoming paranoid?
(end-of-rant)
~~~
marcoperaza
Maybe that vigilance is why the American Republic has been uninterrupted for
over 200 years while freedom and democracy in continental Europe only still
exist because the US won and imposed it.
~~~
scrollaway
> _freedom and democracy in continental Europe only still exist because the US
> won and imposed it._
This is a ridiculous assertion. Back it up with some facts and sources.
~~~
marcoperaza
The Second World War and the Cold War. American military involvement in Europe is the only reason why the entire continent wasn't permanently controlled by either the Nazis or the Soviets.
------
liquidise
Clicking the infographics brings me to a signup page? I'm not one to complain
about marketing techniques but jeez is that a poor experience.
~~~
jay-saint
Open the actual image in a new tab; it is much larger and actually readable.
[https://blogs.ancestry.com/cm/files/2017/02/MapMigration_c.j...](https://blogs.ancestry.com/cm/files/2017/02/MapMigration_c.jpg)
------
jaimex2
So many tin foil hats. Just don't give them your real details when signing up
and sending the sample?
I did mine because it was a cheap way to get my DNA in a file. You can then
run it against [https://promethease.com/](https://promethease.com/) or
[http://dna.land](http://dna.land) or whatever other tool you like.
~~~
marchenko
That might work as long as none of your relatives use the service and submit
their real details.
~~~
jaimex2
Yeah, there is no defence against that as far as I know.
Like, even if you never do it yourself, a relative will still get a close enough match to put you in the picture.
------
Jaruzel
Direct link to the full-size map for the lazy right-clickers:
[https://blogs.ancestry.com/cm/files/2017/02/MapMigration_c.j...](https://blogs.ancestry.com/cm/files/2017/02/MapMigration_c.jpg)
------
hijp
Why is Hawaii labeled Caribbeans?
------
sl1e
They still have no clue about the northern Alaska Natives.
Third World Quarterly Publishes “The Case for Colonialism” Leading to Censorship - kushti
https://legalinsurrection.com/2017/09/third-world-quarterly-publishes-the-case-for-colonialism-leading-to-censorship-demands/amp/
======
dEnigma
_" This is an issue of student safety and having people at the institution who
hold views like this does not create a safe campus for everyone."
"This is especially appalling when the author elsewhere in the article takes
the words of multiple decolonial scholars of colour out of context in order to
justify his violence against their respective communities and cultures."_
I haven't read the whole viewpoint article yet, but I'm pretty sure it doesn't
call for violence or threatens campus safety.
------
dEnigma
Website doesn't load on my end, unless I remove the "/amp/" from the URL
edit: Seems to be a problem with my adblocker (uBlock)
Ask YC: What are the best computer science universities in Canada? - buss
I'm considering a move to Canada sometime in the next year or two, but I plan on going to grad school (I'm on track to finish my BS in CS in the next two years). If I do decide to make the move, what are the best computer science schools, preferably with significant research in security? I like the Vancouver area, but I don't think the University of British Columbia would be a good pick for CS. Opinions?<p>Also, it would be nice, but not necessary, for there to be some amount of startup activity in the area.
======
neilk
In Canada, Waterloo is head and shoulders above the rest for almost anything.
Ian Goldberg teaches there now, so there's at least one world-class security
researcher. I believe there are plenty of startups in the area, but I don't
have first-hand knowledge. I can tell you that a Waterloo degree is looked on
very favorably at Microsoft, Google, and Yahoo.
In Vancouver, your picks are UBC or Simon Fraser University. I don't know
enough about their grad programs to say anything useful, but I know SFU has a
very good undergrad degree, and they really mentor their students to be
entrepreneurs. There is a startup scene in Vancouver but it's a bit
lightweight, with a few notable exceptions.
~~~
neilk
Cryptography, Security and Privacy research group at Waterloo:
<http://crysp.uwaterloo.ca/>
~~~
buss
I had no idea that Waterloo was so into security research. I'm pleasantly
surprised to see that some of the people in the department help develop OTR,
which I've been using for just over a year (it was especially comforting while
I was studying in China). As it stands now, Waterloo is at the top of my list
of places I'll apply for grad school. Thanks for the great recommendation!
~~~
hello_moto
Keep in mind that a lot of UBC graduate students actually come from Waterloo. While Waterloo (and maybe SFU) have a good/strong CS department, they're mainly strong at the undergraduate level.
------
ozzieg
University of Waterloo has always ranked at the top of all CS programs, and is
the most sought after school with respect to university recruiting. Security
is thoroughly covered in the combinatorics department of the math faculty.
------
cperciva
Can you be a bit more specific than "security"? Are you interested in
cryptography? Protocol analysis? Formal security proofs? Automatic analysis of
source code to detect bugs (think Coverity)? Also, from a general point of
view, do you consider Computer Science to be a branch of mathematics, or a
field of engineering?
If you like Vancouver, I'd certainly recommend considering SFU -- as a whole,
SFU is smaller than UBC, but the CS departments are almost exactly the same
size.
~~~
buss
By security, I mean cryptography, intrusion detection systems, firewalls, and
web-based attacks (like SQL injection, XSS, and the like). I consider computer
security to be equal parts mathematics and engineering (even the best
cryptosystems can be engineered poorly and introduce weak points).
I'm taking a course (in the mathematics department) on elliptic curve
cryptography, and I am a new member on UF's infosec team. I'll be competing in
this year's iCTF, so that should help give you an idea of what I mean by
security and what I'm interested in researching.
------
hello_moto
The head of the CS department has a PhD from MIT with a strong math background and is a fellow AT&T researcher specializing in security.
[http://www.publicaffairs.ubc.ca/ubcreports/2005/05mar03/defe...](http://www.publicaffairs.ubc.ca/ubcreports/2005/05mar03/defence.html)
UBC has won the ACM Pacific Northwest Programming Contest for 4 years in a row, beating Stanford, Berkeley and the rest. The other 2 UBC teams usually place within the top 10 (3rd, 4th place).
------
mmp
I'm a CS grad student at the Universite de Montreal. It's probably not what
you're looking for (what's security?), since it's francophone and concentrates
mainly on operations research, but if there's any interest feel free to email
me.
If you do come to Montreal, check <http://montrealtechwatch.com/> for local
startup activity. In any case, it's a great city to live in.
------
stevenr
This guy recently broke down the schools which he thought offered good
introductions to best pratices:
[http://blog.chapmanconsulting.ca/2007/09/27/Reviewing+Canadi...](http://blog.chapmanconsulting.ca/2007/09/27/Reviewing+Canadian+CompSci+Schools+Whos+Teaching+Best+Practices+Part+II.aspx)
------
theorique
Waterloo is probably a good bet. Startup and entrepreneurship history (Maple,
RIM/Blackberry).
Strong in math and applied math also. <http://www.cacr.math.uwaterloo.ca/>
<http://crysp.uwaterloo.ca/>
------
paulgb
Waterloo has a pretty good reputation for CS. The area has an active and
friendly start-up community as well.
------
amichail
Univ of Toronto is the top one overall for grad school though I don't know how
much security research is done there.
------
aarontait
UOIT has a great CS and Software Engineering program. They are also the only
school in Canada to offer a master's in information technology security. UOIT is also Ontario's newest and most wired university. <http://uoit.ca>
------
aarontait
CS or Software Engineering at Ontario's newest and most high tech university
is, in my opinion, beyond what Waterloo can deliver. Check out UOIT
<http://uoit.ca/>
~~~
buss
Is there a list of the current research? I can't find anything.
Startup Ideas - July 2011 - tansey
http://wesleytansey.com/startup-ideas-july-2011/
======
avdempsey
"Does this fit me" has been a dream of mine for awhile. Imagine a world where
everybody's clothes fit. Where clothing can be spontaneously gifted, and
you'll be sure it fits like a glove.
I think he sells himself short here. Women aren't the only people that have to
hunt for a fit. I'm a tall, lanky guy. Most medium shirts are too short, and
my shoulders swim in a large. Like any good scientist, I've trialled and erred
my way to brands that fit (thank you American Apparel medium 50/50s), but this
discovery process is a pain in the ass. Here's the upside: browse through clothes online and everything fits; see how it looks on someone with your shape.
There are some big challenges of course. A brand's sizing will vary from
design to design. I bought a pair of Nike 11s that fit, after trying another
Nike 11 that didn't. Then, how consistent is the sizing after manufacturing,
after washing?
Sigh, somebody start this, and I'd love to help.
~~~
rw
Another one is Clothes Horse. They're trying to solve this problem using
machine learning: <http://www.clotheshor.se/>
~~~
dwhittemore
I love this idea - which is why I co-founded Clothes Horse ;) thanks rw and
gsiener for the comments! Hopefully you'll see us everywhere soon. Hit me up
if you wanna get involved - @dwhittemore on twitter
~~~
paliopolis
was trying to sign up using the form on the website, but can't seem to submit the form :-(
------
naeem
I'm worried that a lot of these ideas are taking Paul Graham's adage of solving simple problems a little bit too much to heart. The amount of effort that would be expended on these ideas - and various other start-up ideas nowadays - generally appears to me to be more than it's worth.
I suppose my point is that entrepreneurs are spending too much time solving
mundane, everyday problems which aren't that bad as opposed to looking at the
big picture. I mean, don't get me wrong, I recognize that not everyone is
going to build the next facebook... but if we don't strive for that, doesn't
that eliminate the risk - and ultimately, the charm - of entrepreneurship? Or
am I just bitter?
I'm starting to think it might just be the latter.
~~~
tansey
_> not everyone is going to build the next facebook... but if we don't strive
for that, doesn't that eliminate the risk - and ultimately, the charm - of
entrepreneurship? Or am I just bitter?_
First, I really wouldn't say Facebook was a big idea when it started. It was
about getting Harvard students laid, with some long-shot plans about
connecting the world together gradually over time. I could come up with some
half-baked paths that a few of my ideas may follow to become a huge company,
but of course that is more about luck down the road. I like to focus on ideas
that can solve a problem right now and make money.
Second, risk is not a binary concept. You have an efficient frontier of
risk/reward points and you simply choose your spot along that line. If you
want to start the next Facebook, you might be looking at 1,000 competitors. If
you want to build a Settlers of Catan clone, it's more like 3. There's still
risk, but it's greatly reduced, and with that reduction comes a reduction of
potential return. Feel free to swing for the fences-- but every team needs
guys who can get singles and RBIs, not just home runs.
Happy to hear some examples of ideas that you think have sufficient potential
and charm. :)
~~~
naeem
Good points. I guess I'm not as concerned about entrepreneurs pursuing ideas
which are inherently doomed, as much as entrepreneurs grasping at straws.
I'd like to think the idea I'm working on now has sufficient potential and
charm! :-)
------
dbingham
With regard to the source control for recipes, I've been trying to get a
StackExchange/Reddit sort of thing for recipes that includes some wiki editing
off the ground for a year and a half. Haven't had much success, but working on
version 3 in my spare time now. Maybe with a nicer design (a real logo) and a
core written in Zend as opposed to Cake things will pick up speed more. Here's
the current version: <http://www.fridgetofood.com>
~~~
tansey
Cool idea and nice looking site!
I think the reason why Github is a better model than reddit is because of the
developer-centric model of Github. I use Github because it's how I manage my
own code-- it's first and foremost about me. The social aspect of it is what
makes it go viral when a cool library is pushed, but that's a vitamin, not a
painkiller.
With reddit, it's more of a nail-biting, "gosh I hope people like this" sort
of model. Its focus is on the community and a link is a temporary (usually < 1
day) talking point for people.
I think chefs treasure their recipes and carry them around for life. They know
what they like and they're happy to experiment/learn from others, but it's
mostly about mastering their own skills and creating their own recipes.
~~~
aero142
For Reddit, a person can evaluate the quality of an article very quickly, and simple democracy pushes it to the top. Also, once I have read the article, I
usually don't care to ever see it again. With recipes, it takes longer to
evaluate, so the credibility of the person recommending it has more weight.
Also, when I find a recipe I like, I want to keep it for life. "Browse
Emeril's recipes" sounds like a site I want to see. (This may be exactly what
you were saying, not sure)
~~~
dbingham
Well, Fridge to Food attaches a StackExchange style reputation to people (that
carries more weight than reddit karma) and does have a page where you can see
user's recipes (similar to SO). Really it pulls more inspiration from
StackExchange than it does from Reddit.
------
rglover
Love the idea of github for recipes. I think it'd be great to share and build
recipes with others. Actually, the github/social version idea could work for a
lot of stuff in theory (e.g. music, design, etc).
------
abhiyerra
The crowdsourced editing is already done. <http://www.cloudcrowd.com/>
------
jamesgagan
I've got a pickup lines site (<http://worldsbestpickuplines.info>) which would
be easy to adapt or clone to the "Pickup Stories" idea - I think I'll do it.
Thanks for the idea!
~~~
tansey
Awesome! Drop me a line when it's up and running!
~~~
jamesgagan
done! slightly modified from the original idea: <http://hookupstories.net/>
now all it needs is some content! Any HN'ers got some good hookup/pickup stories???
------
eli
> Rewarding Jobs: A jobs site that only posts jobs with a more altruistic
> slant...
I think <http://www.idealist.org/> already fills this niche
------
EwanG
Several of these appear to already exist (such as the crowdsourced editing).
Then there is your webcam deal that hits a personal pain point - anywhere
sufficiently exotic will be Mega expensive to get a signal out of. Heck I'd
pay for 50 webcams to be installed throughout Rocky Mountain National Park
myself if it weren't that only a satellite uplink would work for most
locations and a couple hundred a day adds up to real money quick :-)
~~~
tansey
Sure, it would be ideal to have a webcam mounted right on top of Pike's peak,
but I think that's not required. I'm sure somewhere like the Broadmoor hotel
in Colorado Springs has a sufficiently awesome view of the Rocky Mountains,
though. :)
~~~
EwanG
Perhaps. But if I'm more interested in checking our Bear Lake, or the summit
view from Longs Peak (rather than the view of Longs Peak from 30 miles away),
or some of the waterfalls along the Glacier trail, then there isn't anyplace
where I can get that. And if a service ever comes along that DOES offer that I
will be happy to give them a fist full of money because while I can't afford
to work in or near the park, I'd love to be able to take a virtual hike when
I'm stressed out.
~~~
swalkergibson
If you fund my next trip to the Diamond, I will gladly place that webcam for
you :P
~~~
EwanG
But how will I get the pictures back from that webcam on demand and for a
reasonable price? I wouldn't mind paying for both of us to go up there if that
was going to be the lion's share of the cost. But since the trip is maybe 1%
of the cost when you have to use a satellite uplink...
------
johnx123-up
A friend of mine posts ideas at <http://fundprojectswith.me/> Their ideas are easily converted into projects for small amounts of money (500-1000 dollars).
I'm going to post these ideas there now :)
------
aorshan
I'm a huge fan of the Video Creation Contest idea. I think 99 designs is
awesome and I think having a good landing video is just as important, if not
more important (at least in the beginning), as having a good logo.
------
nl
Can someone explain the "Rydex for the global markets" idea? I don't
understand what Rydex does at all.
~~~
tansey
Sure! I just wrote up a lengthy explanation of the idea:
[http://wesleytansey.com/elaborating-on-the-global-rydex-
idea...](http://wesleytansey.com/elaborating-on-the-global-rydex-idea/)
~~~
nl
Thanks - that's very useful
------
inittowinit
User Manual Creator
www.ifixit.com
Micro-Services and Page Composition Problem - malandrew
http://dejanglozic.com/2014/10/20/micro-services-and-page-composition-problem/
======
AdrianRossouw
Dejan has a really great set of posts about microservices. His blog is highly
recommended.
------
CmonDev
Seems to be quite complicated.
A practical relational query compiler in 500 lines of code - ot
http://scattered-thoughts.net/blog/2016/10/11/a-practical-relational-query-compiler-in-500-lines/
======
gravypod
I don't like the idea of executing "real" code in a database. A DB's parser shouldn't allow a query to make network connections or do I/O.
~~~
makmanalp
Well, in an actual system, this would be sandboxed and stuff.
If this all sounds dubious, consider:
In modern databases these days, most everything already lives in RAM, and CPUs
are getting incredibly fast, meaning most of the time the CPU is sitting there
doing nothing, and the bottleneck is bringing stuff up and down from memory
into the CPU cache(s) and registers. There is orders of magnitude difference
in the speed of accessing stuff from registers vs L1/L2 vs main memory.
Now, usually, for sequential access to memory, the hardware is smart and will
pre-fetch stuff into the cache, causing near-optimal performance. The more
random access to memory you have in your code, the more cache misses, and the
worse the performance. Now, imagine having a function call! This means
swapping the stack, code, variables all in and out. And often this is a
virtual pointer (e.g. MySpecialProcessor->next_tuple()) which is even more
indirection and more cache misses. On top of that, there's branches:
abstracted code has if's and special cases for many situations that we may
already know will never happen in our current query, so why have them? And
branches are also bad for other reasons to do with pipelining in modern CPUs.
Additionally, the code you're using is competing for space in the upper levels
of the memory hierarchy with the data you're trying to process. Meaning if
your code is large, then parts of it need to be paged in and out while you
loop through your data, which again, is sloooooooow.
If this sounds like hyper-optimization, consider that this is the inner-most,
hottest loop in the entire database software, and gets called millions and
billions of times per second. Trust me when I say this _does_ make orders of
magnitude differences overall. For a toy example, check out:
[https://www.naftaliharris.com/blog/2x-speedup-with-one-
line-...](https://www.naftaliharris.com/blog/2x-speedup-with-one-line-of-
code/)
Also we'll see this method being used to some degree in Apache Spark:
[https://databricks.com/blog/2016/05/23/apache-spark-as-a-
com...](https://databricks.com/blog/2016/05/23/apache-spark-as-a-compiler-
joining-a-billion-rows-per-second-on-a-laptop.html)
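To caricature the difference in Python (purely an illustration of the access pattern, not how any real engine is written): the interpreted, volcano-style plan pays an indirect call and a few branches per tuple, while the "compiled" plan collapses to the loop you'd have written by hand:

    # Volcano-style: one next() call per tuple, through operator objects.
    class Scan:
        def __init__(self, rows):
            self.it = iter(rows)
        def next(self):
            return next(self.it, None)

    class Filter:
        def __init__(self, child, pred):
            self.child, self.pred = child, pred
        def next(self):
            while (row := self.child.next()) is not None:
                if self.pred(row):
                    return row
            return None

    # "Compiled" SELECT sum(a) WHERE b > 10: the operators collapse into
    # one loop over the data, with no per-tuple calls or dispatch.
    def compiled_query(rows):
        total = 0
        for a, b in rows:
            if b > 10:
                total += a
        return total

    rows = [(1, 5), (2, 20), (3, 30)]
    op = Filter(Scan(rows), lambda r: r[1] > 10)
    total = 0
    while (row := op.next()) is not None:
        total += row[0]
    assert total == compiled_query(rows) == 5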
~~~
gravypod
I'm not saying the performance isn't important in this case, I'm saying it's
an inherently unsafe implementation. I can't think of a single way to secure
the system if the password falls into the wrong hands.
I'd much rather have this done in a query language and have the queries
cached. A compiled query should be very small (maybe only a few instructions)
and if done right could be very quick without any fancy meta evaluation. You'd
also get "some" way to sandbox the system inherently.
The code should only be able to access your data, and if you design a language around that idea you have something infinitely safer than something that isn't.
------
threepipeproblm
Thank you very much for this.
Adobe has released the SWF file format specification - timr
http://www.adobe.com/devnet/swf/
======
wmf
I'll just quote John Gilmore:
"Freeing up specs for things that the community has already reverse-engineered
makes for good publicity, and eliminates a legal EULA issue that Adobe was
likely to lose in court in most countries, but doesn't change anything
substantial."
[http://lists.gnu.org/archive/html/gnash-
dev/2008-05/msg00004...](http://lists.gnu.org/archive/html/gnash-
dev/2008-05/msg00004.html)
------
Hoff
The reception here is somewhat surprising. This release should be welcomed.
Encouraged. Supported.
Open specifications do change the landscape.
For all sides involved. For Adobe, for Adobe's commercial competition and
current and future Adobe partners, and for the Open Source world.
Having both open file format specifications and competing products perceived
to have fewer features or lower quality allows you advantages in product bids.
It's a good place to be for Adobe; it's a low-cost variation of the freemium
model, and one that Adobe has used before.
Postscript and now PDF have survived in the wild for some years now, though
you'll particularly note that the specs for PDF have been a moving target. It
is nearly certain you'll see these newly-released specifications evolve over
time.
Open specifications can inoculate Adobe against anti-competitive legal
proceedings that can arise in various jurisdictions.
It also means that other entities are open to develop both competing and
supporting products and to participate in the community without fear of
exposure to the (il)legalities of reverse-engineering. And from the technical
difficulties and the upgrade- and compatibility-related risks of reverse-
engineering -- available specifications make for better open source, and more-
and cross-compatible products.
And the release means that an organization that is a sufficiently large target
to draw the legal attention of Adobe is not going to get tangled up with Adobe
in the process of actually proving that the EULA was unenforceable in various
courts.
Yes, Adobe likely has underlying competitive motivations that supported the
specification release. They're a commercial entity, and they make money
selling software products. Regardless, Adobe appears to have done a good and
useful thing for the community.
I'll thank Adobe for releasing the SWF specifications.
------
bporterfield
I think that the point is being missed here. Adobe's intentions probably have
little to do with player development and a lot more to do with Silverlight &
Canvas.
Adobe's hope is that opening a platform with 98% penetration will further
increase platform acceptance as the standard for rich media on the web.
Think about it from a developer's POV - let's say that you're making a choice
between two rich media platforms on the web - both proprietary, closed systems
that require proprietary plugins and code built on proprietary languages.
Tough choice, eh? Maybe you pick the one with a language you're more familiar
with, or one thats been really hyped recently.
Now consider a slightly different scenario: What if you had to choose between
an open, standardized platform built on an open language with open-source
tools that can create applications that run on the web in any browser or your
mac, windows or linux desktop. How does that compare with the proprietary
choice?
Adobe has consistently moved towards an open solution both for its development
tools and its platform, no doubt in response to competitive threats. Microsoft
is trying to inch that way as well but seems a step behind.
------
izak30
Probably has to do with the recent canvas progress, and John Resig's new
javascript. <-- total speculation
~~~
rtf
This is a somewhat blunt way to put it, but it has to do with "you weren't
paying attention to the Adobe Open Screen announcement on May 1st." SWF, FLV,
and AMF were all freed then and, apart from developers already working with
Flash, it didn't seem to make great waves because of the sudden, low-hype
nature of the announcement. A lot of the comments that day - here and
elsewhere - were confused, cynical remarks along the lines of "I'll believe it
when I see it," when they had _already_ put everything up.
The direction of Flash has been towards greater openness since around the time
Adobe acquired Macromedia in 2005: One year later they released a new major
version (9) with a new VM and a completely refactored API, and along with that
they released an open source Flex 2 compiler.
Adobe would probably open source their own player and ensure it becomes the
"de-facto standard implementation," if it were not licensing-encumbered.
~~~
izak30
Thanks for the clarification. Not that I think it could be done in a few days,
but I still see it as reactionary to developments in other technologies.
Ask HN: are there any "true" rockstars or celebrities who are programmers? - hoodoof
Wondering if there are any well known music rockstars or hollywood-type celebrities who are also programmers?
======
sc68cal
Joel Thomas Zimmerman aka Deadmau5 does some programming for his Monome.
Link: [http://www.wired.com/video/dj-deadmau5-is-a-gear-
head/203906...](http://www.wired.com/video/dj-deadmau5-is-a-gear-
head/20390630001)
James Zabiela also plays his music on his iPad (Hooks up to Ableton Live)
Demo: <http://www.youtube.com/watch?v=jasVTIHP4mA>
On stage: <http://www.youtube.com/watch?v=n9st7RDn8oc>
------
nostrademons
Jordan Rudess (keyboardist of Dream Theater) has apparently released a couple
of iPhone synthesizer apps, though I dunno whether he actually wrote them
himself or collaborated with an experienced iPhone developer to produce them.
Also, Tom Scholz (frontman of Boston) was a kick-ass MIT-trained mechanical
engineer who's done a lot of hardware hacking, including a line of devices
that give you smooth tube amp tone at non-eardrum-popping volume levels.
------
mechanical_fish
Jeff Robbins:
<http://www.lullabot.com/about/team/jeff-robbins>
I know that lots of other musicians program as well, of course, but not
necessarily ostentatiously.
------
noahth
If you're willing to call Max/MSP/etc work programming (I am but I'm sure some
here will be willing to dispute that), you can add a lot of musicians to the
list, including Jonny Greenwood of Radiohead and a whole bunch of electronic
artists.
------
turbojerry
I'm not sure if Penn Jillette has done any programming. I do know he uses Linux, wrote for PC/Computing and was in Hackers, so there is a good chance he has at least written a shell script.
------
zerohp
Todd Rundgren is an actual rock star (with a certified gold LP).
<http://trconnection.com/trbio.html>
------
mthreat
Wiley Wiggins is an actor - he played the kid "Mitch Kramer" in Richard
Linklater's (excellent) 1993 film _Dazed n' Confused_ -
<http://en.wikipedia.org/wiki/Dazed_and_Confused> He's been in several other
movies including Love & a .45, The Faculty, Waking Life, and Sorry, Thanks.
Now he's working on a kickstarter project, an iPad game called Thunderbeam! -
check it out:
[http://www.kickstarter.com/projects/wileywiggins/thunderbeam...](http://www.kickstarter.com/projects/wileywiggins/thunderbeam-
for-the-ipad)
There's also Ashton Kutcher, who said at TechCrunch Disrupt that he had
learned a bit of programming in college, but I'm not sure he would consider
himself a programmer.
Sexting, your lack of privacy and the iPad: a perfect storm? - apress
http://gravitationalpull.net/wp/?p=1310
======
apress
Post from last year worth thinking about again in light of the 7th Circuit court decision that allowed the firing of a guy for violating his employer's acceptable use policy even though he was doing it on his own time.
Data Import module: please break and tell - nlabs
http://ec2-174-129-137-205.compute-1.amazonaws.com/uploadtest/DataImport.html
I wrote this data import module this morning. It reads csv and xls flat data files, imports them into my db and displays a preview to the user. Please try it out and let me know what breaks.
======
nlabs
I wrote this data import module for my web app this morning. It reads csv and
xls flat file data and imports it into my database and then displays a preview
of the data to the user. If you have time please try it out and let me know
any problems. Thank you.