Pass data or events from a parent component to a child component in both functional and class components.
Pass data or events from a parent component to a child component in both functional and class components. I still remember the day when I gave an interview at one of the reputed MNCs in India. In the first round, the interviewer was a lady and she asked me at least 50 questions on React, HTML, and CSS. I got most of the answers correct and was selected for the next round. In the next round, there was a guy who took my interview. First, he asked me to rate my skills out of 5 in React, HTML, and CSS. I gave 4 to HTML, 4 to CSS, and 2 to React😅. And you know what happened next: he started asking me questions only from React 😁. Ok, coming to the point, he asked me a simple, basic question of React: how to pass an event from a parent component to the child component without using any external library. And I didn’t know the answer 😞. But now I know 🙃. So here is a clear explanation of how to pass data or events between a parent component and a child component in different ways. First Method — Functional child component to parent functional component 1. Create a child component and put the below code inside that component.
import React from "react";

export default function Child({ data, onChildClick }) {
  return (
    <div className="child">
      <button onClick={onChildClick}>{data}</button>
    </div>
  );
}
Here we have created a Child function, and inside it we are creating a simple button whose text (data) and onClick event come from the parent component. We receive the data and onChildClick props as a destructured object argument in the child function. 2. Now we will import the child component into our parent component.
import Child from './Child'
3. Then, inside your parent function, create another function to run our desired event and render the child component in the return.
import React from "react";
import "./styles.css";
import Child from './Child'

export default function Parent() {
  function clickAlert() {
    alert("I am working")
  }
  return (
    <div className="App">
      <Child data="Click here" onChildClick={clickAlert} />
    </div>
  );
}
Here you can see that we have passed "Click here" as data and the clickAlert function as the onChildClick event from the parent to the child. Now when we click on the button, it will run our clickAlert function, which will basically alert “I am working”. Here is the live example
https://medium.com/how-to-react/pass-data-or-event-from-a-child-component-to-parent-component-in-both-functional-and-class-ae2f8b7ccda2
['Manish Mandal']
2020-08-08 07:15:42.575000+00:00
['React Native', 'Components', 'React', 'Reactjs']
Practical Ways For Performers to Make Money
Practical Ways For Performers to Make Money Just because your show has closed, it doesn’t mean the bills stop coming in Photo by David Hofmann on Unsplash It can only be read as an understatement if I were to say that we live in difficult or depressingly unique times. It seems only moments ago that I was underneath a cruise ship stage in full costume, joking with a technician while I waited for my cue. After ten minutes, the technician would be given the all clear, which would be my signal to duck my head and shuffle onto the waiting lift. I would stand there and complain to anyone that would listen about how much my shoulders hurt while standing in such an awkward position. Could anything be worse than this? Things Became Worse I’m a teacher now, and I wasn’t even forced into the job. I quit my cruise ship job while cruise ships were still sailing, and even considered going back once or twice. But now, everyone I ever performed with is out of the job, and many of them are wondering what to do next. Their plan was to perform aboard cruise ship shows until their agent finally took them seriously and booked them speaking gigs on TV shows. This would then naturally transition into movies and eventually, fame and fortune. These days, only those that are firmly established and have already made their fortune are able to ride out the coma that the entertainment industry has found itself in. Everyone else is going to need to completely change industries until their chosen profession wakes back up. But what choices do you have? Photo by Kat Stokes on Unsplash Writing More than anyone else, a performer has the necessary life experience to write and self-publish a compelling book. By enlisting the help of friends who are talented at proof-reading and editing, anyone can get started as a writer. All that’s needed is a strong recollection of what you went through in getting onto the stage, and what your life was like once you found yourself there. There are millions of people all around the world who dream of a life on-stage, and have now found themselves bored and in need of something compelling to read. It’s your responsibility to provide that for them in the form of an exciting memoir! Talk about how you trained for the role, and the horrible audition process you went through. Discuss the times that you failed, and how these emotions defined the person you became. But don’t leave out all the times you felt on top of the world, what it felt like when the jokes landed, or when the audience gave you a standing ovation. Leave nothing out, and you’ll find yourself with a best seller on your hands. Oh, and let me know when the book is released, because I’m also in need of something new and wonderful to read. Photo by Yura Fresh on Unsplash App Development Apple just made news with the unveiling of their App Store Small Business Program. A knee-jerk reaction from their lawsuit with Epic Games, Apple is rewarding anyone who has an app on their store that makes under a million dollars a year. The program allows any developer who isn’t AAA to pay only a 15% cut of their profits to Apple, instead of the usual 30%. For the vast majority of developers, this is extremely good news because a 15% increase in earnings can make a massive difference to their profitability. Although Apple will never admit it, this development would probably never have happened if it weren’t for Epic Games suing Apple over what it believes is anti-trust behaviour. 
Epic Games believes that competing app stores should be allowed on Apple devices, because they really don’t like having to pay 30% of their enormous micro-transaction profits to the tech giant. This new program was obviously designed to leave out developers like Epic Games, who make substantially more than a million dollars a year from their app. (These are the developers of Fortnight). But what does this mean for you? It means that it’s time to develop an app. There must have been a time in your life when you noticed that an app would solve a problem you had, but none existed. Think back to these times, and when you’ve landed on the perfect idea, start learning how to get it done. Courses can be found everywhere, including sites such as Code Academy, Udemy, The Open Code, and many others. This is the time to educate yourself and transition to a future-proof skill that will keep you paid throughout any future pandemic, injury, age, or grumpy casting director. The pain of the learning curve now may lead to piles of cash a year from now. Photo by John Schnobrich on Unsplash Online Courses This avenue is a no brainer for performers, because we already have the ideal camera personality and training, and a lot of skills that people want to learn. Whether you’re a singer, actor, dancer, puppeteer, or anything else; now is the time to record an online course. Put on some make-up, set up some good lighting, and record a series of videos that train the next generation of performers in the skills that landed you a career. Teach them all the little tricks and techniques that caught the eye of casting directors, and kept you employed. Sites like Udemy are a great place to host your courses and get paid for your art once again. The best part is, most existing courses are low quality garbage, so all you need to stand out is competence and effort! And if you were a performer for a living, then you have both of those things in spades. Most people teaching performance have never performed for money, so you automatically have an enormous advantage and should have no trouble finding your customers. So go ahead and start planning your course right away, there are millions of bored house-bound dreamers who are waiting for someone like you to fill them with hope, and teach them how it’s done. So Get Started! It truly sucks that you’re out of the job, but that doesn’t have to mean that this is the end. Now is the time to side-step into something that’s going to keep you fulfilled, and pay your rent. Write a book about your experiences, while also recording Udemy classes with your phone and a couple of high quality lights. At the same time, take some coding classes at night to future-proof your skill set. Not many app developers are performers, so it may take a performer to think of the next great idea that current tech geniuses would never think of. It’s up to you to keep yourself going until performing arts is resuscitated and you can get back on the stage. So get to work. *No links in this article are affiliated or compensatory in any way. They’re just honest recommendations.
https://medium.com/money-clip/practical-ways-for-performers-to-make-money-1b91f51fb5ef
['Jordan Fraser']
2020-11-20 09:34:48.342000+00:00
['Finance', 'Acting', 'Money', 'Performance', 'Entrepreneurship']
Kalman Filter in a Nutshell
Drop a Tennis Ball So let’s get started with our example. I am dropping a tennis ball from a helicopter (or maybe my airship), and I would like to predict its position in the air until it hits the ground. Tennis Ball Dropping (Image by Author) You would think this is a very simple problem, right? After all, we’ve got our equations of motion: the height of a dropped ball follows y(t) = y₀ − ½gt² and its speed grows as v(t) = gt (Equations of Motion, Image by Author), where g is acceleration due to gravity (9.81 meters per second squared). Well, unfortunately the real world is not that simple. If you are dropping a lead ball, I suppose the equations of motion are all you need, but for something light like a tennis ball, its trajectory might be affected by all sorts of things. For example, there might be a really strong wind blowing. Wind Blowing (Image by Author) Or a really hungry alligator might be nearby… Hungry Alligator (Image by Author) The real world is complicated. So, if your prediction comes solely from the equations, it might be really off by the time the ball reaches the ground. Let’s take a closer look.
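To make this concrete, here is a minimal sketch, not from the original article, of a 1-D Kalman filter tracking the falling ball’s height from noisy altitude measurements. The gravity-only model drives the predict step, and the process-noise term Q is what leaves room for unmodeled effects like wind; the noise levels, the simulated sensor, and the trajectory are all illustrative assumptions.

# Minimal 1-D Kalman filter for a falling ball: state x = [height, velocity].
# Illustrative sketch only; noise levels and the simulated sensor are assumptions.
import numpy as np

dt, g = 0.1, 9.81
F = np.array([[1.0, dt],             # state transition: constant-acceleration kinematics
              [0.0, 1.0]])
B = np.array([-0.5 * dt**2, -dt])    # how gravity enters position and velocity
H = np.array([[1.0, 0.0]])           # we only measure height
Q = 0.5 * np.eye(2)                  # process noise: room for wind, drag, alligators...
R = np.array([[4.0]])                # measurement noise of the altitude sensor

x = np.array([100.0, 0.0])           # initial estimate: 100 m up, at rest
P = np.eye(2)                        # initial estimate covariance

rng = np.random.default_rng(0)
true_state = np.array([100.0, 0.0])
for _ in range(20):
    # Simulate the "real world" (here: pure free fall plus sensor noise).
    true_state = F @ true_state + B * g
    z = true_state[0] + rng.normal(0.0, 2.0)

    # Predict: push the estimate through the equations of motion.
    x = F @ x + B * g
    P = F @ P @ F.T + Q

    # Update: correct the prediction with the noisy measurement.
    innovation = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ innovation
    P = (np.eye(2) - K @ H) @ P
    print(f"measured {z:7.2f} m, filtered height {x[0]:7.2f} m")

Setting Q to zero would make the filter trust the free-fall model completely; increasing it tells the filter to lean more on the measurements, which is exactly the trade-off the article is setting up with the wind and the alligator.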
https://towardsdatascience.com/kalman-filter-in-a-nutshell-e66154a06862
['Shuo Wang']
2020-12-06 04:43:50.341000+00:00
['Python', 'Kalman Filter', 'Data Science', 'Statistics', 'Programming']
Fragmented Cities/Towns in Time: (Inner-city) London, Leicestershire and Aberystwyth
The stereotypes and poster children of London are The Queen, Big Ben and red post boxes. But what if I told you that there is another side of London? Away from the glitz and glamour of the capital’s centre lie dark alleyways and numerous corner shops. And somewhere amongst all of that is where I grew up. Somewhere in inner-city London Dog sh*t and sitting pigeons. Pavements and lumpy roads. Underlying all this are the minor details. The worn roads and the weeds overgrown. The industrial landscape infested with concrete homes. The smell of weed and cigarette smoke. The small patches of grass, struggling to stay alive, underneath a polluted sky. Photo Credit: Shakeb Tawheed for Pexels. Moving away from home at various stages in my life, I was excited to breathe in the fresh air of opportunities. Little did I know that I would miss my home as quickly as I did.
https://medium.com/show-your-city/fragmented-cities-towns-in-time-inner-city-london-leicestershire-and-aberystwyth-d602dd5463d
['Lexus Ndiwe']
2020-12-16 17:05:11.783000+00:00
['Memories', 'Home', 'Cities', 'Travel', 'Writing']
My 4 Year Old is Now a Villain
My 4 Year Old is Now a Villain I think the virus is making her want to be Harley Quinn “Harley Quinn/Suicide Squad”-IMDB poster The pandemic is scaring the bejeezus out of us adults and our children aren’t immune to the stress caused by that fear. Whether they directly fear it themselves or are just feeling our reactions, their emotions are completely understandable. Some kids get depressed, act out, can’t sleep, or are anxious. And then there is my child — who has decided to become a villain. The YouTube catalyst It all started with her watching the “Soty” family, short for “Shot of the Yeagers,” on YouTube. In her eyes, they are the most energetic, fun family — for example, they have the propensity to do things like go to playgrounds to play “lava monster” together. In addition to this activity, they like to pretend like they are fighting “villains.” I think this was her first exposure to the concept of fictional bad guys. This idea of villains really got her attention. So, it’s no wonder that, when she saw an ad for the “Joker,” she kept saying: I.Have.To.See.That.Movie. Now, she is 4 years old. I did not like the idea of her watching anything that heavy. However, I also did not want to stonewall any interest of hers, I redirected her instead to the “Ninja Kidz” YouTube videos. These feature kids dressed as super-villains interactive, such as mini-Joker fighting mini-Batman. And, in the first video I showed her, my daughter was introduced to Harley Quinn. Screenshot: Ninja Kidz on Youtube It was love at first sight. She talked a mile a minute after seeing this video and asked me a whole bunch of questions. Who is Harley Quinn? Why does she do what she does? I have to admit, I’m not up on my DC or Marvel comics knowledge. I asked her 19 year old brother to help her understand this interesting new character. Bubby to the rescue He proceeded to explain the backstory of Harley Quinn; as a long-time comic aficionado, he was primed with all sorts of expertise. This interest started for him around four years old, as well. I remember him dressing up as Spider-Man and I would be Mary Jane. He would pretend to save me over and over again. He had also been Joker for one Halloween as a child; he really enjoyed the newest movie version of the villain. Thus, he was very willing to impart his brotherly wisdom to his little sister. DC Universe Harley Quinn — IMDB He pulled up some more videos featuring Harley Quinn. They were animated and quite campy. Nothing that would scare his sister. She was enthralled and ready for more. She listened intently to her Bubby explain Harley Quinn’s backstory and what made this villain tick. If you aren’t familiar with this character, she is basically Joker’s girlfriend. She was once his psychiatrist while he was in an asylum. Her relationship with him ends up making her go insane as well, and she ends up becoming a fellow villain. Harley Quinn is quite sassy and basically just naughty. She’s strong and quick, and fights well. But more than that, she’s carefree. She doesn’t act afraid, even when her life is clearly in danger. She just laughs danger off with style. Suicide Squad Poster IMDB I mean, what better attitude to have when your life is now defined by a pandemic. Fantasy vs. our reality Because my daughter knows what is going on. I’ve had to be upfront with her about why I was using hand sanitizer on her when we were going to the grocery store weeks ago. I have to explain when she asks why we can’t go to the park. 
Or when she asks why she couldn’t see her best friend or go do gymnastics. I explained it to her simply, saying there is a virus making people sick and we had to stay home to stay safe. We also had to wear masks and use magic hand sanitizer to stay safe if we went out. But home is safe? Right, Mommy? She asked for assurance many times. And I’d say yes, sad I had to assure her at all. Fantasy helps her cope “People like us don’t get to do normal.” — Harley Quinn, Suicide Squad. The fact is, life isn’t normal right now. In a 4-year-old’s life, not being able to go to Walmart to get a Barbie doll is just as life-altering as my husband teaching all his college literature classes now at home. My having to explain I was watching the President to learn how we could keep her safe just isn’t normal. And, it makes sense to me that becoming a sassy, carefree, badass villain is a great way to cope. After watching various videos online, she asked me if we could watch grown-up versions with these super-villains, like with her interest in the Joker movie. She’s pretty good at self-regulating when she’s scared of something and will ask to turn something off when she’s reached her limit, or will go off to entertain herself. She was still stuck on the idea of seeing the “Joker.” I still wasn’t sure if that was appropriate for her, so we put on “The Dark Knight.” This had been one of her older brother’s favorites. He and my daughter watched it together. She sat in her “ball pit,” a playpen full of plastic balls, and he sat on the couch. I took the opportunity to clean the house. As I made my way back into the living room, I saw that she was captivated by the movie. This surprised me; I had thought the pace of the movie and aesthetics would either bore or frighten her. There were moments when she hid under a blanket or fiddled with toys; otherwise she was really into it. After this, she was ready for more, but this time she made it clear she wanted a grown-up movie about Harley Quinn. I saw that Birds of Prey, featuring Harley Quinn, had come out on demand and I put it on for her. I figured that she’d either get bored or it would engage her like the Batman movie. Birds of Prey Poster IMDB Again, she watched it with rapt attention. She’s the type of child who bawled throughout “Trolls World Tour” because she was afraid Poppy wouldn’t go back home. I was surprised these comic-based movies featuring adults weren’t, as she likes to say, “freaking her out.” The emotions displayed in Trolls were more realistic. The comic movies don’t include being afraid, not being safe at home, or being apart from loved ones: those are just too real. The emotions expressed by super villains and heroes are powerful, but they involve a sense of control. The fight scenes are campy and without nuance. Yes, there are scary moments, but the villains don’t demonstrate any fear. Harley Quinn, in particular, faces danger with much aplomb and humor. More, more Harley Quinn We went on to buy my daughter a Harley Quinn action figure and a costume. It might be an unlikely interest for a 4-year-old. But, I think exploring “villainous” expressions of emotions in such an abnormal time can be healthy. Toy, ebay As an adult, I can’t be a villain. I don’t want to be. I want to be her safe haven. I like that she has a way to process her anger and fear in a safe way. I can’t always keep her safe, unfortunately. But I can help her feel that way. Harley Quinn is useful for that right now. We probably won’t go back to normal, or maybe even a new normal, for a while.
And, I keep trying to find ways to engage her while we stay home. I want to help her cope while she doesn’t have the normality of friends and external activities. This interest in villains serves more than one purpose. Entertainment and emotional release I can’t say that without the pandemic she would want to be a villain. I’ve read that playing super heroes is normal around her age, probably for much of the same reason. It’s a way to feel empowered at a particularly vulnerable age. Who wouldn’t want to feel that way, at least for awhile? Fantasy can be a great coping mechanism. As long as I can keep telling her home is safe — and believing it myself — I take solace that her fears can be expressed through fantasy. So, I will readjust her costume, tighten her pony tails, and send her off to play. My little villain, Harley Quinn, going on to fight another day.
https://medium.com/pop-off/my-4-year-old-is-now-a-villain-8861262cd095
['Melissa Miles Mccarter']
2020-05-06 01:32:39.895000+00:00
['Movies', 'Family', 'Parenting', 'Pop Culture', 'Coronavirus']
7 Traits of Successful Developers
1. Coding Is Not the First Priority Generally, most programmers have this habit of rushing towards coding as soon as they get a problem. When I started my professional career, I was always excited to code. Whenever I got a task, I immediately rushed to my editor and started coding. For this reason, most of the time, my code was buggy. I wasn’t writing the wrong code, but I was writing it without properly analyzing the requirements first. One of the most important aspects of programming is not just how you add code to an application but how you make sure that it won’t break the existing functionality. One of my senior developers once said, “Programming is an art. Coding is merely 20-30% of it, and the rest is analysis.” Even a mediocre programmer can code if they know the answers to “what” and “where”. If you give 80% of your time to the proper analysis of the problem, then you will find it easier to code in the remaining time.
https://medium.com/better-programming/7-traits-of-successful-developers-593701fa127c
['Shubham Pathania']
2020-11-05 16:39:37.949000+00:00
['Programming', 'Coding', 'Software Development', 'Learning To Code', 'Software Engineering']
How to deploy analytics workloads
“It works on my machine”. That’s great. But now how do you make sure it runs in production, repeatedly and reliably? Here we share our lessons learned from deploying many analytics solutions at clients. By the way, what do we mean by “analytics workloads”? It’s any workload where you send data to an algorithm and you create some kind of insight. That can be an ML algorithm, that can be a data cleaning job, data integration, NLP processing, … Any piece of the data pipeline really. Deployments are the most important thing you do A typical development lifecycle looks like this: From coffee to code to insight In this cycle, deploying your code is the most important thing you do, because it’s only then that a client can get access to your work. Everything that goes before that is Work In Progress. Everything that comes after that is Business Value (hopefully). This is often forgotten in data analytics projects. A lot of time is spent on improving the model, ingesting more data, building more features. But as long as you don’t bring your insights to your clients, you are not delivering any value. Well, how often should you deploy then? We always promote the notion of doing 10 deploys per day (based on the great book The Phoenix Project). That means that your data team pushes 10 new valuable things to downstream consumers every single day. This is unattainable for a lot of companies that still do quarterly releases or monthly releases. If that is the case, try to bring it down to weekly releases, or maybe even daily releases. You will discover roadblocks along the way. Removing those roadblocks will make your team more efficient and will allow you to deliver results to customers faster. It will also let you learn faster from the feedback of customers, so you can adjust course quicker. What are your options? There are several ways that we see analytics workloads being deployed, and we would like to evaluate them on two axes: Effectiveness: How good are your deployments? Are there limitations in what you can deploy? How often can you do deploys? Are the deployments high quality? Can they be stable / easy to monitor? Feasibility: How easy is it to get started with this? Do you have to learn new technologies? Will you have to spend a lot of time building the deployment pipeline? Can you have one deployment mechanism for multiple processing frameworks? If you plot the different deployment options on a 2x2 matrix, along these axes, you get the following image: Sorry, every consultant has to make a 2x2 matrix This is of course not 100% correct or the complete picture. But these are the deployment mechanisms we’ve seen a lot at clients, and we think this framework helps in deciding which deployment option is right for your use case. Let’s go over them one by one: Deploys to Pet VMs (Low feasibility / low effectiveness) Use when: Never. Avoid at all cost. Sometimes it’s impossible to avoid. A Pet VM is a machine you name and you really care about. It’s precious and you manually make sure it stays alive. The good thing about Pet VMs is that it’s probably a technology you know and sometimes the only option. The downsides are that the yearly license cost of a Pet is expensive, maintaining a Pet takes a lot of your time, deployments are manual and error-prone, and it’s hard to recover when it goes south.
Run on Laptop and Copy (High feasibility / low effectiveness) Use when: MVPs and Demos You do all calculations locally and you upload the resulting model, calculations, insights, … to a central server. It’s relatively easy to do and you have full control of your own tooling. But it’s hard to scale, limited by the capacity of your laptop, and of course it comes with its own security issues. It is also very error-prone. Run on Notebook Sandbox (High feasibility / low effectiveness) Use when: MVPs and Demos What is it? You open a cloud notebook with a connection to production data. You build your data pipelines right then and there and then you schedule your notebook. Finally, you cry while you do support on this thing. This works because it is relatively easy to do, a notebook in the cloud can scale, and multiple people can work on the same notebook. But having multiple people on the same notebook also brings some chaos into your system. There is no version control, poor error handling and poor testing. Often this leads to infinite spaghetti code, no modularity, and limited visibility on what is in production. => This is a way of running production systems that we see way too often and that is actively being promoted by some vendors. In software engineering, that’s the equivalent of manually updating the PHP files of a live website. Great for a hobby project. No serious company would work this way. Automated notebooks (Low feasibility / High effectiveness) Use when: Mature teams with strong devops skills When we challenge the notebook approach, people often push back and point to systems like Netflix, where they automated the entire notebook experience, structured the code, implemented a scheduler, have a logging solution and so forth… Notebook architecture at Netflix As you can see, it sounds easy, but to do it well you still need to build a lot. They recommend building an actual application when your notebook is too big. Then, they install your library in the notebook environment, and from the notebook you can then just call your library’s main function(s) and schedule those. What is also very interesting is the fact that they store notebook runs as immutable traces of your application, so you can always check the notebook of a certain run for errors and launch it from there to debug. All in all, you can see that making a proper scheduled notebook environment requires a lot of work and engineering. And what works for Netflix might not work for you, or be complete overkill. Docker and Kubernetes (Low feasibility / High effectiveness) Use when: Mature teams with strong devops skills Kubernetes is a container platform that allows you to build platforms at scale. It was born in 2014 out of Google and it’s a great building block for your data platform. On top of that, it has a large and growing ecosystem. Most software integrates or is integrating with Kubernetes. It probably is the platform of the future. And you have fast startup for your applications. The downside is that the future is not there yet. It’s a great building block, but deep knowledge of Kubernetes is needed to build a data platform. And not a lot of people have this knowledge currently. Containers or Jars + PaaS API (High feasibility / High effectiveness) Use when: Adopt for most data teams In this approach you basically take what the cloud vendors offer you and you rely on that as much as possible. Examples are Google Cloud Dataflow, Amazon EMR, Azure Batch, … This approach works because a lot of the complex work is done for you.
These services are stable and can be used at scale. And they often come with metrics and monitoring integrated. The downsides are that not everything comes out of the box. You still have to do a lot of work yourself, and you need to add a lot of glue to get going. There is always a risk of vendor lock-in if you rely too much on their tooling. And autoscaling and alerting are things you often need to build yourself. Use frameworks (High feasibility / High effectiveness) Use when: Adopt for most data teams You’re not the first one in this situation. Deployment patterns emerge, and frameworks help you automate the production-grade deployment of code. Tooling like Netlify and Serverless.com are examples of this approach. We have recently launched Datafy, a data engineering framework for building, deploying and monitoring analytics workloads at scale. Screenshot of Datafy The good thing about these frameworks is that most of the complex work is done for you. You follow industry best practices. And monitoring and scaling come out of the box. It makes sure you’re up and running in no time. The downside of a framework is that it is always tailored to specific needs, so you are always constrained to what the framework offers. So it’s a matter of choosing the right framework for you. How does this work at Datafy? Deployments are the most important thing you do. That’s why we make it super easy and quick to do at least 10 deploys per day, through the use of the CLI: Create a new project datafy project new --name analyticspipeline --template project/python This will set up a code structure, create a data pipeline with a sample job in it plus a unit test, add a Dockerfile, configure a Python virtualenv and basically do all the scaffolding for you. Creating an analyticspipeline is a single command line As a result, you get a project setup with all the scaffolding done Build the project That’s also a single command line: datafy project build Building a project What this will do is wrap your code in a Docker container and push that container to a Container Registry in your cloud, in this case ECR. Deploy your project That’s, as you guessed it, another single command line: datafy project deploy --env dev --wait And about 2 min later, you’re done. Your analytics pipeline is live. So what? Why is this important again? Whether you use Datafy, take something else off the shelf, or build your own system, it is important to be able to deploy 10x per day. Organisations who do this are able to deliver results to customers faster, have higher ROI on their data investments and, most importantly, learn from the feedback of their customers. This blog is a written version of a webinar we hosted earlier, which you can rewatch on YouTube here:
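To ground this, here is a generic sketch of the kind of job such scaffolding typically wraps. This is not Datafy’s actual template; the file, function, and environment-variable names are hypothetical. The point is that a deployable analytics workload boils down to a small, self-contained entrypoint that takes all configuration from arguments or environment variables and keeps the transformation in a pure function a unit test can call.

# main.py — hypothetical minimal analytics job, written so that any of the options
# above (container + PaaS, Kubernetes, or a framework) can build and run it.
import argparse
import json
import os


def clean_records(records):
    """Pure transformation: drop rows without an id and lowercase names."""
    return [
        {**r, "name": r["name"].lower()}
        for r in records
        if r.get("id") is not None
    ]


def run(input_path, output_path):
    """Read, transform, write; returns the number of records written."""
    with open(input_path) as f:
        records = json.load(f)
    cleaned = clean_records(records)
    with open(output_path, "w") as f:
        json.dump(cleaned, f)
    return len(cleaned)


if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Sample analytics job")
    parser.add_argument("--input", default=os.environ.get("JOB_INPUT", "in.json"))
    parser.add_argument("--output", default=os.environ.get("JOB_OUTPUT", "out.json"))
    args = parser.parse_args()
    print(f"wrote {run(args.input, args.output)} records")

With the job shaped like this, the container image only needs to run python main.py, and a unit test for clean_records is trivial to generate alongside it, which is what makes single-command build and deploy steps possible.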
https://medium.com/datamindedbe/how-to-deploy-analytics-workloads-563279bc9694
['Kris Peeters']
2020-09-11 11:53:53.793000+00:00
['Data Engineering', 'Data Analytics', 'Data', 'Dataops']
Explainable AI in practice
Explainable AI in practice How committing to transparency made us deliver better AI products Photo by Kumpan Electric on Unsplash Introduction There has been quite a bit of debate around black box AI models being applied to real-world problems. Common AI models have become so large and complex that even the developers and product managers building them don’t exactly know what kind of decisions they will make. This has resulted in all sorts of unwanted outcomes and potentially severe negative consequences for society. These have been evidenced by automated decisions with a racist or sexist bias, such as Apple’s infamous credit card launch in late 2019. Or other dangerous consequences, such as a Tesla misinterpreting a 35 miles per hour speed limit sign after it had been altered with a small piece of tape. Even with these risks, the vast majority of people still believe in the power and potential of AI. This article looks at three real-life cases in which harmful consequences of black-box AI were mitigated. It closes off with a 6-step transparency guideline that we have implemented in our own development process at Slimmer AI to ensure we avoid the pitfalls of black-box models. Explainable AI There are techniques and best practices to open up AI models and peek under the hood. Sometimes this is done by selecting a simpler, so-called white box model that lets itself be explained more readily. Other times, this entails using novel techniques or grabbing another AI model with the sole task of explaining the first one. The subfield of AI that is concerned with improving model transparency is called explainable AI (or XAI for short). As is perhaps evident by the name, XAI helps human beings understand why the machine reached a particular decision. If humans need to act on decisions made by the system, it’s often very important for these outcomes to be explainable. Within explainable AI, there is a noteworthy distinction between global interpretability — which enables you to understand how the model generally responds in any situation — and local explanations — which show why a decision was made in one specific case. Global interpretability is very useful for developers and business operators because it improves debugging, understanding and regulatory compliance. Local explanations are also important to end users who want to know how an algorithmic decision came to its conclusion in their specific case and whether it used fair criteria in the process. Do we have time for explainable AI? In a typical AI development workflow, putting in the effort to open up a model might oftentimes feel like time you cannot afford. Machine learning engineers have done their data analyses, feature engineering, model selection and (hyper)parameter tuning in several cycles and finally landed on a model that gives satisfactory, you might even say impressive, results. Product managers have become excited about the model’s performance and its implications for business outcomes. Everyone is on fire to get this thing into production. Why pause now and take the time to play around with the outcomes? Because, in my five years as a machine learning engineer at Slimmer AI, a company with more than 10 years of experience building dozens of AI solutions, I have seen quite a few cases where explainable AI saved the day. Three real-life examples Case 1: Don’t copy that One of the AI products in the Slimmer AI Science team classifies scientific documents based on relevant pharmaceutical information.
When development started a few years ago, we were all excited that the first results seemed very promising. With both very high precision and recall, we were able to pinpoint which documents mattered. To our customer however, an automated decision wasn’t enough. Their workflow — and the pharmaceutical industry itself — is heavily regulated. They needed to know why a document was classified in one category and not another. Initially, we used LIME (paper, code) to get an approximation of which words in the preprocessed text had contributed most to the results. LIME fits a simple white box model on the feature space directly surrounding a data sample. This way, it learns what subtle differences in input values cause one prediction over another and hence, which features — in this case, words — are most important. An example of applying LIME Text Explainer on a scientific document. Highlighted words have contributed the most towards the eventual classification of the paper. Photo by author. To everyone’s surprise, one of the most important “words” that popped up in several texts was the number 169 at the end of the last sentence. We were baffled by this random looking number and checked the original text for clues. It turned out that texts from one specific source almost always included a copyright mark in html tags at the end of the text. After preprocessing, only the number 169 remained. Typically, texts from this source had a higher likelihood of belonging to one specific category, hence the model had picked up on this by using the copyright reference to discern between categories. While it might be a good idea to include a feature that specifies the source of the text, our model would not have been robust if we had put it into production in its current form. All it would take is for this one data source to remove the copyright symbol, or for another source to add it to their texts as well, and our model’s predictions would be incorrect. We improved our preprocessing by properly removing whole html tags, increasing our confidence in the outcomes of this new model in production. Case 2: Do you see what I see? In a pan-European medical partnership we were the designated party to automatically detect skin infections surrounding the driveline tube of patients with a ventricular assist device. The final AI product was an app through which patients could check the wound surrounding their driveline. In case of moderate or severe infection, immediate contact with a physician would be warranted. In the initial dataset, photos of the driveline entering a patient’s body were categorized by human physicians into one of four categories ranging from no to severe skin infection. Because the physicians sometimes disagreed, it wasn’t possible to obtain a perfect score. Even more, the photos were often cluttered with tissues and bandages and the dataset was relatively small. With such ambiguous data, it was even more important to make sure the model made sensible decisions. While deep neural networks often get a bad reputation for being opaque black boxes, a technique called Grad-CAM (paper, code) enables you to visualize what the network is paying attention to when making its decision. For many use cases, this is a helpful technique to determine whether or not you should trust a prediction at face value. Grad-CAM showed that even though our model often made very sensible predictions, it occasionally based its decisions not on the wound surrounding the driveline, but solely on the driveline itself! 
Presumably the small size of the dataset and the prominent appearance of the driveline had given the model the false impression that the orientation of the driveline contained an important clue. Left image shows the driveline tube as the main area of interest. Right image shows the neural network’s interest in the wound itself. Photo by author. This insight prompted us to automatically filter out the driveline from the photos before presenting them to our model. We tested and compared several masking and filling techniques, each time using Grad-CAM to analyze the impact. Eventually, we were able to settle on a specific filling technique that made the model’s predictions a lot more robust. Case 3: Can’t touch this In late 2018, we partnered with an institution to help identify customers at risk of getting into severe debt. The institution intended to reach out to these persons early on to get them on a customized payment plan before their finances would worsen even further. Their database consisted of a multiyear fine and transaction history of millions of customers. They also had access to another database from a different institution, containing additional personal information about their customers. The domain experts involved intuitively felt that this second database would provide valuable information. However, due to privacy concerns, the organization was wary to combine these two databases unless this proved to be significantly beneficial to the cause. Photo by Jonathan Farber on Unsplash We trained two models on the data in the first database: a simple logistic regression and a gradient boosted trees model (XGBoost), both using a large collection of engineered features. XGBoost easily outperformed logistic regression, giving us confidence that a more complex AI model was justified. Next, we tested if we could make these predictions even better by adding the second database. To everyone’s surprise, the resulting model did not lead to any better results. We therefore computed the feature importances of both models. This showed which input characteristics had the largest impact on predicting who would end up with severe debts. For the model using only the first database, one particular customer fine in combination with several payment behaviors proved to be most important. For the model using both databases, it turned out that one specific characteristic from the new database was by far the highest contributing feature. This meant that the second database indeed contained information that significantly contributed to making a correct prediction, just as the domain experts had suspected. But the same information was already captured in the model using only the first database, albeit through a more complex combination of several manually engineered features. This led to a sigh of relief, as the client we had partnered with felt they could safely focus on the first database and uphold their value of privacy by removing any dependency on the second database. Embedding explainable AI into your workflow In the three different problems from the examples above, three different approaches to explainable AI were used. This shows that explainable AI is not just a single trick you can apply to each use case; it is a whole subfield of AI that is waiting to be mastered. The main takeaway is that even though it is essential to have concrete metrics such as F1 scores and accuracy, we must also take the time to get to know and understand our models. 
No metric will capture the insights you can gather from opening up your model and getting an understanding of why it makes the predictions that it does. To make the most of explainable AI, we use a 6-step transparency guideline at Slimmer AI when diving into a new AI adventure: 1. Agree on the goal Before doing any data prep or model selection, align both engineers and business stakeholders on what the model is going to optimize. When building a predictive engine, how do we represent each category? And what are the business consequences if the model misses an instance or when it makes a mistake? The answers to these questions will determine how the model should be fine-tuned and help establish your first step towards a more transparent model. 2. Explore the data Next, get to know your data through descriptive statistics and correlation plots. This will already provide some insight on class distributions, outliers and most relevant features. The results from this step will often cause you to refine the goal from step one. If the data contains personal information, rank the risk of using each feature or specific samples with respect to discriminative outcomes. Unfair bias should be checked at the end of the development cycle, but now is the time to make a first selection. 3. Start simple It’s best to start off with the simplest baseline model you can think of to avoid overfitting on your data. Additionally, simple models are often more interpretable out of the box. Although customers and marketing may love the idea of a deep neural network, if a good ol’ linear regression can do the job just as well, then it’s best to stick with the basics. 4. Get some perspective Before presenting evaluation scores to your team or other stakeholders, look at the relative importance of each feature. Does this make sense? Does domain expertise acknowledge their importance? Could any of these features indicate unfair bias? Present these global model explanations alongside your evaluation scores and bias report to stakeholders to improve the model’s trustworthiness. 5. Zoom in Next, look at the local explanations of both correctly and incorrectly predicted samples. Is it plausible that the samples were misclassified given these features and their values? Often your features can be further fine-tuned based on the information you gather during this review. Once you’re satisfied with the model’s results, report indicative sample explanations so that stakeholders gain an intuitive understanding of the model’s limitations. If possible, present explanations alongside predictions once your model goes into production. 6. Monitor Once your model is in production, make sure to continuously check for changes in the input data which could cause your model’s performance to degrade over time. Monitoring data drift and real-time performance will increase confidence in the model’s robustness. Graphic by Author Closing thoughts I hope these examples have made you excited to embrace explainable AI in your own workflow. It’s a topic I’m very passionate about and at Slimmer AI we are constantly monitoring and testing novel ways to improve our models’ transparency. Our current R&D focuses on whether some of our high performing tree ensembles could potentially be replaced by a more white-box Explainable Boosting Machine. What is your favorite technique? And do you have any other examples where explainable AI saved the day? Please comment here or reach out to me to discuss!
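For readers who want to try the kind of local-explanation check from Case 1 themselves, here is a minimal sketch using the lime package with a scikit-learn text pipeline. The toy documents, labels and model are illustrative assumptions, not Slimmer AI’s actual classifier; only the usage pattern matters.

# A toy local-explanation check in the spirit of Case 1.
from lime.lime_text import LimeTextExplainer
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "adverse drug reaction reported after phase two trial 169",
    "novel synthesis route for a common polymer",
    "patient cohort shows side effects of the new compound 169",
    "review of corrosion resistance in marine alloys",
]
train_labels = [1, 0, 1, 0]  # 1 = pharmaceutically relevant, 0 = not

# Simple stand-in for the real classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

explainer = LimeTextExplainer(class_names=["irrelevant", "relevant"])
document = "unexpected side effects observed in the treatment group 169"
explanation = explainer.explain_instance(document, model.predict_proba, num_features=5)

# If a leftover markup token such as "169" ranks highly here, that is the cue
# to go back and fix the preprocessing, exactly as in Case 1.
for word, weight in explanation.as_list():
    print(f"{word:>12s}  {weight:+.3f}")

The same habit carries over to Case 2 and Case 3: put the explanation next to the prediction, and treat anything surprising in it as a preprocessing or data problem to chase down before shipping.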
https://towardsdatascience.com/explainable-ai-in-practice-6d82b77bf1a7
['Ayla Kangur']
2020-12-29 17:59:55.772000+00:00
['Machine Learning', 'Editors Pick', 'Model Interpretability', 'Artificial Intelligence', 'Explainable Ai']
Learning to Live with Impermanence in the Age of COVID-19
Had to change your plans lately? The escalation of the coronavirus pandemic and the spike in cases of COVID-19 is causing us all to revise our aims for the immediate future, against a background of global unease that is verging on global panic. Every day we learn of more government restrictions, stock market slumps and other scare stories. As we try to adapt to working from home, cancelling social engagements and the sight of health workers sealed in plastic, it’s not surprising that we’re getting a bit anxious about the future. So what can we do to keep our cool? Activities such as cooking, sewing and gardening can help, as well as sweeping, since these repetitive diversions help to soothe the mind. Yet to understand how we get caught up in the fear and hysteria, it’s worth taking a look at the three characteristics of Buddhism — suffering, non-self and impermanence — to help our brains cope with, and hopefully detach from, the onslaught of information that is often conflicting. A crash course in Buddhist concepts The notion that ‘all existence is suffering’ is anathema to hedonists, who feel that life can be fun too, and yet the idea is not as depressing as it sounds. If you substitute ‘suffering’ with ‘discomfort’, it’s easy to see how many of our actions are aimed at relieving discomfort, or avoiding suffering. Try this experiment: as you read on, keep a part of your mind aware of any shifting of your body — crossing your legs, folding your arms, leaning forwards or backwards. These instinctive movements have a simple purpose — to make us more comfortable and relaxed. When we break up a destructive relationship or quit a boring job, we are acting on the same instinct to avoid further suffering. ‘The self is an illusion’ is another tough concept to get our heads around in this selfie-saturated era. Most cultures bring kids up to take pride in who they are, to be socially popular and to show off their skills to the world. Yet much as we would like it to be so, our own self-image is never quite the same as others see us, and most people, including world leaders, have as many detractors as supporters, showing the elusive nature of the concept of self. If we could all be more selfless and humble, imagine how pleasant a place this world could be. ‘Impermanence’, or the notion that nothing lasts, should be an easier concept to grasp, since everything has a beginning and an end, like today. I write ‘should’, but the reality is that most people on this planet strive constantly to achieve stability, constancy and continuity, especially in their work and relationships. This flies in the face of the observable reality that all things and people that come into being, including you and I, also expire. A flower’s blossom, a flaring sunset, a new taste for the palate and a surprise birthday party are all ephemeral, though fortunately so are bodily wounds, job rejections and arguments with loved ones. A personal history of impermanence
https://medium.com/age-of-awareness/learning-to-live-with-impermanence-in-the-age-of-covid-19-d1c0d5a3e311
['Ron Emmons']
2020-04-19 06:58:00.934000+00:00
['Buddhism', 'Impermanence', 'Covid 19', 'Letting Go', 'Coronavirus']
Misconceptions of Black Magic and Jinn Possession (Part 1)
Black Magic! Jinn Possession! These are terms which spontaneously roll off people’s tongues when countering supernatural or unexplainable situations. Muslims (especially those from more traditional cultures) can be quick to attribute mental illness to Jinn possession. It is indeed possible for Jinn to possess humans, but it must be known that this is extremely rare. In the vast majority of cases, mental health illnesses are natural and not the result of Jinn. Therefore, we must avoid jumping to the conclusion that everyone with a mental illness is possessed. Certain mental illnesses such as depression and hysteria can influence our thinking patterns and can make us see the very worst in a situation and jump to conclusions. Muslims with depression often become caught up in the idea that they are possessed or someone has done black magic on them and end up neglecting methods of treatment which would cure their depression. Likewise, hysterical disorders exemplify physical symptoms based on psychological causes resulting in similar symptoms to possession e.g. a rape victim will develop paralysed legs for the fear of being raped. Jumping to conclusions about black magic and jinn possession will cause more trouble than good and waste time in seeking the right treatment that will actually help a person. It is best to rule out other possibilities before worrying about Sihr. You should always start from the basis that you are not possessed and that your illness is a natural phenomenon. If you start always from this foundation and pursue the relevant treatments then your illness will most certainly be cured. Being consumed with thoughts of possession is very distressing and will only serve to make you feel worse. The same approach that you would take with a physical illness is to be taken with mental health: approach it from both a medical and religious perspective. Do not abandon one for the other. This is the teaching of the Prophet Muhammad (peace be upon him) and of the Qur’an. The Propeht (peace be upon him) said: “Strive to pursue that which will benefit you and seek refuge with Allah, and do not feel helpless.” In order to understand and handle jinn possession, black magic or even evil eye, we must be aware of what they are and their realities. Only a handful of people such as ignorant philosophers, doctors or those who have no knowledge of jinns deny their existence. Majority of people believe in jinns but their beliefs vary depending on their culture, individual nature and level of knowledge. Many believe jinns can be seen and can take the form of animals such as snakes, dogs, cats etc. They can even take the shape of a human, though their feet remain in the form of a goat. Others claim that the jinn’s kryptonite is wolves. Apparently wolves have the power to overcome jinns and that jinns fear them. Jinns also have the ability to live for thousands of years and know the unseen. Some of these claims may be true whilst others may not be. These beliefs need to be weighed up against the Quran and Sunnah in order to identify what is acceptable and what is not. Firstly, we need to realise what Jinns are. Jinns are creatures of Allah created before mankind from smokeless flame of fire. “And the jinn: He created from a smokeless flame of fire.” (Qur’an 55:15) They have some characteristics similar to humans such as thinking, reflecting and reproducing, but due to their origins they are hidden from human sight. 
They have the ability to possess a human body, inflicting pain and suffering such as psychological and nervous diseases (insanity, depression, anxiety, tension, and epilepsy), personality disorder, physical sickness, hallucinations etc as the Prophet S.A.W said: “The Shaytaan flows through the son of Adam like his blood.” (Bukhari and Muslim) There were other cases to prove jinns can possess a human body. It is reported that the Prophet met a boy who was possessed. So the he began saying, “Come out, O enemy of Allah. Come out, O enemy of Allah” after which the boy healed and became normal (Ibn Majah no.3548). However, it is not easy for a jinn to possess a human body and cause physical harm because he will be exposing himself to punishment by means of the Quran. Even when they take form of other creatures, they are subjected to the same rules which apply to that creature. So if a jinn takes the form of a snake and you kill it, then he will be killed, as mentioned in several ahadith of the prophet in Bukhari and Muslim. For this reason jinns won’t harm humans unless they are positive that he/she is negligent in remembering Allah and is far away from deen. Still, we must remember that there aren’t any hard and fast rules about how or why jinns possess or harm humans and cases may differ from another. Some examples of why jinns may harm humans may be: simply because the jinn is bad and wants to wrongfully harm humans, for no purpose, just as normal foolish humans would. It could be (and there have been cases) that a jinn falls in love with a human and desires him/her so the jinn possesses him/her. Sometimes jinns would punish humans as a consequence of their actions i.e. a person may cause trouble to a jinn unintentionally by falling on him, throwing something at him, urinating on him etc, so the jinn punishes him more than he deserves. Witchcraft and sorcery are also common ways to manipulate jinns into afflicting harm on people, especially in our Asian community. There is no denying that regions such as India, Pakistan and Bangladesh dwell heavily in the art of black magic. So, what is black magic? In a nutshell, it is the use of negative energy and powers, summoning devils and shayateen (plural of shaytan) by evil minded people who are dedicated to harm or deprive others from living peaceful lives through sickening means. Black magicians would generally summon evil spirits by performing sacrificial rituals of humans or animals, depending on the severity of the magic. Once they acquire control over the spirit they are able to cause chaos in any aspect of people’s lives whether it is wealth, career, unexplainable health problems, family problems/tensions, mental illnesses, marriage problems, abnormal behaviour and in extreme cases; death. They experience an evil kind satisfaction at the expense of others hardship. Such horrid people are generally driven towards this sickening act due to jealousy, greed, selfishness, negativity and the inability to acknowledge other’s happiness and growth. This really is a huge problem in our society and unfortunately it has escalated in the last few years. The worst thing about it is that the main suspects are usually relatives, close friends, associates and even siblings (This is not always the case, but generally in Asian communities this is how it is). There are serious consequences for those who dwell in black magic such as coming out of the fold of Islam as the Prophet S.A.W mentioned:
https://medium.com/inspirited-minds/misconceptions-of-black-magic-and-jinn-possession-part-1-9d1c7c1899ec
['Inspirited Minds']
2015-12-07 21:57:20.548000+00:00
['Islam', 'Mental Health', 'Muslim']
The Importance of Suzy, and How I Found Out
The Importance of Suzy, and How I Found Out A short story Photo by Jessica Rockowitz on Unsplash I met Sam at an animal welfare conference. We sat at the same table over dinner and got talking. He was kind of shy at first, but after a few false starts, the conversation really began to flow. He was good looking in a rugged sort of way and when he smiled I was hooked. I can’t remember what we ate. All I can remember is the feeling I got being with him, and hoping he felt the same. I really wanted to get to know this man better. I had just come out of a difficult relationship and was trying to stand on my own two feet again and regain some confidence. My ego had been severely dented. This chance meeting with Sam gave me hope that there might be a good future for me after all. Over the next few months, we spent time together mostly at the weekends. Sam worked on his father’s farm and it wasn’t always easy for him to get time off, especially as we were approaching the harvest season. I wondered whether I should offer to help, but I thought I would be more of a hindrance as I didn’t have much of a clue about farming. I had once driven a tractor on my uncle’s property, but that was a long time ago. Though I didn’t want to get too involved too quickly, I could feel myself being drawn to Sam more and more. I looked forward to our dates more than I had ever done in relationships in the past. We never ran out of things to say, and we shared a lot of similar ideals. He was keen on changing farming practises to reduce carbon footprint and had already persuaded his father to set up wildlife corridors across their land with improved hedging. It wasn’t easy to change the habits of a lifetime, but with gentle persistence, Sam was making significant progress. Inevitably, the day came when Sam wanted me to meet is parents. He had already told me that his father was a lot older than his mother, having married at the age of 45. I wasn’t sure if I was looking forward to sitting around the table and having Sunday lunch with them, but I guess it had to happen sometime. The house looked quite impressive from the outside, but as we got closer I could see that the window frames needed painting. The big front door was made of oak, and according to Sam, it hadn’t been opened for years and was stuck firm. Sam brought me in through the back door which led straight into the kitchen. I saw a tall slim woman by a big stove on which several pans were steaming. She turned as we entered. “There you are”, she said with a smile. “I’ve been looking forward to meeting you, Rachel. We’ve heard a lot about you”. “It’s good of you to have me round for lunch. Sam talks a lot about your great cooking!” “Sam, why don’t you show Rachel into the sitting room to meet Jack, while I finish off here. We’ll be eating in about 20 minutes”. “Sure, Mum”, he answered and took me by the hand and led me down the corridor and through another door. There was a blazing fire in the grate, even though it really wasn’t cold. Jack was standing by the mantle piece. He was a stocky man with red cheeks and bushy white hair, but I could immediately see the likeness to Sam when he smiled at me. “Rachel, you look just as I imagined you would! Sam described you very well. About time we got to meet the girl in his life!” “Dad! Stop embarrassing me”, Sam said. Jack came toward me and shook my hand vigorously. “What would you like to drink, Rachel? We usually have a glass of sherry before lunch on Sundays, but we’ve got some fruit juice if you prefer”. 
I thought a sherry might calm my nerves, so I accepted gladly. Soon we were chatting about the plans for stocking the lake with trout, another of Sam’s new ventures. Called to lunch, we took our seats at the big kitchen table. “I hope you don’t mind, but we don’t use the dining room generally. It’s much warmer and cosier here and I don’t have to carry the food all the way down the corridor”, Sam’s mother Margaret said. “It’s lovely here”, I said, and I meant it. I was beginning to feel quite relaxed and I was enjoying the traditional roast dinner. Curiously, on a couple of occasions, I noticed Jack and Margaret exchanging glances over Sam’s head. The first time it happened, I had been talking about my childhood and my school days. Oddly, they had asked me if I had sung in the school choir. When I said yes, that was when I noticed that strange glance, as if they were confirming something which they had privately discussed earlier. As Sam drove me home that afternoon, I wondered if I should ask him what was going on, but thought better of it. It was quite a big event for him, taking me to see his parents for the first time. “Sam,” I said, “they are really nice people, and they seemed genuinely pleased to see me.” “I knew they would like you”, he grinned. As the days began to shorten and winter approached, Sam was less busy and we saw each other more frequently. Sunday lunches were either at his parents’ house or in the pub down the road. I met up with some of his friends from agricultural college. One lad, John, who Sam hadn’t seen for about six years, kept staring at me, which was very disconcerting. It was soon after that I heard about Suzy for the first time. Sam told me they had met at nursery and were best friends all through school. “So where is she now?” I asked, thinking she must have moved away or we would have come across her in the village. “She had a bad accident last year. She fell off her horse and she went away for some specialist treatment. She should be back soon and we’ll definitely arrange to meet up”, he said.
https://medium.com/illumination/the-importance-of-suzy-and-how-i-found-out-7b43a75e0636
[]
2020-12-27 20:04:24.363000+00:00
['Creative Writing', 'Illumination', 'Fiction', 'Short Story', 'Writing']
10 Best Literary Villains You’ll Love to Hate And Hate to Fall For
10 Best Literary Villains You’ll Love to Hate And Hate to Fall For Villains are my weakness…literally and figuratively. Photo by JJ Jordan from Pexels I have noticed a strange thing about myself: while reading a long book series, I don’t particularly love the main characters. The story spends so much time with them that they feel slightly boring. So I find myself waiting for someone else to show up in the story, and most of the time that someone is the bad guy. If written well, the baddies have the power to make us feel more than the good guys. The villains make us support the main characters; we want to cheer them on just because we don’t want the bad guys to succeed. For example, being inside Harry Potter’s head felt somewhat tiring at times, and I impatiently waited for Draco to come and throw an insult. You won’t see me admit it in broad daylight, but I liked reading about Malfoy’s character arc more than Harry’s. Similarly, so many times I find myself amazed by a villain’s backstory, disgusted by their character, and grudgingly respectful of their cunning. So, here is a list of perfect villainous characters in literature you would hate with a passion, or secretly love despite yourself.
https://medium.com/books-are-our-superpower/10-best-literary-villains-youll-love-to-hate-and-hate-to-fall-for-8268e4657205
['Nusrat Nisa']
2020-11-05 00:09:49.821000+00:00
['Book Recommendations', 'Villains', 'Books', 'Personality', 'Reading']
Biden’s Inauguration Should Make a Statement
Biden’s Inauguration Should Make a Statement The pandemic is an opportunity to mark the arrival of a new era DoD photo by Senior Master Sgt. Thomas Meneguin, U.S. Air Force, Public domain, via Wikimedia Commons Inauguration Day 2021 will mark the start of a new era, one with Joe Biden leading us through the pandemic and economic recovery. The road ahead will be difficult, so what better way to start the journey than by making a statement through how the inauguration is celebrated. The two most important parts of the statement are unambiguously recognizing the seriousness of the pandemic and signaling that the response to it, and to the economic havoc it has caused, will be massive, compassionate, and science-based. Unique Circumstances Unless you are Donald Trump, the pandemic has changed your life. You are socially distancing and fully aware that reckless conduct on your part may infect and kill others. Thus, you live mainly as Joe Biden did during the presidential campaign, avoiding crowds, wearing a mask, and encouraging others to do the same. The inauguration should follow this pattern by being more of a virtual event than a live one. The massive crowds of people, anxious to celebrate the end of Trump, must be discouraged from coming to Washington and risking a Trump-like super-spreader event. That can be done, but only if Joe Biden and his team craft an inaugural that respects tradition but offers attendees something extraordinary: the message that the new president will lead us through the current crisis. The Inauguration Day Religious Service Traditionally, Inauguration Day starts with the President-elect attending a religious service. Usually, these are held at St. John’s Episcopal Church, the church that served as a prop for Trump last summer when federal troops cleared away protesters so he could hold an upside-down Bible in front of it. That sad incident makes St. John’s an attractive choice for the service, but Biden is a Roman Catholic. If the new president wants to send a message that he is honest, he will attend a mass at St. Matthew’s Cathedral in downtown Washington. The mass should be sparsely attended, with only the Bidens and the Harrises attending with their immediate families and select other invitees. Social distancing will be practiced. The mass should be broadcast so others can participate as virtual attendees. Coffee at the White House Traditionally, the outgoing and incoming Presidents meet at the White House after the religious ceremony before they ride together to the Capitol for the swearing-in. This might not happen this year. What if Trump has not yet conceded the election? What if he is continuing to mock Biden? And what if, as some predict, he has already left town for good and is watching the ceremonies on several TVs at Mar-a-Lago? The traditional White House coffee is a private event. Given that it has no real role in a virtual inauguration and the relationship between Trump and Biden, this tradition should be shelved for this inauguration. If Trump invites Biden to coffee, this decision should be reconsidered. Don’t hold your breath. Trip to the Capitol Traditionally the president and president-elect ride together to the Capitol for the swearing-in. Given Trump’s reckless behavior and refusal to wear a mask, the new president should not share a ride with Trump. 
I’d rather see Biden in an Uber than riding with Trump in the armored limousine informally known as “the beast.” Swearing-in Ceremony Although it would be uplifting to see Joe Biden sworn in at the Capitol in a socially distanced ceremony, this will be impossible. Even if the event is officially closed to the public, thousands of people will want to get a glimpse of the new president. To minimize the risk of new infections, the swearing-in should be held indoors at the Capitol with only a small group of people involved. Television will allow us to witness the swearing-in and hear President Biden’s inauguration speech. Traditionally, the outgoing president attends the swearing-in. It’s anyone’s guess whether Trump will want to suffer the utter humiliation of watching “Sleepy Joe” take the oath of office. Farewell Ceremony for the Outgoing President Should now-former President Trump be at the Capitol, tradition would have the new President join the Trumps for a farewell ceremony that includes the Trumps leaving Washington in a helicopter. Because Trump’s departure will be such a happy event, this part of the inauguration ceremonies should occur. All our spirits will rise as we watch Trump fly away. Luncheon with the Leadership If the new administration is to be successful, it must work with Republicans. Thus, the traditional post-swearing-in luncheon with congressional leadership would seem to be necessary. What better way to get things off to a good start? But the lunch would mean breaking bread with legislators who have doubted Biden’s election and who otherwise remained loyal to Trump long after it was clear that he had lost. Add to this the risks inherent in such a gathering. Is giving the appearance of bipartisanship worth the infection risk involved? And does Biden want to eat with Lindsey Graham, Ted Cruz, and McConnell? Maybe this lunch could be pared down to a socially distanced lunch with the top two Democrats and Republicans in the House and Senate. Better yet, why not postpone the lunch for a few weeks and then restructure it as a working lunch, appropriately socially distanced, during which President Biden could discuss the major pandemic legislation he is sending to Congress? A National Town Hall Meeting with the Biden-Harris Team If the luncheon is postponed, perhaps Joe and Jill Biden could catch a quick bite to eat and then move on to a new event that could prove to be the highlight of the day — a national town hall meeting on the four top priorities of the new administration: the pandemic, the economic crisis, climate change, and racial justice. Ideally, the event would be no longer than 90 minutes. The president would make brief opening comments followed by a presentation on the state of the pandemic by several scientists currently serving on the Biden Coronavirus Transition Team. Members of Biden’s economic team would then give a similar overview of the economic crisis and the administration’s plans to “build back better.” This part of the program would be followed by a discussion of climate change and the steps necessary to address it. President Biden could sign an executive order rejoining the Paris Climate Accord at the end of this part of the program. The final part of the meeting would be on racial justice with a discussion led by Vice President Harris. The president would acknowledge the seriousness of the issue and outline his action plan to address it. 
The public would be invited to send questions to the new administration or recommendations for policies to be considered. A replay of the session would be posted on social media. An Evening of Virtual Celebration Traditional events such as a review of military troops and the inauguration day parade would be postponed to a later, post-pandemic, date. Ideally, Biden would hold a short private meeting with military leaders and then rest up for the night’s celebrations. The evening could consist of an all-star concert staged at the Kennedy Center. Major musical celebrities could appear, each performing a song or two. A prominent film star could serve as master of ceremonies. The audience would be kept safely small and socially distanced. The concert would be broadcast to whatever networks wanted to carry it. A Presidential Thank You and Good Night At around 9 p.m., the evening would close with the president giving a short address, thanking fellow Americans for their support, and reiterating his message of hope. With luck, by the end of the day, nobody would have contracted the coronavirus as a result of the inauguration, and all of us would have a better understanding of the administration’s priorities for its first 100 days in office.
https://medium.com/politically-speaking/bidens-inauguration-should-make-a-statement-3425c47e506
['John Dean']
2020-11-14 20:42:45.169000+00:00
['Politics', 'Policy', 'President', 'Coronavirus', 'Biden']
Hiring the Right Type for the Task
by Tim Gouw on Unsplash Imagine you’re a highly-focused software engineer. Detail-oriented, concerned primarily with getting the job done right. At times, others might be a little frustrated with your cautious approach, but the end product is invariably of high quality. You can’t stand interruptions, noise, and useless chatter. Then somebody from sales comes over and sits down right next to your cubicle. Not only are they loud, but they don’t know (and apparently don’t seem to care) that their stories are disturbing the entire work group. As the noise and the laughter pierce your brain, you hunker down even more, ignoring the sales guy. It’s nearly impossible to concentrate, and as a result you may make mistakes. For you that’s a major problem. That’s completely unacceptable. The sales guy doesn’t care. He’s still trying to get you to listen to what happened this past weekend. It’s hilarious!!! No. It isn’t. It’s invasive, annoying, and affecting your work quality. But you continue to do your level best to concentrate over the guy’s noise. If you have to tolerate this too much more, you’re going to contact your recruiter. Photo by rawpixel on Unsplash The sales guy, on the other hand, is going out of his mind. Stuck in a back office with menial, boring, repetitive tasks, he’s dying for interaction. If I have to go back there and spend another eight hours on reports, I’m going to quit tomorrow, he thinks. He’s a top performer, but his boss has been handing over maddeningly mindless responsibilities that have nothing to do with sales, at least in his opinion. I’m wasting my time here. Two offices down along the edge of the building, an employee is having a tiff with one of the senior managers. “You need to plan the company picnic,” the manager says. “It’s key to morale. I’ll give you last year’s program, and you can use that as a guideline. Make sure you talk to everyone about what kinds of food they want and their favorite activities. We want everyone involved.” The young man balks. “I have deadlines,” he argues. “I don’t have the time to go around and talk to every single person in the department. I won’t make my goals. That’s going to make me look bad.” “This is for the entire department,” the manager argues back. “If people aren’t happy, they aren’t going to work hard. You need to think about the larger needs of the company.” The young man scowls, takes the paperwork and resentfully leaves her office. This ridiculous chore is an incredible waste of time, he thinks. She doesn’t realize how important my work is to the department. The manager closes the door of her office. I hate this job, she thinks. Two months previously she was promoted out of her job as a team member, which she loved. Surrounded by people she knew well and respected, she now operates solo. She’s four states away from her friends. Isolated in her office all day, she misses the comfort, safety and routine of her team responsibilities. She’s already thinking about putting her resume together. Photo by bruce mars on Unsplash Hire Smart and Manage Well = High Productivity Every day in companies everywhere, people end up angry, or they leave companies because they are placed in the wrong position for their natural gifts. When we hire people, part of our responsibility is to ensure that our employees land where they will bloom. That’s how we get their best work. The famed psychologist Carl Jung developed what would become the basis for personality archetypes. 
Those basic archetypes inform the many versions of styles training that we see today: DISC, Social Styles, Myers-Briggs and many others. The London-based training company Speak First (https://www.speak-first.com/) divides those archetypes into animals: Owl, Lion, Monkey and Horse, which makes remembering each style much easier. Being able to recognize each style, understand their work preferences and place them in conditions that allow them to maximize their best preferences is part of a manager’s job. That begins with hiring appropriately for each position, and understanding the work preferences of each style. A bad fit means poor work, resentment and guaranteed turnover. Photo by Stefan Steinbauer on Unsplash For example, in the situations above, our software engineer is an Owl: detail oriented, works best alone where he can concentrate on the minutiae, quiet, focused, and superb at ensuring that everything is done right. Loud, talkative people are a bane to his existence. The salesman, on the other hand, is a Monkey. Energetic, enthusiastic, superb at connecting with people, they are perfect for sales, public relations, marketing. They are the company party, and they have a very high need for applause and recognition. Isolating them in a back room with menial tasks drives them insane. Photo by rawpixel on Unsplash Our manager above is a Horse, concerned with the welfare and happiness of her entire department. Her desire to have a company picnic is part of how she ensures a welcoming and happy atmosphere for her group. A great listener, she’s primarily interested in whether people are happy at work. When she was promoted to manager and moved away from her beloved team, it wasn’t a gift. If anything, she misses her “herd” more every day. She would like her new job a lot better if it were designed in such a way to allow her far more interaction rather than being stuck in her office most of the time. Photo by Marius Ciocirlan on Unsplash The young man our manager has assigned the picnic to is a Lion. With a strong ego and need to achieve, he is impatient about others’ needs when they don’t coincide with his own. Motivated by getting his goals accomplished, the time-consuming process of engaging everyone else can be a burden. He feels out of control of his own destiny when asked to do tasks that he believes suck time away from achieving his goals. Focused far more on the task at hand, people’s feelings and emotions sometimes get in the way, as far as he’s concerned. He plans to run the whole department one day. Each of these people has particular likes, dislikes and strong tendencies. When folks like the Owl and Lion get stuck having to deal too much with people and feelings, their work satisfaction can plummet. Horses and Monkeys feed off others’ energy and happiness, but for different reasons. Understanding what these style archetypes prefer is a huge part of the hiring process. When we develop a job description, the duties will tell us what kind of person will thrive in that position. When we’re interviewing we’re looking for like tendencies. The same thing happens when we move, promote, and place people in their work stations. A perfect example of a classic promotion mistake is to put a superb salesperson in charge of a sales team. As someone who has trained sales for decades, I see this mistake all the time. A terrific individual contributor, your salesperson loved her job, her achievements, her perks and all the applause she got for nailing the whale clients. 
Suddenly now she’s a babysitter for ten salespeople. This is not her primary skill set. As a born storyteller, she isn’t all that interested in their issues, problems and stories. She wants to tell hers around the donut box just like she used to. Now she’s stuck doing administrative duties which she despises, and she’s no longer in the field selling, which is her greatest gift to the company. She loves the title, hates the job. by Asa Rodger on Unsplash Archetypes are guidelines. None of us is locked in stone, and one of the demands of good employees is to learn to stretch and adapt. However, if that employee is simply not in the right position, no amount of coaching is going to help them make fundamental changes to who they are. To wit: as someone who is a combination Monkey/Horse, I thrive on audiences and relationships. The Army, in all its wisdom, assigned me to be an administration officer shortly after I graduated from Officer Candidate School. I was isolated in a small dusty office, and given reams of paper to review and assess. I went out of my mind in a week and failed horribly at the job. I did my level best, but I was miserably mis-assigned. My commanding officer wasn’t pleased, but I got moved to the television studio, where I flourished and succeeded. If you’ve got people whose performance has just plummeted, or have noticed that your new hire isn’t adjusting, it may well be that they’re not a fit for the position. Or, by promoting someone into a new role, you might have taken them out of the perfect spot and into a new role that they are neither suited for nor enjoy. This is how we lose good people. The best managers take the time to consider personality archetypes when they hire, and they also ensure that the working environment allows each of those styles to maximize their gifts. This way our people feel valued, are able to put their best skills to work, and everyone wins.
https://jhubbel.medium.com/hiring-the-right-type-for-the-task-679f7192f2ef
['Julia E Hubbel']
2018-07-10 13:56:27.268000+00:00
['Work', 'Management', 'Employee Engagement', 'Productivity', 'Hiring']
Annual Report: Year 2 of Pictal Health
Last year I put together an annual report of progress and pains related to starting a company. Now, a year later, what’s changed? (Apart from the global pandemic that has upended life as we knew it.) Accomplishments I’m now working with my 54th health history visualization client; I’ve had 32 paying clients, and 22 pro bono clients. That adds up to at least 550 hours of direct client work, probably more. I’ve learned a ton in that time, and I try to share insights continually here on Medium. Last summer I worked with 10 veterans with complex health issues as part of a pilot project with the VA. It was a lower-technology process, and I loved getting to know the veterans. I wrote this article up to summarize my experience: How visual health histories can help military veterans. Pictal Health was technically profitable in 2019. The numbers were small, but hey, it’s something! I went through the painfully detailed process of applying for an SBIR grant through the NIH — this is a government grant for small businesses. I’m currently awaiting a decision. During this pandemic, I’ve been taking time to get into detailed design and prototyping for Pictal App — a way for people to compile their own health history timelines. I worked with Ron Herman at TeamFound to compile requirements and documentation for Pictal App and interview many potential development firms around the world. Anne Miller helped me compile a giant ‘pro forma’ financial projections spreadsheet. It helped me think more critically about business models, the resources I would need to grow, and what kind of company Pictal Health might become. Challenges Working alone continues to be one of my biggest challenges. Here is what I wrote last year: “In the past, I’ve felt most alive and happy when I am collaborating with a good team and doing work at which I excel. Being isolated from other people and from the work of being a designer has been my biggest personal challenge and has resulted in extreme emotional lows at certain points.” I’ve had some helpful collaborations along the way, but it’s no replacement for working with a team. And, since I get a lot of energy and motivation from working with others, I find my energy flagging in isolation. 
Business model: finding the right business model is an utter slog, especially in healthcare, where the people who pay for a product are often not the ones who enjoy its benefits. Burnout: Especially in the last year, I’ve found myself burned out on aspects of this work, and at times I’ve needed to take a step back and focus on something else for a few days or weeks. It happens. I try to give myself the distance and then come back to the work with new energy. Le finances: it’s been a personal struggle to continue having low income and relying so much on my husband to cover our basic costs of living. I also need money to build and launch Pictal App; I may try to raise money, once I find out about the grant. In the meantime, I’m looking for interesting side work as a user experience designer. A few needs Team: to be successful, I need to stop working alone. I’m starting to look around for folks with complementary skills who might be cofounders or early employees. Skills and attributes I am looking for: marketing, sales, good listening, business strategy, partnerships, integrity, hiring and management, healthcare strategy; also a separate person who has more of a technical background and can help build and guide the product. Evidence: I applied for Mass Challenge a few months ago and was not accepted, and the most common feedback I got was that I needed to gather more evidence about the impact Pictal Health was having and its usefulness to doctors. I am involved with two (possibly three) papers being submitted to academic journals that will help establish health history visualization as a valuable practice. In the future, I’d need to do a more formal clinical study to measure specific outcomes of this work. Energy: I can’t overstate the importance of finding a source of ongoing energy. Working with clients often gives me energy and purpose, so I might try to find ways to work with more of them. Creative pursuits, mentorship, and social connection also give me energy, so I will find ways to keep up with those things. 
I am grateful for:
My husband, who has supported us the last two years financially (and who is a huge emotional support for me)
My family and close friends, thus far all safe and healthy
My clients, who have taught me so much
Collaborators: this year, notably Ron Herman (technology) and Anne Miller (tech commercialization and spreadsheeting)
All the great people who have donated time and advice to me in the last year or two, notably Geoff Strawbridge (Marketing), Sam Roach-Gerber and David Bradbury at VCET, Gwen Pokalo at Center for Women and Enterprise, Bonnie Pratt and the women of the CWE Trust.IT group, Sameer Sood, lots of advisors and mentors through the LaunchVT program, and all of the rest of you out there, you know who you are
My monthly infusions of immunoglobulin, which keep me strong
Living in Vermont and being able to get outside & eat local food during the pandemic
Dark chocolate, still available during these lean times
One more thing Black lives matter, and I and Pictal Health stand with the brave protesters in our country and around the world who are fighting for racial justice and speaking out against racism, police brutality, and the murders of George Floyd, Breonna Taylor, Ahmaud Arbery, and countless others. I wrote a post over at the Pictal Health website about more actions I’m taking, if you’re interested.
https://medium.com/pictal-health/annual-report-year-2-of-pictal-health-a8de7de13e2
['Katie Mccurdy']
2020-06-16 16:25:41.596000+00:00
['Startup', 'Healthcare']
Ground control to major policy makers: Unlocking user-centric AI’s potential for development — Part 1
While a human can generally classify which parts of these images are green space, water, or even augmented structures, the sheer amount of imagery makes it impossible for human analysis to scale without algorithmic assistance. Artificial intelligence (AI) algorithms can help automate this process, making it more efficient. Decision-support platforms can connect stakeholders with the outputs of AI imagery analysis to help them understand the land cover composition of urban areas and where land use changes have occurred. AI Cooking While generating satellite image classifications with AI is both art and science, for the end user, it boils down to a dish made of three main ingredients: satellite imagery, labeled “ground truth” data (i.e., “labelling” the class of interest), and AI algorithms. The question is then whether the results are fit for purpose. Do they provide some useful insights at an acceptable level of accuracy and resolution for the user on the ground? These considerations change the required ingredients. In tech speak, user-centric means getting something that ultimately addresses an issue or process that works for an actual set of people, potentially with limited technical specialization or just basic digital literacy. The use case refers to the insight or decision that users can actually apply the tool to in practice. For example, if the particular use case demands discerning whether solar panels are on roofs, higher resolution imagery will be needed than if the objective is to get an overall sense of green spaces. But putting this tool directly in the hands of users is likely to stimulate creativity around issues that matter to people on the ground! Satellite imagery from different satellites can vary by spatial resolution (e.g., some satellites capture 30 cm resolution images, far finer than the Landsat images shown above), spectral resolution, or which bands of light are captured by the satellite (e.g. red, green, blue, and other non-visible bands such as infra-red), and temporal resolution (e.g. how often a satellite revisits a certain location). Labeled “ground truth” data refers to points/pixels that have been correctly classified into a given scheme by human analysts. Finally, different AI algorithms must be trained to perform the desired classification task. The fact that many PhDs are now being earned based on which models are most fit for purpose for any given issue suggests that most of us will be looking to get what works off the shelf. Find out in part 2 of our blog how satellite technology and data can be used by real people for real-life applications. The image below is a land cover classification available online from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS), with each pixel covering an area of 500 m; it shows urban concentration in 2011 not just within Ho Chi Minh City’s boundaries but stretching into the Mekong delta. User-centric tools can now help most everyone derive granular and timely insights online.
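To make the “three ingredients” idea concrete, here is a deliberately tiny sketch of how imagery, ground truth, and an algorithm fit together. The band values, class names, and the 1-nearest-neighbour classifier are purely illustrative — this is not a real remote-sensing workflow or any production pipeline.

```javascript
// Each "pixel" is [red, nearInfrared] reflectance; the labeled examples play
// the role of the ground-truth ingredient. Toy numbers only.
const labeledExamples = [
  { bands: [0.05, 0.60], label: 'green space' }, // vegetation reflects NIR strongly
  { bands: [0.04, 0.02], label: 'water' },       // water absorbs NIR
  { bands: [0.30, 0.25], label: 'built-up' },    // bright in red, moderate NIR
];

// The "AI algorithm" ingredient: classify each pixel by its nearest labeled example.
function classifyPixel(pixel) {
  let best = null;
  let bestDist = Infinity;
  for (const ex of labeledExamples) {
    const dist = Math.hypot(pixel[0] - ex.bands[0], pixel[1] - ex.bands[1]);
    if (dist < bestDist) {
      bestDist = dist;
      best = ex.label;
    }
  }
  return best;
}

// The "imagery" ingredient: a tiny 4-pixel scene.
const image = [
  [0.06, 0.55],
  [0.05, 0.03],
  [0.28, 0.22],
  [0.07, 0.58],
];
console.log(image.map(classifyPixel));
// [ 'green space', 'water', 'built-up', 'green space' ]
```

Swapping in better imagery, more labels, or a stronger model changes the accuracy of the map, but the recipe — and the question of whether the output is fit for the user’s purpose — stays the same.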
https://medium.com/world-of-opportunity/ground-control-to-major-policy-makers-unlocking-user-centric-ais-potential-for-development-992e174bd528
['World Bank']
2020-01-13 19:46:53.238000+00:00
['AI', 'Technology', 'Governance', 'Satellite Technology', 'Data']
Things You didn’t Know About JavaScript Types
Things You didn’t Know About JavaScript Types Oskar Follow Jul 11 · 6 min read Photo by JESHOOTS.COM on Unsplash In the world of transpilers it’s hard to look back and return to the fundamentals. TypeScript and Babel eliminated so many problems that JS devs had daily. Types and equality rules of JavaScript were always confusing, and now many of those are abstracted away so you don’t have to worry about them, but that doesn’t mean that they completely disappeared. In my experience, the best way to be great at something is always to start from the fundamentals and build up. JavaScript has two types of values; there are different notations out there, but let’s call them Primitive and Complex. The biggest distinction between them is that primitives are always immutable, while complex values are mutable. There is a popular myth in JS that everything is an object. We’ll explain primitives, see how they behave and why they are not objects. You can always check the type of any value using the typeof operator. Primitives There are seven primitive types in JS, they are: Number BigInt String Boolean Symbol null undefined You are probably familiar with, and use daily, Number, String and Boolean. These three are really self-explanatory; the interesting thing here is that immutability we mentioned. Let’s consider this piece of code: let name = 'Oskar'; console.log(name[2]); // prints: 'k' // let's try to set name[2] to 'c' name[2] = 'c'; // the code above ran without errors so let's print name again console.log(name); // prints: Oskar // whoah name didn't change from original assignment Look at the example above: even though you can run code like that without errors, String is a primitive value and primitives are immutable, hence you can’t change name[2] = 'c' and the name variable will point to the same value it was given in the last assignment. You can prohibit these assignments ( name[2] = 'x' ), since they have no effect anyway, and many other sloppy things in JS, by using strict mode. Note that you can always reassign variables; doing this name = 'Oscar' will always work. You can always reassign if you need to, but you can’t mutate the value itself, at least not for any of the primitive values. BigInt JavaScript is using floating point math which means that you can’t count on having every number in the world available to you. JS will always try to provide a number that’s pretty close to the number you want, but there are no guarantees that you will get the exact value you expect. Probably the most famous example of this is: let num = 0.1 + 0.2; console.log(num === 0.3); // prints: false What!!! (0.1 + 0.2) === 0.3 is false!!! That’s right, open your browser console and try it out. In the JS floating point universe 0.3 doesn’t exist, so if you add the two numbers above the result will be: console.log(0.1 + 0.2); // prints: 0.30000000000000004 That value (0.30000000000000004) is the closest thing that exists in the JS universe to 0.3, hence that’s what is returned when you do 0.1 + 0.2
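To tie the two threads together, here is a small sketch you can paste into a browser console or Node REPL. It uses only standard JavaScript; the approximatelyEqual helper is just an illustrative name for a common float-comparison idiom, not something the language provides.

```javascript
// typeof for each primitive mentioned above
console.log(typeof 42);          // "number"
console.log(typeof 42n);         // "bigint"  (BigInt literal)
console.log(typeof 'Oskar');     // "string"
console.log(typeof true);        // "boolean"
console.log(typeof Symbol('x')); // "symbol"
console.log(typeof undefined);   // "undefined"
console.log(typeof null);        // "object"  (a long-standing quirk of the language)

// BigInt stays exact past Number.MAX_SAFE_INTEGER, where Number starts rounding
console.log(Number.MAX_SAFE_INTEGER + 2);   // 9007199254740992 (off by one)
console.log(9007199254740991n + 2n);        // 9007199254740993n (exact)

// A common way to compare floats despite 0.1 + 0.2 !== 0.3
const approximatelyEqual = (a, b, epsilon = Number.EPSILON) =>
  Math.abs(a - b) < epsilon;
console.log(approximatelyEqual(0.1 + 0.2, 0.3)); // true
```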
https://medium.com/swlh/learn-the-basics-javascript-types-8854a2d12e29
[]
2020-07-14 10:29:55.541000+00:00
['Programming', 'JavaScript', 'React', 'Software Development', 'Web Development']
How I Learned to Live 2 Days in One
What I Should Truly Feel Jealous Of I felt jealous of achievers’ success. But I conveniently ignored their efforts. They sleep less, they make numerous sacrifices, they show up each day to do the same thing for years. On one side, there were achievers, pushing themselves to do what they had to. On the other side, there was me, complaining that I couldn’t find time after work, that the daily commute was exhausting, my boss was a jackass, my colleagues were mean, the work wasn’t challenging enough — the odds were stacked against me. But the truth is, achievers didn’t have a better life than mine. In fact, many of them faced far worse odds. What I should’ve turned green with envy over was their effort and disciplined attitude towards work. How did (do) they find the time to put in the effort to reach where they are? The Secret of Compounding Results Imagine spending your hard-earned money on things you want but don’t need. Soon, you’ll get bored of them. Then you’ll spend more money on other things which’ll hold your ephemeral attention. And the cycle goes on. Now imagine investing the same money in a portfolio which gives you consistent returns over the next ten years. Which option will leave you better off after ten revolutions around the sun? Which option will make your money grow? Since ‘time = money’ is a common equation, let’s now substitute money with time. Most people spend their work hours — 10 am to 7 pm — on… well… work. They spend their remaining time — 7 pm to 10 am — preparing themselves for work. Preparing includes ‘mental relaxation’ like watching Netflix, scrolling through Instagram, checking their notifications, or all of these. Or, because they can work 24/7, they spend this time working too (or thinking about it). Weekends are even worse. They wake up late, binge-watch the latest seasons, wait in line to eat at crowded places just to post photos on Instagram… they might complete chores and meet friends on Saturday. But by Sunday, they’re filled with dread about a day which is still over twelve hours away. No wonder Sunday afternoon is one of the most unproductive times for most people. Don’t believe me? Just scroll through this hashtag on Twitter. In his seminal book Flow, researcher Mihaly Csikszentmihalyi (Me-hi-ee Cheek-sent-me-hi-ee) wrote, “If left to their own devices and genetic programming, most people do stuff like worry about things or watch television.” But achievers are different. (Of course they are!) They optimize their time at work. But they don’t let ‘work’ or ‘life’ get in the way of their personal goals. For them, time outside work is worth its weight in gold. They make it a rule to invest time on what’s important to them every day — self improvement. This, when compounded over time, magnifies into large proportions. When I compared my routine to theirs, I found why the ‘fame I deserved didn’t come to me.’ Outside work, I spent mind-numbing hours watching television. I kept watching reruns of cricket matches until I learned them by heart. If I didn’t, I felt a void. I may not be fond of Netflix or Game of Thrones, but felt like I was married to the TV. The result? I never found time to do what I thought I should’ve been doing. For most people, their work day is their ‘day’. Their routines revolve around the eight hours of 10 am to 7 pm (30 percent). And when this day is over, they’re left with nothing in the tank. 
Their mind refuses to engage in a meaningful pursuit and instead, rushes into the welcoming arms of Jadis the White Witch — instant gratification. If you surrender 70 percent of your day to the 30 percent, how will you ever get close to your goals? Okay. Enough admonition. We’re done talking about the illness. Now let’s focus on the remedy. Remember how you went out to play after school as a child. Play was not ‘constructive.’ (Well, technically it was, but what did you know then?) Yet, it refreshed your mind and let you focus on your studies afterwards. Consider the time after work your play time. Here’s how I use it, and so can you.
https://medium.com/the-ascent/how-i-learned-to-live-2-days-in-one-4e928f18fe3f
['Vishal Kataria']
2018-04-11 18:24:35.489000+00:00
['Work', 'Lifehacks', 'Productivity', 'The Ascent', 'Self Improvement']
Dissociation & Trauma in David Lynch’s ‘Lost Highway’
“And your name? What the fuck is your name?” But what happens next is a drastic reconfiguration of Fred’s personality, and further detachment from reality, that tellingly comes when Fred is to become fully accountable for his actions. While on death row, Fred has a breakdown in his cell, screaming and clutching his head while he has a vision of a burning cabin, before seeing the same dark highway from the opening, which here feels analogous to the River Lethe of Greek myth; translating literally as “forgetfulness” or “concealment”, the dead were said to travel the Lethe and drink its waters to erase their memories of Earthly life, on the way to reincarnation. This metaphor seems apt when, just before the vision ends, he sees a soon-to-be familiar young man at the roadside. The next morning when the guard checks on Fred, that young man, Pete Dayton, has appeared in Fred’s place. The police are baffled and, with no explanation of how he got there, they release him. The life Pete leads represents a defensive shift of perspective on Fred’s part, with all the worst elements of his own personality projected onto others. Where Fred was a middle-aged jazz musician living a largely isolated life, Pete is a blue-collar worker with many friends and co-workers, a loving family and a sexually voracious girlfriend. He casts himself as the innocent, and everyone around him as dangerous or troublesome, his life becoming a series of noir cliches that expose it as a fantasy. His worst personality traits — his volatility, aggression and jealousy — the root of his irreconcilable issues, are divorced from him entirely, becoming embodied in the psychotic gangster Mr Eddy; an absurd encounter with a tailgater shows just how hair-trigger his temper can be, and the below exchange exposes the depths of Fred’s jealousy and possessiveness: Mr. Eddy: How you doin’ Pete? Pete Dayton: Okay. Mr. Eddy: I’m sure you noticed that girl that was with me the other day, good lookin’ blonde? She stayed in the car? Her name is Alice. I swear I love that girl to death. If I ever find out that somebody was making out with her, I’d take this… [he pulls out a pistol] …and shove it so far up his ass it would come out of his mouth. Then you know what I’d do Pete Dayton: What? Mr. Eddy: I’d blow his fuckin’ brains out. Most disturbing is the way Fred recasts Renee as blonde femme fatale Alice, the sultry girlfriend of Mr Eddy, who starts an affair with Pete and draws him into a tangled mess of criminality and violence. He goes to great lengths to cast her as someone dangerous, whose main traits are dishonesty and adultery, the things Fred perceives as Renee’s worst flaws. Yet the more we learn about Renee/Alice, the more we understand how vulnerable and exploited she is. The story she tells of when she first arrived in town — about a job her friend Andy told her to audition for — is a nightmare vision of the kind of casting couch abuse stories that have become depressingly prevalent. She was forced to strip for Mr Eddy (whose actual name is Dick Laurent) at gunpoint while a room full of men gazes at her, and then coerced into a sexual relationship with him. In response to this, Pete only thinks of his own fragile ego, resentfully saying, “So, you liked it, huh?”, to the clearly distressed Alice. Once Pete becomes more comfortable in his fantasy, a familiar face returns once again. After making plans with Alice to rob her pimp friend Andy and run away together, Pete receives a call from Mr. Eddy, who happens to be there with a friend: Mr. 
Eddy: I’m really glad to know you’re doin okay. You’re *sure* you’re okay? Everything alright? Pete: Yeah? Mr. Eddy: I’m really glad to know you’re doin good, Pete. Hey, I want you to talk to a friend of mine. Mystery Man: We’ve met before, haven’t we? Pete Dayton: I don’t think so. Where is it you think we’ve met? Mystery Man: At your house. Don’t you remember? Pete Dayton: No. No, I don’t. Mystery Man: In the East, the Far East, when a person is sentenced to death, they’re sent to a place where they can’t escape, never knowing when an executioner may step up behind them, and fire a bullet into the back of their head. Pete Dayton: What’s going on? Mystery Man: It’s been a pleasure talking to you. The Mystery Man’s deeply sinister line about the Far East could be referring to Fred’s literal execution and death, which is possibly imminent, but metaphorically the ‘place where they can’t escape’ is likely Fred’s own mind and the ‘bullet into the back of their head’ the objective truth of the situation. The Mystery Man is telling him that there is no way of fully escaping the truth within his own mind, and the physical and psychological consequences will eventually destroy him. After this the Pete fantasy begins to unravel. He goes to meet Alice at the pimp’s house but is confronted by Andy, who, after a scuffle, ends up impaled head first on a glass coffee table. The sickening head wound is a Lynch trademark, used in part to evoke a visceral physical horror, but also a visual metaphor for violent trauma as damage to the mind — moments later a bloody-nosed Pete hallucinates Alice having sex with a faceless man, his sexual insecurity resurfacing again. When they arrive in the desert, it’s the same cabin from Fred’s vision in prison. No one is there, so Pete and Alice have sex on the dusty plain, during which Pete repeatedly tells Alice he wants her, only for her to say afterwards, “You’ll never have me” — the finality of her death finally hits him, and Pete turns back into Fred. “You and me, mister… we can really out-ugly them son-of-a-bitches. Can’t we?” Fred heads up to the cabin, only to be confronted again by the Mystery Man, this time holding a cam-corder, the connection between himself, the videotapes and what they represent confirmed. Fred asks where Alice is, only for the Mystery Man to tell him, “Her name is Renee. If she told you her name was Alice, she was lying”. He then aggressively shouts at Fred, “And you…what the fuck is your name?!” The Pete fantasy fully broken, Fred backs away terrified from the Mystery Man, who looms towards him recording the whole time — for Fred, the painful process of integration has begun. He goes after Mr Eddy and brutalises him, in events likely similar to what Fred has perpetrated on the real Dick Laurent; by the time he has finished, all that remains is Laurent’s Mercedes — emblematic of Mr Eddy’s temper and jealousy — now recognised by Fred (who drives it) as part of who he is. Assisting him is The Mystery Man, now standing side-by-side with Fred, the truth recognised as something familiar to him. The last shot is Fred barreling down the highway again, now pursued by the police, the awareness of his actions’ consequences returning to him, perhaps. But it may already be too late, the loop seemingly returning back on itself, his fate left ambiguous. At the heart of most magic tricks is a deception, and for Lynch dissociation is like a cruel sleight-of-hand that we play on ourselves, an illusion cast across the mind that we let ourselves believe. 
Although Lynch has said he does not believe pain is necessary to the human condition, he seems to accept its inevitability; at some point almost all of us will experience events so painful, so damaging, that we have difficulty reconciling them and dealing with their repercussions. The extreme acts that cause Fred’s trauma are in large part there to support the noir nature of the narrative, because ultimately the way he deals with it would be as troubling regardless of its nature. What Lost Highway seems to say is that our trauma exists in and of ourselves, of our mind and soul, and the only successful way to deal with it is acceptance and integration. To dissociate from reality will not help you escape your trauma, only alter how you perceive it; it never truly goes away from Fred’s mind, only recedes briefly, before resurfacing as something more alien, frightening and difficult to understand than it already was, compounding the problem. To attempt to escape something that is you, that exists of you, will create a paradox, a fight against ever more divergent elements of yourself, the endless loop that we see Fred in. This is the true horror of Lost Highway — that our inability to process our own trauma may be the reason for our greatest fears and nightmares.
https://medium.com/framerated/dissociation-trauma-in-david-lynchs-lost-highway-30aec86ce35a
['James Giles']
2018-11-19 19:17:06.330000+00:00
['Movies', 'Mental Health', 'Culture', 'Features', 'Film']
I’m Riding My Bicycle Across America
I’ve noticed a trend that wellness has become a commodity. Something that we purchase. A line of products promising vitality and debt. Rather than access wellness by mobilizing the tools we come equipped with, we’ve looked to a studio class or an $11 green juice as the yellow brick road to our wellness. That’s diet and fitness, not wellness. You see, wellness stretches beyond the walls of a gym, and doesn’t come with the option of insert chip now. Yes, what I consume and how I move are critical parts of my wellness, but there are several more buckets. To me, wellness is a state of being. It’s life without disease, physical and mental, spent full of actions in alignment with our purpose. My blueprint for wellness includes caring for my mind and the stress it may create, living in community and nurturing relationships, moving naturally and eating plants, sleeping well, meditating and expressing gratitude, being financially responsible and also compassionate. And I’ve known life without wellness, and through a series of basic, affordable and convenient changes, I’ve healed. This is why I so passionately advocate for wellness and the truth that it is accessible. Accessible meaning it’s affordable, convenient and it looks like you. My mission is to highlight that we already have the resources needed to cultivate our own wellness, and bring this into communities across the country in an off-the-screen installation. Paulo Coelho wrote that “The world is changed by your example, not your opinion.” Well, it’s my opinion that anyone can post about wellness, show a glimpse into their lifestyle through 15 second installments, but it’s how we educate and impact those outside of our captured digital audiences that we spread our truth and in return receive an accurate education on the diversity of communities we must be considering. So to honor the quote by Coelho, and my belief that I can change the world, I’m here to be an example. That’s why this summer, I’ll be riding my bike across America on The Wellness Ride. The Wellness Ride is an 85 Day Athletic Art Installation designed to educate on the accessibility of wellness and encourage individuals to invest in their own wellness, while also raising meals for the hungry across America. And you can bet I’ll be documenting the intimate moments of this adventure through writing, images and good ole fashion pineapple dances. You will see me riding a bicycle, meditating in nature, eating plants, spending time in community, maintaining my sobriety and offering plant-based meals to the hungry along the way. The goal of the wellness ride is to raise 5,000 meals and empower individuals to invest in their own wellness. Each meal is a delicious plant-based creation of Veestro’s that is $5 and will be donated directly to organizations across the country, and each of you deserves to be well. We’ve created a simple donation page and each week I’ll be sharing the intimate details of this ride if you’d like to hear them. My ask of you beyond donating meals is to speak this ride into your conversations. On a first date, on a facetime, during 2 truths and a lie, or when answering How Was Your Day, fill the void of your conversations with tales of The Wellness Ride. Tell the world your friend is biking across America to advocate for wellness and to raise meals. I planted the idea to bike across America 3 years ago, a sticky note in a leather-bound notebook forgotten until this spring when a timely Charles Bukowski quote came across my eyes. 
“If something burns your soul with purpose and desire it’s your duty to be reduced to ashes by it. Any other forms of existence will be yet another dull book in the library of life.” So here I am, ready to ride. Let’s fucking do this. Richie. Human.
https://rickieticklez.medium.com/im-riding-my-bicycle-across-america-4de9daf18e83
['Richie Crowley']
2020-04-19 17:27:51.722000+00:00
['Cycling', 'Adventure', 'Wellness', 'Charity', 'Courage']
When Your Income Takes a Dip On Medium
In the last four months, I’ve taken a huge income hit on Medium. A year ago, I managed to have my best month of $2200 on Medium. After that, I kept my Medium statistics up for 4 months. Then, in the beginning of this year, I hit a wall. I didn’t have time to write on Medium and my income went down to 25% of what it was. I never expected to have 7K followers on Medium. I’m a female writer who writes in both self-help and technology, competing with male writers who are far more experienced than I am. I had to grind my teeth and work hard from day one. I haven’t stopped because, as a writer, I’ve found this platform to be a great fit for me. The income dip was a reality check. For a while, my ego was bruised. I busied myself with earning money elsewhere. But, somewhere in the back of my mind, I was disappointed in myself. I doubted myself as a writer. The truth is that the dip was timely. It allowed me to refocus on new strategies while writing on Medium. During social isolation, taking risks in writing is my favorite activity. Here are a few things I’ve learned about what you can do to climb back up if your income has taken a hit. Niche Down And Go Deep Recently, I’ve niched down to a few tried and true categories on Medium. Instead of writing overview types of articles, it’s important to have depth in the categories that you write in. For instance, if you are writing self-help articles once a week, consider writing in depth about one aspect of self-help. For example, you can write about “empathy”, “starting over”, “making money”, etc… Taking on one topic inside the broader topic will allow you to reach those readers who are truly interested in your writing day after day. Your Brand Is Unlike Others For me, figuring out my brand took a whole year of writing in 14 different categories. You may already know what your brand is. On Medium, it’s about the practice of branding. Know what you love to write about and write your heart out about that topic. I no longer view Medium as a place of competition. For me, it’s a place of discovery and a place to refine my own brand. My brand is different than any other person’s brand out there. I may share similarities with other writers and influencers, but because of my unique experiences in life, I will have a different point of view. Sometimes, what I write about may not be completely original or unique, but the way that I convey it will hopefully reach my audience. So, change it up if you are unsatisfied with the reads and claps of your articles. Be Of Service Writing is an act of service. It’s the same as parenting, where there has to be an unconditional love for your audience. I write frequently about how writers are coaches whether they want to be or not. This is true no matter how much money you make as a writer and who you are as a writer. The best writers who are successful on Medium through the ups and downs are the ones who are adding value to the ecosystem. These are writers who can write one article a week to earn $1000 a month. They have followers who will read every single one of their articles because they deliver. They add value. They put back into the ecosystem and improve the lives of their readers. So, hold yourself accountable by asking yourself, “How does this article add value for my readers?” Make Hard Choices Topic selection on Medium is important. Writing is important. But topic determines 50% of the article’s shareability. This means that you have to read on Medium in order to understand what people care about at the moment. 
This really limits the number of articles that I can write per week. I’m not bothered by that. In fact, I prefer to only write 10 articles a month on Medium and make my current income. Then, I will spend time to make these 10 articles count. The hard choices are these: Rein in my tendency to want to write about “viral” topics simply because they are viral. Rein in my tendency to publish for the sake of publishing. Take time to expand on my ideas before writing the article rather than simply putting words down to save time. Make efforts to earn money elsewhere so that money does not drive my decisions about topic selection on Medium. Planning and Time Planning and freeing up time to allow you to think are the two ingredients for quality. No matter what I receive as payment for my writing, I have to have “quality” things to say. Otherwise, spending time writing every day is not worth it. When I make planning decisions, it’s about writing to gain more free time. How many hours can I spend on a quality article that will be shared well, to free up time to write my next article? This is how quality begets quality. Planning, outlining and spending time to edit are the keys to freeing up time.
https://medium.com/jun-wu-blog/when-your-income-takes-a-dip-on-medium-b1831ab33c8e
['Jun Wu']
2020-04-25 18:58:08.990000+00:00
['Writing On Medium', 'Self', 'Freelancing', 'Writing Tips', 'Writing']
Do Psychics Prey on Sadness?
A few months after my daughter died, I joined a Facebook group comprised of parents who, like me, want to believe that our child’s soul lives on. There are nearly four thousand members in this group. We have lost children of all ages, in every imaginable way. It’s a place for grieving parents to share signs of their dead children — or our hope of receiving signs — without judgement. One of the group’s main purposes is to connect psychics and mediums with bereaved parents and this is the real reason I joined. I’m always looking for signs. I don’t consider myself an atheist, but I’m not religious, so I was unprepared for the deep spiritual crisis I experienced when my daughter was dying. I had no way to convincingly reassure her that her soul would survive death because, quite frankly, I wasn’t sure it would. I told her that there is a place I believe all our souls go to after we die, a place of joy and solace. I wanted this to be true, but she knew me better than anyone. She saw the doubt in my eyes. She was fifteen — too old for fairytales. After she died, the idea that nothing was left of her — not even her soul — was impossible for me to believe. She felt so close, as though she was at the periphery of my vision or around the next corner. In the weeks after her death, I watched for signs constantly, testing the idea of faith for the first time in my life. I collected feathers, heart-shaped stones, and other trinkets that I imagined she’d left in my path. Still, I doubted that these were actual messages from my daughter. A thought kept running through my head — a desperate heart sees what it wants to see.
https://jacquelinedooley.medium.com/do-psychics-prey-on-sadness-8942c4efffe5
['Jacqueline Dooley']
2019-04-19 15:34:33.986000+00:00
['Mental Health', 'Grief', 'Parenting', 'This Happened To Me', 'Death']
Feast is a Simple, Open Source Feature Store that Every Data Scientist Should Know About
Feast is a Simple, Open Source Feature Store that Every Data Scientist Should Know About The project was initially created by Google and transportation startup GoJek. I recently started a new newsletter focused on AI education that already has over 50,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers and concepts. Please give it a try by subscribing below: Feature extraction and storage are among the most important and often overlooked aspects of machine learning solutions. Features play a key role in helping machine learning models process and understand datasets for training and production. If you are building a single machine learning model, feature extraction seems like a very basic thing to do, but that picture gets really complicated as your team scales. Picture a large organization with dozens of data science teams cranking up machine learning models. Each team needs to process different datasets and extract the corresponding features, which becomes computationally extremely expensive and nearly impossible to scale. Building mechanisms for reusing features across different models is one of the key challenges faced by high-performance machine learning teams. A feature store is a pattern that is becoming prevalent in modern machine learning solutions. Conceptually, a feature store serves as a repository of features that can be used in the training and evaluation of machine learning models. Despite its obvious value proposition, feature stores are notably missing from most machine learning platforms. Last year, Google joined efforts with the Asian ride-hailing startup GO-JEK to open source Feast, a feature store for machine learning models. Feast abstracts many of the fundamental building blocks of feature extraction, transformation and discovery, which are omnipresent in machine learning applications. The Motivation Like other rapidly growing data science organizations, GO-JEK constantly faces challenges in terms of feature extraction and discovery. GO-JEK’s machine learning models typically reuse common features such as driving time to destination, time of the day or driver profile in order to extract intelligence from heterogeneous datasets. Beyond the obvious benefits of feature extraction and discovery, Google and GO-JEK decided to build Feast with some very tangible goals in mind: · Feature Standardization: Feast attempts to present a centralized repository for describing features of machine learning models. This provides structure to the way features are defined and allows teams to reuse features across different machine learning models. · Feature Discovery: Feast enables the exploration and discoverability of features and their associated information. This allows for a deeper understanding of features and their specifications, more feature reuse between teams and projects, and faster experimentation. · Model Training-Serving Consistency: Feast’s standard representations enable feature consistency between model training and serving. This addresses the constant mismatch between the development and production versions of machine learning models. · Feature Infrastructure Management: A pretty obvious benefit, Feast abstracts the infrastructure needed to extract, store and manage features across machine learning models. 
Although conceptually simple, feature extraction is one of those areas that ends up consuming incredibly large amounts of time in machine learning implementations. The Architecture In order to accomplish the aforementioned goals, Feast relies on a simple architecture that abstracts the feature analysis process into five stages: Create: features based on a defined format and programming model. Ingest: features via streaming input, import from files or BigQuery tables, and write to an appropriate data store. Store: feature data for both serving and training purposes based on feature access patterns. Access: features for training and serving. Discover: information about entities and features stored and served by Feast. The core architecture of Feast is illustrated in the following figure: Feast relies on BigQuery as the underlying storage mechanism for the feature store. In BigQuery, a feature is defined by the following attributes: Entity: A feature must be associated with a known Entity, which is a domain-specific concept. Examples of Entities can be Customer, Driver or any other relevant domain objects. ValueType: The feature type must be defined, e.g., String, Bytes, Int64, Int32, Float, etc. Requirements: Properties related to how a feature should be stored for serving and training. Granularity: Time series features require a defined granularity. StorageType: For both serving and training, a storage type must be defined. Those basic attributes are enough to represent features in a way that can be used across different machine learning models. From the architecture standpoint, Feast is based on four fundamental components: · Feast Core: The Core subsystem is responsible for managing the different components of Feast. For instance, Feast Core manages the execution of feature ingestion jobs from batch and streaming sources while also enabling the registration and management of entities, features, data stores, and other system resources. · Feast Store: Feast supports two fundamental types of stores: warehouses and serving. Feast Warehouse Stores are based on Google BigQuery and maintain all historical feature data. The warehouse can be queried for batch datasets, which are then used for model training. Serving Stores are responsible for maintaining feature values for access in a production serving environment. · Feast Serving API: This API is responsible for the retrieval of feature values by models in production. The Feast Serving API supports HTTP and gRPC, which allows for low-latency and high-throughput execution models. · Feast Client Libraries: Feast supports client libraries for different languages such as Java, Go and Python as well as a command-line module. The client libraries streamline the developer interactions with the platform. There are different ways to get started with Feast, but one of the most creative ones is via Kubeflow. In just a few months, Kubeflow has become one of the most popular runtimes for the execution of machine learning workflows. Conceptually, Kubeflow is an open source Kubernetes-native platform for developing, orchestrating, deploying, and running scalable and portable ML workloads. 
It helps support reproducibility and collaboration in ML workflow lifecycles, allowing you to manage end-to-end orchestration of ML pipelines. Feast provides native integration with Kubeflow, which streamlines its adoption in machine learning environments. As someone who is constantly exposed to real-world machine learning solutions and has experienced the challenges of doing feature management at scale, I am incredibly excited about efforts like Feast. The commitment from Google can definitely help with the adoption of the platform. As machine learning evolves, we are likely to see more efforts like Feast that try to abstract the fundamentals of feature extraction and discovery in machine learning solutions.
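To make the feature attributes listed above (Entity, ValueType, Granularity, StorageType) a bit more concrete, here is a minimal Python sketch of what a feature specification along these lines could look like. This is an illustration only, not the actual Feast SDK syntax, which has changed considerably between releases; the field values and the register_feature call are hypothetical.

from dataclasses import dataclass

@dataclass
class FeatureSpec:
    entity: str             # domain object the feature belongs to, e.g. a driver
    name: str               # feature name shared across teams
    value_type: str         # e.g. "Float", "Int64", "String"
    granularity: str        # time-series granularity, e.g. "minute"
    serving_store: str      # storage type used for low-latency serving
    warehouse_store: str    # storage type used for training, e.g. BigQuery

# A feature a ride-hailing team might register once and reuse across models
driving_time = FeatureSpec(
    entity="driver",
    name="driving_time_to_destination",
    value_type="Float",
    granularity="minute",
    serving_store="serving",
    warehouse_store="bigquery",
)

# register_feature(driving_time)  # hypothetical call to the feature registry

A definition like this is what Feast Core would register so that other teams can discover and reuse the feature instead of recomputing it for every model.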
https://medium.com/dataseries/feast-is-a-simple-open-source-feature-store-that-every-data-scientist-should-know-about-c79a2fe97481
['Jesus Rodriguez']
2020-12-22 11:51:41.617000+00:00
['Machine Learning', 'Data Science', 'Artificial Intelligence', 'Thesequence', 'Deep Learning']
Big trends shaping Fintech
Big trends shaping Fintech Innovations in Financial services & technology For those of you who are not familiar with FinTech, it is the amalgamation of Financial services with technology. It is difficult to predict how the broader FinTech space is going to change in the coming months & years, but there are a few big trends that we can look at to see where it is headed. Rise of TechFin Perhaps one of the biggest game-changers has been the rise of TechFin. Startups have been playing a key role in changing the financial landscape dramatically. BATs (Baidu, Alibaba & Tencent) of China & the GAFAs (Google, Apple, Facebook & Amazon) of America have been instrumental in this transition. WeChat, the messaging app by the tech giant Tencent, has more than 1 billion users globally, a milestone achieved in a mere 7 years! WeChat Pay, the payment tool for the same app, has more than 800 million users in over 25 countries. The reach of these platforms, with their huge customer bases, is incredible, backed by the tech infrastructure these companies have built and the credibility they enjoy with their client base. With all these factors in place, these technology companies are all set to revolutionize financial services as we know them. Voice User Interface For the past few years, most financial institutions were focused on delivering their services to your smartphones via mobile-friendly apps. It was a step in the right direction, with the customer base of smartphones growing into the billions. But more recently, we are seeing the new craze of voice as a user interface. The most recognizable in this regard is Apple’s Siri, which has been in use since 2011, followed by Microsoft’s Cortana, Amazon’s Alexa & Google’s Home — all of which have recently gained a lot of acceptance. The original voice digital assistants, which were programmed to handle simple requests like playing a song, telling you the weather or updating you on the traffic, have now transformed into handling more complex requests — case in point, Amazon’s recent launch of 14 new devices, which include a voice-enabled microwave & clock. These voice assistants are becoming intelligent systems and an integral part of our daily lives. Right now, financial institutions use voice-enabled telephone banking, which is very basic in nature. It will be interesting to see how financial institutions incorporate these intelligent voice-enabled services into their lineup as demand grows among their millennial users. Usage of Big Data As data becomes the new oil (or new gold), Financial institutions & Technology companies are most strategically placed to turn this precious new commodity into a monetizable asset for their customers. In the past, data has been used by financial services companies not to optimize the solutions presented to the clients, but more as a by-product of the whole process. The emergence of “big techs” has caused the disruption of this model & shown the financial institutions how to use data as a core asset rather than a collection of useless info tucked away in files. Most of the free services that we are all too happy to access from companies like Facebook or Google have been offered to us in exchange for our data, which these companies use to generate profit for themselves. Financial companies are beginning to explore how they can use this monetizing model to offer similar services to their clients as well. 
Of course, there are regulatory concerns associated with this, which are actively being explored by the financial authorities, on how customers can actually be given the right to grant companies access to their personal data to be subsequently monetized. Artificial Intelligence disruption AI is perhaps the single most important factor that is going to change many facets of the new digitized ecosystem of the world, especially the financial services sector. It has the potential to completely change the way we live our lives. This has also given rise to an ongoing debate on the legal, ethical & moral questions arising from the usage of AI. Let’s say, for example, 5 chatbots replace 5 individuals in an organization. What about the people who lost their jobs? Who’s to blame if chatbots make a mistake — the organization, the supervisor, or the chatbot manufacturer? These are some of the issues that have not been addressed yet & will perhaps pose the biggest challenge to the incorporation of this technology in the future. But one thing is for sure: Artificial Intelligence is here to stay. Cryptocurrencies Digital assets, part of the whole Blockchain movement, have rapidly become part of our daily financial lives, more than anything else in the past few years. The exponential rise of these financial innovations & people’s spiking interest in them has put pressure on the financial institutions to explore their use in their dealings. The digital coins have already shown immense potential in solving the long-standing issues of financial inclusion, especially in third-world & developing countries. Cryptocurrencies have also acted as an inflation hedge in countries with inflation problems despite their volatility, thus showing that fiat currencies don’t have to be the only choice. And there are numerous other use cases where Cryptos continue to outshine their predecessors. The financial institutions are in a conundrum, with the authorities not willing to give up the fiat-based system just yet, but the general public eager to move on to the new decentralized, efficient & private Cryptos. It’s only a matter of time before we see an adoption of these digital assets on a mass scale. All these game-changing trends seem set to define & shape the FinTech ecosystem in a fundamental way going forward. Related Articles: Cross-border Payment systems: SWIFT, RippleNet or BWW?, RegTech 2.0, Big Data in Financial Services, TechFin vs. FinTech — What’s the difference?
https://medium.com/technicity/big-trends-shaping-fintech-a3e0c55233f0
['Faisal Khan']
2019-10-14 17:20:18.745000+00:00
['Finance', 'Fintech', 'Cryptocurrency', 'Economics', 'Big Data']
9 Companies That Use Rust in Production
9 Companies That Use Rust in Production Who uses Rust, and what are the benefits of choosing this programming language for your stack? Photo by Kevin Ku on Unsplash. If you haven’t yet heard, Rust is one of the most promising and most loved programming languages out there. First created at Mozilla, it has since been adopted by companies like Dropbox, Microsoft, Facebook, and others. Rust’s main benefit is that it enables C-like performance while still keeping the memory safety that we are used to when developing with languages like JavaScript and Python. In this article, I will look at nine large companies that use Rust and delve into the reasons for their choice. 9 Rust success stories Dropbox Dropbox uses Rust for parts of its file synchronization engine. Since the engine is highly concurrent, writing, testing, and debugging it is hard. Therefore, the team chose to rewrite it in Rust. Rust’s static types and heavy compile-time checks give it an advantage over dynamically typed languages like Python when you need to tackle complex codebases and concurrent code. Rust has been a force multiplier for our team, and betting on Rust was one of the best decisions we made. More than performance, its ergonomics and focus on correctness has helped us tame sync’s complexity. We can encode complex invariants about our system in the type system and have the compiler check them for us. (Source) Read more about Dropbox’s use of Rust on their tech blog. Coursera Coursera uses Rust for their programming assignments feature where students need to write and run a computer program to solve a problem. The programs are run, tested, and graded inside Docker containers. For security reasons, the developer team needed to use a low-level language like Rust for some of the code, and they decided that Rust is more secure than C. Although C is the default low-level full-control programming language, these binaries have strict security and correctness requirements. We instead have chosen Rust, a modern native language from Mozilla. One of Rust’s common selling points is complete immunity to certain classes of security vulnerabilities thanks to its powerful type system, making it an excellent choice for security critical functions. (Source) You can get more details on their use of Rust for programming assignments on their blog. Figma Figma is a collaborative web-based design tool for vector graphics and interface prototyping. They chose to rewrite their multiplayer syncing engine in Rust (previously, it was in TypeScript) to improve performance since their server couldn’t keep up with user growth. We chose Rust for this rewrite because it combines best-in-class speed with low resource usage while still offering the safety of standard server languages. Low resource usage was particularly important to us because some of the performance issues with the old server were caused by the garbage collector. (Source) Find out more about their use of Rust in this article on Rust in production at Figma. npm npm is a package manager for JavaScript. Its engineering team chose to rewrite their main service in Rust because they saw that the service’s performance would soon be a bottleneck if user growth kept up. They rejected technologies such as C and C++ since they didn’t trust themselves to be able to handle memory management for a web-exposed service. Java was rejected since it would involve deploying JVM on their servers. 🙃 The challenges that npm faces demand efficient and scalable solutions. 
When a service can be deploy-and-forget, that saves valuable operations time and lets them focus on other issues. npm employees also value having a helpful community around any technology they use. Rust fits all these criteria and is currently in use as part of npm’s stack. (Source) To learn more, read their case study on Rust’s homepage. Microsoft Microsoft has recently been experimenting with integrating Rust into its large C/C++ codebases. The main argument for adopting Rust at Microsoft was the memory safety that Rust provides. For the last 12 years, around 70 percent of the CVEs (Common Vulnerabilities and Exposures) discovered at Microsoft have been connected with memory safety. Microsoft has tried various options to solve this issue, such as extensive developer training and static analysis tools. However, it seems like the only way out is to make these vulnerabilities impossible to do. For more info on Rust at Microsoft, watch this talk: Cloudflare Cloudflare uses Rust in their core edge logic and as a replacement for C, which is memory-unsafe. Their GitHub shows 18 open-source repositories that use Rust, and on their blog, they document using it for Firewall Rules, a very customizable firewall tool. With a mixed set of requirements of performance, memory safety, low memory use, and the capability to be part of other products that we’re working on like Spectrum, Rust stood out as the strongest option. (Source) Facebook Facebook used Rust to rewrite its source control backend, which was written in Python. They were looking for a compiled language to rewrite it in and were attracted to Rust because of its safety benefits. Since then, Rust has been adopted by the source control team. As the reasons for adoption, they mention the huge cost of bugs for Facebook and the ease of the compiler feedback loop, in contrast to static analysis and code reviews. Rust detects large classes of serious bugs at compile time. The cost of a bug at compile time is orders of magnitude less than in production. Amazon AWS has used Rust for performance-sensitive components of services like Lambda, EC2, and S3. In addition, the company openly supports and sponsors the development of the language and its ecosystem. Amazon also has open-sourced a service written entirely in Rust. Firecracker VMM is a virtual machine monitor that was built for services like AWS Lambda and AWS Fargate. Discord Discord uses Rust in multiple places of their codebase, both on the client- and the server-side. For example, the team used Rust and Elixir to scale to 11 million concurrent users through the use of Elixir NIFs (Native Implemented Functions). In this case, Rust enabled them to speed up their existing Elixir codebase while keeping everything memory safe. They have also rewritten their Read States service in Rust (originally in Go). While the Go version of the service was fast enough most of the time, it sometimes had large latency spikes due to Go’s memory model and garbage collector. To solve that, Discord switched to Rust, which offers a unique memory allocation system that makes garbage collection unnecessary. Along with performance, Rust has many advantages for an engineering team. For example, its type safety and borrow checker make it very easy to refactor code as product requirements change or new learnings about the language are discovered. Also, the ecosystem and tooling are excellent and have a significant amount of momentum behind them. 
(Source) To read more about their use of Rust, check out this article on their blog. Future of Rust In most of these companies, Rust functions as a strictly better alternative for C — you can see a visible pattern of rewrites done in Rust to escape performance degradation. Teams reach for it when they need extra performance but want to avoid memory issues associated with C. But Rust has far more benefits: it makes lower-level programming more accessible, has excellent support for WASM, and is fantastic for concurrency. And I’m not even going to start to speak about the community. ❤️ In the future, expect Rust usage to increase as more and more companies discover how it can improve their codebases. If you would like to learn more about Rust, I have compiled a quick introduction that you can check out on our blog. In the meantime, follow Serokell on social media like Twitter and Medium to see more posts about Rust and multiple other programming languages that we use in our daily work.
https://medium.com/dev-genius/9-companies-that-use-rust-in-production-9b8f6634b7b4
[]
2020-11-19 21:38:46.759000+00:00
['Rust', 'Software Development', 'Software Engineering', 'Programming', 'Programming Languages']
Using Tableau with DynamoDB: How to Build a Real-Time SQL Dashboard on NoSQL Data
In this blog, we examine DynamoDB reporting and analytics, which can be challenging given the lack of SQL and the difficulty of running analytical queries in DynamoDB. We will demonstrate how you can build an interactive dashboard with Tableau, using SQL on data from DynamoDB, in a series of easy steps, with no ETL involved. DynamoDB is a widely popular transactional primary data store. It is built to handle unstructured data models and massive scales. DynamoDB is often used for an organization’s most critical business data, and as such there is value in being able to visualize and dig deeper into this data. Tableau, also widely popular, is a tool for building live, interactive charts and dashboards. In this blog post, we will walk through an example of using Tableau to visualize data in DynamoDB. DynamoDB works well out-of-the-box for simple lookups by the primary key. For lookups by a different attribute, DynamoDB allows creating a local or global secondary index. However, for even more complex access patterns like filtering on nested or multiple fields, sorting, and aggregations (the types of queries that commonly power dashboards), DynamoDB alone is not sufficient. This blog post evaluates a few approaches to bridge this gap. In this post, we will create an example business dashboard in Tableau on data in DynamoDB, using Rockset as the SQL intelligence layer in between, and JDBC to connect Tableau and Rockset. The Data For this example, I’ve combined sample data from Airbnb and mock data from Mockaroo to generate realistic records of users with listings, bookings, and reviews for a hypothetical home rental marketplace. (All names and emails are fake.) The mock data and scripts are available on Github. The data model is typical for a DynamoDB use case. Here’s an example item: A few things to note: In our data, sometimes the review field will be missing (if the user did not leave a review). The bookings and listings arrays may be empty, or arbitrarily long! The user field is denormalized and duplicated within a booking, but also exists separately as its own item. We start with a DynamoDB table called rental_data loaded with 21,964 such records: Connecting Tableau to DynamoDB Let’s get this data into Tableau! We’ll need accounts for Tableau Desktop and Rockset. I also assume we’ve already set up credentials to access our DynamoDB table. First, we need to download the Rockset JDBC driver from Maven and place it in ~/Library/Tableau/Drivers for Mac or C:\Program Files\Tableau\Drivers for Windows. Next, let’s create an API key in Rockset that Tableau will use for authenticating requests: In Tableau, we connect to Rockset by choosing “Other Databases (JDBC)” and filling in the fields, with our API key as the password: Finally, back in Rockset, we just create a new collection directly from the DynamoDB table: We see the new collection reflected as a table in Tableau: Users Table Our DynamoDB table has some fields of type Map and List, whereas Tableau expects a relational model where it can do joins on flat tables. To resolve this, we’ll compose SQL queries in the Rockset Console that reshape the data as desired, and add these as custom SQL data sources in Tableau. First, let’s just get a list of all the users on our rental platform: In Tableau, we drag “New Custom SQL” to the top section, paste this query (without the LIMIT clause), and rename the result to Users: Looks good! 
Now, let’s repeat this process to also pull out listings and bookings into their own tables. Listings Table Note that in the original table, each row (user) has an array of listing items. We want to pull out these arrays and concatenate them such that each item itself becomes a row. To do so, we can use the UNNEST function: Now, let’s select the fields we want to have in our listings table: And we paste this as custom SQL in Tableau to get our Listings table: Bookings Table Let’s create one more data source for our Bookings table with another UNNEST query: Chart 1: Listings Overview Let’s get a high-level view of the listings around the world on our platform. With a few drag-and-drops, we use the city/country to place the listings on a map, sized by booking count and colored by cancellation policy. Looks like we have a lot of listings in Europe, South America, and East Asia. Chart 2: Listings Leaderboard Let’s try to find out more about the listings pulling in the most revenue. We’ll build a leaderboard with the following information: labeled by listing ID and email of host, total revenue as the sum of cost across all bookings (sorted from highest to lowest), colored by the year it was listed, and details about title, description, and number of beds shown on hover. Note that to accomplish this, we have to combine information across all three of our tables, but we can do so directly in Tableau. Chart 3: Rating by Length Next, suppose we want to know what kind of users our platform is pleasing the most. Let’s look at the average rating for each of the different lengths of bookings. User Dashboard on Real-Time Data Let’s throw all these charts together in a dashboard: You may notice the ratings by length are roughly the same between lengths of stay, and that’s because the mock data was generated for each length from the same rating distribution! To illustrate that this dashboard gets updated in real time on the live DynamoDB source, we’ll add one record to try and noticeably skew some of the charts. Let’s say I decide to sign up for this platform and list my own bedroom in San Francisco, listed for $44 a night. Then, I book my own room 444 times and give it a rating of 4 each time. This Python code snippet generates that record and adds it to DynamoDB: Sure enough, we just have to refresh our dashboard in Tableau and we can see the difference immediately! Summary In this blog post, we walked through creating an interactive dashboard in Tableau that monitors core business data stored in DynamoDB. We used Rockset as the SQL intelligence layer between DynamoDB and Tableau. The steps we followed were:
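The Python snippet mentioned above appeared as an embedded image in the original post. A minimal reconstruction using boto3 might look like the sketch below; the attribute names are assumptions based on the schema described earlier (a user item with listings and bookings arrays and a review rating), so they may need to be adjusted to match the real rental_data table.

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("rental_data")

# One skewed record: a $44/night San Francisco listing booked 444 times,
# with every stay rated 4. Field names are assumptions, not the post's exact schema.
item = {
    "id": "user-self",
    "email": "me@example.com",
    "listings": [{
        "listing_id": "sf-bedroom-001",
        "title": "My own bedroom",
        "city": "San Francisco",
        "country": "United States",
        "cost_per_night": 44,
    }],
    "bookings": [
        {"listing_id": "sf-bedroom-001", "cost": 44, "length": 1, "review": {"rating": 4}}
        for _ in range(444)
    ],
}

table.put_item(Item=item)

After the put_item call, refreshing the Rockset-backed data sources in Tableau should show the new listing skewing the leaderboard and the rating-by-length chart.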
https://medium.com/rocksetcloud/using-tableau-with-dynamodb-how-to-build-a-real-time-sql-dashboard-on-nosql-data-cfbc0ff4f2f0
['Vahid Fazel-Rezai']
2019-09-18 21:52:56.759000+00:00
['Dashboard', 'Tableau', 'Sql', 'Dynamodb', 'Real Time Analytics']
Algorithms & Data Structures Series
Algorithms & Data Structures Series Inspired by the Algorithms, Part I course from Princeton University. Algorithms, Part I course from Princeton University — coursera This course covers the essential information that every serious programmer needs to know about algorithms and data structures, with emphasis on applications and scientific performance analysis of Java implementations. Part I covers elementary data structures, sorting, and searching algorithms. — coursera. One last note to mention: I’m by no means an expert and I’m still learning, and part of the learning process is sharing and trying to communicate what you’ve learnt so far. The list of articles in the series is as follows: The articles starting with “ — ” aren’t main topics, but they support the overall understanding of other articles in the course. It’s not recommended to skip any of them while following along. It’s worth mentioning that these articles will be kept updated when there is a bug, misspelling, grammar mistake, or any other issue.
https://medium.com/omarelgabrys-blog/algorithms-data-structures-series-85ec94eb8aff
['Omar Elgabry']
2017-03-26 11:13:26.902000+00:00
['Programming', 'Coding', 'Algorithms', 'Data Structures', 'Java']
Remote Jobs, Freelancers and Corona (Covid-19)
Today, human life is going through a very important turning point in history. Corona (Covid-19) has made a very strong impact on our lifestyles and financial status. Financial conditions have deteriorated to the point where even most developed countries are struggling to survive. But when it comes to freelancing and remote jobs, the future looks bright. Due to Covid-19, the majority of people are almost confined to their houses, and now we have to make many decisions in the context of Corona. A great example of this is Twitter allowing its employees to work from home indefinitely, and this historic decision seems to point to the permanent future of the industry. In the current scenario, a full-time office employee is very expensive for any company or organization. And the financial deficit the world is facing has led to the growth of the remote-job and freelancing industry. Online shopping and the availability of everything at home have also become a habit and a necessity for people. Compared to the time before Corona, many of our activities are now limited to the Internet. The majority of businesses that have no presence on the internet are trying to build their identity online, because the business of the future is in the world of the internet. Now you can see how many job opportunities are coming for freelancers in the coming days. Just update and enhance your skills to earn more.
https://medium.com/nyc-design/remote-jobs-freelancers-and-corona-covid-19-e2acedd36df3
['Salman Habib']
2020-07-21 16:14:15.232000+00:00
['Freelancing', 'Covid 19', 'Remote Job', 'Remote Working', 'New York']
“Prefer” Postmortem: 6 Lessons Learned From A Network Of Independent Professionals
I’ve received over a dozen emails and calls in the past year asking about lessons learned from Prefer, a team I helped gather and advise that ultimately shut down. Among all the ideas and new teams I’ve encountered, Prefer is still one of those products that I believe needs to exist. I get excited when I come across other entrepreneurs seeking to solve this problem so I figured I’d share some take-aways for the benefit of others exploring the space. Prefer started as an idea I shared with a few other entrepreneurs I respect, just as I started my stint as a full-time VC and before I returned to building products/teams at Adobe. The original idea was based on a few insights about customer preferences for services and the growing economy of independent service providers — from massage therapists and personal trainers to chefs, tutors, and accountants (aka “Soloists”), specifically: Customers prefer a trusted referral from friends over four stars from strangers any day of the week. Soloists (independent service professionals of all kinds) routinely get their best new clients via referrals from existing clients, and prefer referrals over paying for leads, placing ads, and gaming search engines. By 2020, the Freelancer’s Union estimates that over 40% of the American workforce will be independent service professionals. Given lack of access to “back office” tools for payments, scheduling, insurance, and a steady stream of new clients, the independent services provided by Soloists are becoming commoditized by on-demand networks, “middle-men” headhunters, and the traditional firm/agency structure. In the age of apps, marketplaces, and professional networks, we can do better. The original idea was to build a “Consumer App” for clients to find and book services they need via referrals from friends, while onboarding and referring their favorite Soloists to their network of friends. And for Soloists, the team built a “Provider App” to get a steady stream of new clients and gradually grow and manage their business. We imagined Prefer becoming a new superpower for anyone to find any service provider as a trusted referral from friends, and for Soloists to vastly expand their business in the very best way possible (via referrals). Prefer was able to assemble a group of stellar engineers, designers, and operators I would certainly work with again, and the product went through at least three major pivots and many more evolutions (perhaps too many and too quickly in retrospect) and endured some critical learnings that may have happened too late. As I reflect on the experience and debriefed with a few members of the team, here is a list of lessons learned based on what worked and didn’t work in the journey to empower the careers of independent service professionals and enable customers to find the service professionals they need from their friends. 6 Lessons Learned From Prefer Marketplace Building: Vertical vs. Horizontal Traditional marketplace building starts with liquidity. Can you gather enough “supply” and “demand” to ensure that buyers and sellers are able to find each other and be satisfied with little effort. For this reason, most marketplaces start with a narrow (vertical) focus on specific services, whether it be massage therapists, drivers, or babysitters. 
If you can build enough supply while also pacing the demand — all while keeping the cost of acquiring both sides in check with some form of viral loop — you can achieve the elusive “flywheel” that has unleashed some of the greatest marketplaces of our time. But for a referral marketplace like Prefer that would require Soloists to bring their own clients onto the platform (and for clients to keep coming back for new services), the team knew that we needed to take a horizontal approach instead. After all, why would massage therapists introduce a network of other massage therapists to their own clients? So we needed to accommodate many forms of Soloists from day one. As the saying goes, “when you try to do something for everyone, you don’t get it right for any one.” When it came to tools for scheduling, booking, etc, the lowest common denominator was a common trap to avoid. Should we have avoided all of these tools out of the gate and focus simply on discovery? Or should we have forced ourselves to focus on a more limited set of verticals, from both a product and go-to-market perspective? Don’t Go Against The Grain Of Communication Preferences For most on-the-go Soloists with varying levels of technology savvy, with clients of all ages, the easiest way to coordinate new appointments and build relationships is text message. It was always a challenge to reconcile what happens in-the-app vs. via text message. When building a network requires unnatural motions (communicating in a new way, paying in a new way, scheduling in a new way, etc), you’re in an uphill battle for adoption in an already uphill battle for awareness! Lesson reinforced: You can only force a new behavior for network participants when it is 10x better. Perhaps we should have doubled-down on discovery and one element of the relationship at the start, and eliminated any in-app messaging? Or perhaps the lack of calendar-sync and other functionalities at the start rendered the in-app messaging inconvenient at best. I believe we failed to make the inner-workings of the app itself 10x better than the normal ways a Soloist-Client relationship was managed. One important thing to note: when a Soloist and his or her client DID connect on the app and start working together, they often continued to do so. They also started to invite others to the platform on both sides — and at one point these “working pairs” exhibited very promising growth potential for the network. But there weren’t enough pairs getting to this point without a ton of non-scalable hand-holding. When it came to the “productivity tools” of the app itself, it was clear they were not a driver to adoption, but could potentially lead to retention/engagement over time. “Wow moments” must have frequency to have gravity. Great consumer products bring a superpower or elusive “wow” moment to customers with some degree of frequency. In Prefer’s case the two “wow” moments happened when (1) a customer was looking for a new service professional and instantly found that professional through a recommendation from a friend, and (2) “Soloists” on the platform got a trusted referral from a new client automagically. Unfortunately, the acute need for the consumer was not frequent and happened probably once a quarter or once a month at most (think about it, how many times per year do you look for a NEW service professional?). 
As a result, there wasn’t enough urgency for a consumer to download the app and go through the work of adding/referring their service providers to their friends…and when they did, there wasn’t enough frequency to come back to the app. This directly inhibited the frequency of “wow” moments and the flywheel of growth required for the first version of Prefer to work. You can’t obsess enough about the top-of-the-funnel. The more time the Prefer team spent with different Soloists and Clients, the more we realized how many different value propositions there were for first-time users. We learned that some clients just wanted the ability to discover services, some Soloists only cared about referrals while others only sought a better way to serve existing clients, while other pairs of Soloists and Clients just valued things like easy billing and automatic notifications for appointments. With such variance of preferences, the “first mile” of the customer experience either didn’t evolve fast enough or failed to serve a broad enough set of users. This experience emboldened my observation that most product teams only spend the final mile of their experience building the product thinking about (and testing) the first mile of the customer’s experience using the product. It should often be the opposite. The thirst for community and collaboration among Soloists is real. While there were many things we failed to figure out, the team nailed some things incredibly well. One of these things was the member-led community we called the “Soloist Collective” which is still growing organically despite the product’s demise. Among independent professionals, there is a need for peer support and best practice exchange, informal referral networks, and periodic gatherings. One of Prefer’s slogans was “you may work on your own, but you’re not alone.” Indeed, being a Soloist can be lonely and there is a lot of inefficiency and frustration in figuring out everything on your own. The world needs more structured communities for these professionals, and we encountered great demand on this front. Too many iterations too quickly cause a team fatigue that carries over. For all the talk of “pivots” in the world of start-ups, the truth is that a new version of a product by the same team is not really a “fresh start.” While a pivot can capture the benefits of starting with a great culture and team of experts from day one, you undeniably carry an extra dose of doubt and fatigue from the previous iteration. The Prefer team had a high bar for the growth levers needed to continue along a path. As a result, there were at least three pivots (major re-directions of the company), and there was noticeably less energy and patience after the third. I’ve always believed that one of the greatest competitive advantages for a start-up is simply sticking together long enough to figure it out. And, when you have to throw away all your code and design and plans multiple times, there is a wear and tear that is hard to overcome — even among the best teams out there. In Closing… There were many other lessons learned, and the leadership team would be in a far better position than I to share those details. But these are the initial thoughts I would share with anyone considering a new product in this space. At scale, having access to the trusted professionals your friends recommend would be a game-changer, but the engagement and network density obstacles need to be solved first. 
Much like LinkedIn needed the contact importer and Facebook needed the “friends you may know” section, there is a jumpstart innovation yet to be found. I have no doubt, with the right team and level of patience and determination to nail the product, there is an incredible problem to be solved. Especially at a time when so much of labor is being commoditized or automated, relationship-driven services will become more crucial to our economy and everyday lives. Are you an entrepreneur with passion for this problem, a deep care for design and crafting product experiences, and appreciation for the “ground game” operations to make this work? If so, hit me up. Happy to share more of what I observed and learned from Prefer round one.
https://medium.com/positiveslope/prefer-postmortem-6-lessons-learned-from-building-an-independent-pro-network-17f398da002
['Scott Belsky']
2019-07-10 02:29:36.681000+00:00
['Startup']
The Reality of Working as a Remote Project Manager
From StartupGuide LinkedIn In August 2020, I was approached by the Startup Guide to Project Manage their upcoming Startup Guide Germany Project, which is a larger version of what is usually a localised approach to cover cities. This one contains 5 different regions and shifts the typical startup scene focus to cover more of the social impact and sustainability sector. At Startup Guide, we feature 6 different types of organisations: Startup, Space, Program, School, Investors & Founders. By categorising the companies, Project Managers can easily distinguish the requirements needed for each feature, as some are shorter and others longer. As Startup Guide is in media and publishing, my duties fell into content production management: Cold B2B Email Outreaches — according to the category’s requirements. Scheduling and management of bringing together journalists and the companies themselves — to produce accurate and unique articles. Management of image collection to accompany the interview. Management & collection of image rights. Ensuring timely deliveries and deadlines on the sides of both the journalists & companies being featured. All in all, I was contracted to work 4 months and ended up with 100+ companies featured. It was hectic and draining, as most of it was based on communication through email and chasing down leads in order to ensure the timely delivery of content before the deadlines I set. In order to stay on top of things, I needed to track the status of each company through a project management system to make sense of it all. In my case: monday.com (seen left). But of course I did not work on the entire book by myself. While I had the responsibility to manage the entirety of the content production by myself, I was working directly under my boss, whose role is Global Production Lead at the Startup Guide. Our work together required a great deal of collaboration, as content-related matters such as determining the scope of an article a journalist has to write are hardly something I can call the shots on as a contracted consultant. Through this experience I’ve learnt that Project Management can take a variety of formats… but here it had everything to do with managing client and stakeholder relations and expectations. My job was not only to be a friendly front towards the organisations & journalists without getting irritated or frustrated by very odd actions or the requests clients made, but also to do work in classical Sales, where I had to convince the client to participate, even though they might not have seen the relevance to their organisation (yet). This especially occurred during email follow-up conversations and, of course, the many, many Cold Calls I made towards the end of the book’s content production deadline on 30.11.2020. A completely new area for me, as I had to learn how to approach various different characters who had no idea who I was & then sell them a feature in the Startup Guide Germany book. As a controlled introvert myself, this process was definitely challenging, but also very rewarding — once the initial hostility ceased and excitement took its place.
https://medium.com/swlh/the-reality-of-working-as-a-remote-project-manager-cc03588cd09c
['Julian Paul']
2020-12-08 13:43:32.730000+00:00
['Project Management', 'Structure', 'Personal Development', 'Remote Work', 'Startup']
Machine Learning is Becoming a Joke — Is AutoML Good For Data Scientists?
There was a time in the recent past when data scientists used to build ML algorithms from scratch. Before these new, fancy, and “sophisticated” libraries took over, true data scientists were aware of the underlying mechanics of the algorithms. They really knew about the models they ran and the mathematics they used. Today, everyone is a data scientist, and it dilutes the significance of this profession. With the help of highly abstract libraries like Keras and Scikit-Learn, a newbie can write the code without actually knowing what’s happening beneath. And corporate giants like Amazon, Google, and Microsoft are working on making it even more abstract and simpler. With their AI and cloud technology, they are giving birth to a new field called AutoML, where they’re hiding the underlying methodologies even better. But Is Simpler Really Better? Some would argue that losing touch with the mathematics will not allow any room for creativity and innovation. And for the most part, they’ll be right. As we start to build our ML models using highly abstract frameworks, we have less control over them. Just as in the case of programming languages, everybody knows Python might be simpler, but it gives less control over the machine in comparison to C and C++. Hence, there is less or no room for improvement; the same old pattern repeats again and again. Building models with a GUI environment is not a good idea. Not yet. Because of these new AutoML tools, so-called data scientists will never develop the intuition required to solve a problem; at least not if a problem lies at the microscopic level.
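As a concrete illustration of the abstraction being described, here is a minimal scikit-learn sketch of the kind of few-line workflow a newcomer can run without ever seeing the underlying mathematics; the dataset and model are arbitrary placeholders chosen for brevity.

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Load a toy dataset and split it into train and test sets
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Train and evaluate a model in three lines, with every mathematical detail hidden
model = RandomForestClassifier()
model.fit(X_train, y_train)
print(model.score(X_test, y_test))

None of the estimator's internals (tree splits, impurity measures, ensembling) are visible here, which is exactly the double-edged convenience the article is pointing at.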
https://medium.com/datadriveninvestor/machine-learning-is-becoming-a-joke-automl-downsides-c7634ce0572c
['Nishu Jain']
2020-12-03 16:09:24.169000+00:00
['Machine Learning', 'Automl', 'Artificial Intelligence', 'Career', 'Data Science']
Comprendiendo los datos (Understanding the Data)
Head of Decision Intelligence, Google. Hello (multilingual) world! This account is for translated versions of my English language articles. twitter.com/quaesita
https://medium.com/datos-y-ciencia/comprendiendo-los-datos-a375769ef300
['Cassie Kozyrkov']
2020-02-27 01:30:57.651000+00:00
['Artificial Intelligence', 'Data Science', 'Technology', 'Analytics']
Building a Better Profanity Detection Library with scikit-learn
Having already ruled out 3 libraries, I put my hopes on the 4th and final one: profanity-filter. profanity-filter profanity-filter uses Machine Learning! Sweet! Turns out, it’s really slow. Here’s a benchmark I ran in December 2018 comparing (1) profanity-filter, (2) my library profanity-check, and (3) profanity (the one with the list of 32 words): A human could probably do this faster than profanity-filter can I needed to be able to perform many predictions in real time, and profanity-filter was not even close to being fast enough. But hey, maybe this is a classic tradeoff of accuracy for speed, right? Nope. At least profanity-filter is not dead last this time None of the libraries I’d found on PyPI met my needs, so I built my own. Building profanity-check, Part 1: Data I knew that I wanted profanity-check to base its classifications on data to avoid being subjective (read: to be able to say I used Machine Learning). I put together a combined dataset from two publicly-available sources: the “Twitter” dataset from t-davidson/hate-speech-and-offensive-language, which contains tweets scraped from Twitter. the “Wikipedia” dataset from this Kaggle competition published by Alphabet’s Conversation AI team, which contains comments from Wikipedia’s talk page edits. Each of these datasets contains text samples hand-labeled by humans through crowdsourcing sites like Figure Eight. Here’s what my dataset ended up looking like: Combined = Tweets + Wikipedia The Twitter dataset has a column named class that’s 0 if the tweet contains hate speech, 1 if it contains offensive language, and 2 if it contains neither. I classified any tweet with a class of 2 as “Not Offensive” and all other tweets as “Offensive.” The Wikipedia dataset has several binary columns (e.g. toxic or threat) that represent whether or not that text contains that type of toxicity. I classified any text that contained any of the types of toxicity as “Offensive” and all other texts as “Not Offensive.” Building profanity-check, Part 2: Training Now armed with a cleaned, combined dataset (which you can download here), I was ready to train the model! I’m skipping over how I cleaned the dataset because, honestly, it’s pretty boring — if you’re interested in learning more about preprocessing text datasets check out this or this. Are you also surprised the code is so short? Apparently scikit-learn does everything. Two major steps are happening here: (1) vectorization and (2) training. Vectorization: Bag of Words I used scikit-learn’s CountVectorizer class, which basically turns any text string into a vector by counting how many times each given word appears. This is known as a Bag of Words (BOW) representation. For example, if the only words in the English language were the, cat, sat, and hat, a possible vectorization of the sentence the cat sat in the hat might be: “the cat sat in the hat” -> [2, 1, 1, 1, 1] The ??? represents any unknown word, which for this sentence is in. Any sentence can be represented in this way as counts of the, cat, sat, hat, and ???! A handy reference table for the next time you need to vectorize “cat cat cat cat cat” Of course, there are far more words in the English language, so in the code above I use the fit_transform() method, which does 2 things: Fit: learns a vocabulary by looking at all words that appear in the dataset. Transform: turns each text string in the dataset into its vector form. 
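Since the code blocks in the original post were embedded as images, here is a small illustrative sketch of the vectorization step using scikit-learn's CountVectorizer; the toy sentences are made up, but the fit_transform call is the same one described above.

from sklearn.feature_extraction.text import CountVectorizer

texts = ["the cat sat in the hat", "the cat sat"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)  # fit: learn the vocabulary; transform: count words

print(vectorizer.vocabulary_)  # mapping from word to column index, e.g. {'cat': 0, ...}
print(X.toarray())             # one row of word counts per input sentence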
Training: Linear SVM The model I decided to use was a Linear Support Vector Machine (SVM), which is implemented by scikit-learn’s LinearSVC class. This and this are good introductions if you don’t know what SVMs are. The CalibratedClassifierCV in the code above exists as a wrapper to give me the predict_proba() method, which returns a probability for each class instead of just a classification. You can pretty much just ignore it if that last sentence made no sense to you, though. Here’s one (simplified) way you could think about why the Linear SVM works: during the training process, the model learns which words are “bad” and how “bad” they are because those words appear more often in offensive texts. It’s as if the training process is picking out the “bad” words for me, which is much better than using a wordlist I write myself! A Linear SVM combines the best aspects of the other profanity detection libraries I found: it’s fast enough to run in real time yet robust enough to handle many different kinds of profanity. Caveats That being said, profanity-check is far from perfect. Let me be clear: take predictions from profanity-check with a grain of salt because it makes mistakes. For example, it’s not good at picking up less common variants of profanities like “f4ck you” or “you b1tch” because they don’t appear often enough in the training data. You’ll never be able to detect all profanity (people will come up with new ways to evade filters), but profanity-check does a good job at finding most. profanity-check profanity-check is open source and available on PyPI! To use it, simply $ pip install profanity-check
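The training code itself was also shown as an image in the original post. A minimal reconstruction of the pipeline described above might look like the following; the tiny placeholder texts and labels stand in for the cleaned, combined dataset, and cv=2 is only there so the snippet runs on such a small sample.

from sklearn.calibration import CalibratedClassifierCV
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC

# Placeholder data: label 1 = "Offensive", 0 = "Not Offensive"
texts = ["have a great day", "you are awful", "what a lovely cat", "you are the worst"]
labels = [0, 1, 0, 1]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(texts)

# Wrap LinearSVC in CalibratedClassifierCV to get predict_proba()
model = CalibratedClassifierCV(LinearSVC(), cv=2)
model.fit(X, labels)

new_text = vectorizer.transform(["have a nice day"])
print(model.predict(new_text))        # predicted class label
print(model.predict_proba(new_text))  # probability per class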
https://towardsdatascience.com/building-a-better-profanity-detection-library-with-scikit-learn-3638b2f2c4c2
['Victor Zhou']
2019-12-01 06:19:00.297000+00:00
['Machine Learning', 'Python', 'NLP', 'Support Vector Machine', 'Scikit Learn']
13 Most Common Google Cloud Reference Architectures
13 Most Common Google Cloud Reference Architectures Summary of #13DaysOfGCP architecture Twitter series 👋 Hi Google Cloud Devs!! I have been asked multiple times to compile a list of the most common Google Cloud reference architectures. That got me thinking, and I started the #13DaysOfGCP mini-series on Twitter. If you were not able to catch it, or if you missed a few days, here is the summary! Kickoff and your topic 💭 suggestions Thank you to all who provided topic ideas and shared their thoughts. This has been instrumental in helping me pick the topics that you all were interested in seeing over the course of 13 days. Kickoff day to grab all your topic suggestions! Thank you for participating :) #1: How to set up hybrid architecture in Google Cloud and on-premises? #2: How to mask sensitive data in chatbots using the Data Loss Prevention (DLP) API? #3: How to build mobile app backends on Google Cloud? #4: How to migrate an Oracle database to Spanner? #5: How to set up hybrid architecture for cloud bursting? #6: How to build a data lake in Google Cloud? #7: How to host websites on Google Cloud? #8: How to set up a CI/CD pipeline on Google Cloud? #9: How to build serverless microservices in Google Cloud? #10: Machine Learning on Google Cloud #11: Serverless image, video, or text processing in Google Cloud #12: Internet of Things (IoT) on Google Cloud #13: How to set up the BeyondCorp zero trust security model? Wrap up with a Puzzle Conclusion and Credits I am thankful and truly humbled by your participation, engagement, and topic suggestions that made this series so much fun, not just for me but also for the rest of the Google Cloud community members! Thanks again and 👋 until later 🙂
https://medium.com/google-cloud/13-most-common-google-cloud-reference-architectures-23630b46326d
['Priyanka Vergadia']
2020-05-18 05:28:41.495000+00:00
['Machine Learning', 'Data Science', 'Google Cloud Platform', 'Serverless', 'Cloud Computing']
Doubly Linked Lists With JavaScript
Adding Methods to the Doubly Linked List We will now add methods to our Doubly Linked List class. In this article, we will handle the basic methods for push, pop, shift, and unshift. It is similar to building a Singly Linked List class, except we need to keep in mind that each node has both a next and a previous value. Push (Adding a node to the end) The push method will take a value as an argument and we will first create a new node with that value, which we are saving here as newNode. There is one edge case that we need to check for which is if the list is empty. We can do this by checking if there is no head, or if the length is equal to 0. If so, then we will simply set the head and the tail of the list to the new node. If the list is not empty, we will set the next property on the current tail to the new node, set the previous property on the new node to be the current tail, and set the tail property of the list to be the new node. Finally, we will increase the length of the list by one and return the list. push(val) { const newNode = new Node(val); if (this.length === 0) { this.head = newNode; this.tail = newNode; } else { this.tail.next = newNode; newNode.prev = this.tail; this.tail = newNode; } this.length++; return this; } Pop (Removing a node from the end) In the pop method, we will have two edge cases. First, if the list is empty, we return undefined. Second, if the length of the list is equal to 1, we will remove that node, so we will set the head and tail properties of the list to null. In other cases, we first need to store the current tail in a temporary variable so we do not lose the reference to it. Then, we will set the tail of the list to be equal to the previous node of the current tail. Next, we then need to sever the ties between the new tail and the old tail. This is done by setting the next of the new tail to be null and setting the previous of the old tail (stored as temp) to be null. Finally, we will decrease the length by 1 and return the old tail. pop() { if (this.length === 0) return undefined; const temp = this.tail; if (this.length === 1) { this.head = null; this.tail = null; } else { this.tail = temp.prev; this.tail.next = null; temp.prev = null; } this.length--; return temp; } Shift (Removing a node from the beginning) For a Doubly Linked List, the shift method is very similar to the pop method, except we will be working with the head, rather than the tail. We will again, first check for the two edge cases for length being equal to 0 and equal to 1 and do the exact same thing. Otherwise, we will store the current head in a temporary variable, then set the new head of the list to be the next value of the old head. Then we will sever the ties between the old and new head by setting the previous value of the new head to be null, and setting the next value of the old head (stored as temp) to be null. Finally, we will decrease the length of the list by 1 and return the old head. shift() { if (this.length === 0) return undefined; const temp = this.head; if (this.length === 1) { this.head = null; this.tail = null; } else { this.head = temp.next; this.head.prev = null; temp.next = null; } this.length--; return temp; } Unshift (Adding a node to the beginning) Just like shift was similar to pop, unshift is similar to push, except we will be working with the head rather than the tail. The unshift method takes one argument, and we will create a new node with that value. We will first check for the edge case of an empty list. 
If so, we will set the head and the tail properties of the list to be the new node. Otherwise, we will set the next value of the new node to be the old head, set the previous value of the old head to be the new node, and set the head of the list to be the new node. Finally, we increase the length by 1 and return the list. unshift(val) { const newNode = new Node(val); if (this.length === 0) { this.head = newNode; this.tail = newNode; } else { newNode.next = this.head; this.head.prev = newNode; this.head = newNode; } this.length++; return this; } The final code should look something like below. In our example, we have the Node class and Doubly Linked List class with our methods for push, pop, shift, and unshift. At the end of the code, we are creating a list and pushing the values 1–5 to the list. From here, you can add or remove nodes easily using the methods we wrote.
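The final gist is not embedded in this text, so below is a sketch of what the assembled classes could look like, using the methods written above (the exact Node shape, a val plus next and prev pointers, is inferred from how the methods use it):

// Sketch of the assembled Node and DoublyLinkedList classes from the methods above.
class Node {
  constructor(val) {
    this.val = val;
    this.next = null;
    this.prev = null;
  }
}

class DoublyLinkedList {
  constructor() {
    this.head = null;
    this.tail = null;
    this.length = 0;
  }

  // Add a node to the end of the list.
  push(val) {
    const newNode = new Node(val);
    if (this.length === 0) {
      this.head = newNode;
      this.tail = newNode;
    } else {
      this.tail.next = newNode;
      newNode.prev = this.tail;
      this.tail = newNode;
    }
    this.length++;
    return this;
  }

  // Remove a node from the end of the list.
  pop() {
    if (this.length === 0) return undefined;
    const temp = this.tail;
    if (this.length === 1) {
      this.head = null;
      this.tail = null;
    } else {
      this.tail = temp.prev;
      this.tail.next = null;
      temp.prev = null;
    }
    this.length--;
    return temp;
  }

  // Remove a node from the beginning of the list.
  shift() {
    if (this.length === 0) return undefined;
    const temp = this.head;
    if (this.length === 1) {
      this.head = null;
      this.tail = null;
    } else {
      this.head = temp.next;
      this.head.prev = null;
      temp.next = null;
    }
    this.length--;
    return temp;
  }

  // Add a node to the beginning of the list.
  unshift(val) {
    const newNode = new Node(val);
    if (this.length === 0) {
      this.head = newNode;
      this.tail = newNode;
    } else {
      newNode.next = this.head;
      this.head.prev = newNode;
      this.head = newNode;
    }
    this.length++;
    return this;
  }
}

// Create a list and push the values 1-5, as described above.
const list = new DoublyLinkedList();
[1, 2, 3, 4, 5].forEach((n) => list.push(n));
console.log(list.length); // 5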
https://medium.com/javascript-in-plain-english/doubly-linked-lists-with-javascript-9c20a9dc4fb3
['Chad Murobayashi']
2020-12-01 17:56:26.885000+00:00
['JavaScript', 'Algorithms', 'Software Engineering', 'Data Structures', 'Programming']
13 Octoberish Books
Photo by Rob Potter on Unsplash October is my second favorite month of the year, topped only by the glory that is November. Shorter, darker days, cozy reading by the fireplace, falling leaves, rain, fog, mist…I love everything to do with fall. But when I moved to Southern California, I had to work to get my autumn on. As a result, I started making lists of “Octoberish” books, movies, places, and music — things that would help me escape into a pleasurably melancholy state of mind. Though I enjoy a good scare as much as the next person (I wrote my master’s thesis on the importance of fear in children’s literature), for me, October doesn’t necessarily equate to spooky. It also includes things that evoke yearning, wistfulness, and gloom. The reading list that follows is a compilation. I’ve tried to avoid the obvious — from my beloved Brontës to my hero, Stephen King — since you already know about them. Instead, here are the best of the somewhat lesser-known books that I find perfectly Octoberish. Jonathan Strange & Mr. Norrell, by Susanna Clarke Okay, this was a bestseller, but it’s also one of my favorite books of all time, and it deserves to be more widely read. Two 19th century English magicians compete to see who is the more powerful — and unwittingly stumble upon the secrets of the shadowy John Uskglass, The Raven King. A companion collection of short stories, The Ladies of Grace Adieu, came out a couple of years after Jonathan Strange, which I devoured just as quickly. Night Film, by Marisha Pessl I loved the author’s first novel, Special Topics in Calamity Physics, and when I heard she had another book out, I was afraid that I wouldn’t like it as much. WRONG. Night Film is even better. It’s the story of a reclusive film director, his talented but troubled daughter, and the investigative journalist who pursues their story at the expense of all else. Pessl’s interstitial documentation of the journalist’s story adds to the dark not-quite-realism. House of Leaves, by Mark Z. Danielewski Freaky. Deaky. And way meta. With even more cool interstitial stuff along the lines of Pessl’s book, above. Someone should really write a paper comparing and contrasting this book with Night Film, because the similarities are fascinating. A filmmaker and his family move into a house that’s bigger on the inside than on the outside. What’s so scary about that? Read it and find out. The Little Stranger, by Sarah Waters I liked the recent movie, but the book is far better. It reads like something written at least 70 years ago — and I mean that as the highest of compliments. A country doctor is called to attend what’s left of an aristocratic Warwickshire family brought low by two world wars. Their decaying house — and what’s in it — haunts the family, the doctor, and ultimately, the reader. Long Lankin, by Lindsey Barraclough Forget Neil Gaiman and John Bellairs (well, not really): this is THE scariest book intended for older children that I’ve ever read. Barraclough expertly sustains dread and atmosphere to the very last page. This was one of the books I analyzed in my thesis, and it gets better with every re-reading. The Little Friend, by Donna Tartt Tartt, who wrote the Pulitzer-winning The Goldfinch, debuted with this book. It’s not perfect, but it’s well worth your time. The best approximation I can give you is that this is what would have happened if Shirley Jackson had written To Kill a Mockingbird. 
Now: TKAM is one of my favorite books, and Jackson is one of my favorite writers, and The Little Friend is not as amazing as all that — but that description should give a sense of its atmosphere. Neverwhere, by Neil Gaiman The title itself is brilliant, and the book is no less fabulous. It tells the tale of two travelers — a businessman named Richard and a damaged street girl called Door — and their adventures in the dangerous world of London Below. This is my favorite of many great books by Neil Gaiman. In the Forest of Forgetting, by Theodora Goss Yet another title that I envy — so evocative, and the book lives up to its promise. This collection of short stories is firmly in the gothic/slipstream tradition and reads like a bunch of the darkest, oldest fairy tales. Magic for Beginners, by Kelly Link Kelly Link’s work — all slipstream short stories — fills me with envy and awe. She’s won a ton of awards, and she’s deserved them all. Read “Stone Animals” or the eponymous short story and the rest of this collection. Then read Link’s other books, Stranger Things Happen and Pretty Monsters. Her tales are addictive and unlike anything else I’ve ever read. Orlando: A Biography, by Virginia Woolf This classic novel of the gender-shifting, seemingly ageless and immortal Orlando is a fascinating, dreamlike trip. What could be more Octoberish than living for centuries and watching those around you age and die? Woolf’s prose is gorgeous, and her moody spirit can’t help but leak through. Wylding Hall, by Elizabeth Hand I’m telling you: Elizabeth Hand looked inside my head and then whipped up the ideal short novel for me. To get away from fans and other distractions, a folk rock band rents an old English manor in order to record an overdue album. Trouble is, the manor is haunted. Any fan of 1970s British rock will find Easter eggs aplenty in this fascinating tale. The Widow’s House, by Carol Goodman Again, another novel that could have been commissioned for my birthday. It combines so many of my favorite things! Haunted house + Hudson Valley + writers + unreliable narrator = happy Luisa. The Elementals, by Michael McDowell A haunted house story in high Southern Gothic style. You think your family is dysfunctional? Read this book, and you’ll feel like you’re part of the Brady Bunch. Images of the mansion Beldame, sitting on a desolate beach in Alabama, will stay with you. As I look over the list, I see interesting patterns — lots of English novels, mostly books by women. I go more for the atmospheric than the graphic. Many of the books are set in England; those by Americans take place on the east coast or in the south. There doesn’t seem to be much of what I like set in here in the West. Clearly, I need to invent California Gothic…. I also find that in trying to encapsulate the essence of these books, I want to re-read all of them. Ah, Happy October to me! And to you. Let me know what you would add to the list. I’m always up for a new Octoberish read.
https://luisaperkins.medium.com/13-octoberish-books-ace7320e4a02
['Luisa Perkins']
2018-10-04 16:59:38.245000+00:00
['Melancholy', 'Books', 'Reading', 'Halloween', 'Book Review']
10 Tips and Tricks to Boost Your React App’s Performance in 2020
2. useState Lazy Initialization With Function There are times when we need to set the initial value of the state from some variable or from a function that returns a value. Let's take a look at the example below: const initialState = someFunctionThatCalculatesValue(props) const [count, setCount] = React.useState(initialState) Since our function is in the body, every time a re-render happens, this function gets called even if its value is not required (we only need it during the initial render). Let's see how to lazy initialize useState with a function: const getInitialState = () => someFunctionThatCalculatesValue(props) const [count, setCount] = React.useState(getInitialState) Note that getInitialState takes no arguments: React calls the initializer without any, so it simply closes over props from the component body. Creating a function is very fast. React will only call the function when it needs the initial value (which is when the component is initially rendered). So even if the function is taking a lot of time, it will be slow on the initial render only. This is called "lazy initialization." It's a performance optimization. Let's look at this GIF to get an idea of what that looks like: Lazy function vs. ordinary function To play around with the example above:
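The interactive sandbox linked in the original post is not embedded here; instead, here is a self-contained sketch of the same comparison (the component and function names are illustrative, not from the original):

// Sketch contrasting eager and lazy useState initialization.
import React from "react";

function expensiveInitialCount(start) {
  console.log("computing initial count..."); // with the lazy form, this logs only once
  return start * 2;
}

export default function Counter({ start }) {
  // Lazy: React calls the initializer only on the first render.
  const [count, setCount] = React.useState(() => expensiveInitialCount(start));

  // Eager alternative (for comparison): the function would run on every re-render,
  // even though its result is ignored after the first one.
  // const [count, setCount] = React.useState(expensiveInitialCount(start));

  return <button onClick={() => setCount(count + 1)}>{count}</button>;
}

Clicking the button triggers re-renders; with the lazy form the console message appears once, while the commented-out eager form would log on every click.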
https://medium.com/better-programming/10-tips-and-tricks-to-boost-your-react-apps-performance-in-2020-9388159f6ebf
['Harsh Makadia']
2020-08-31 17:24:17.501000+00:00
['Programming', 'JavaScript', 'Reactjs', 'React', 'Nodejs']
The Moment We Stopped Talking as Human Beings
The Middlemen Conundrum Many times we get frustrated, sending emails and making calls to people of the working class. They often don't answer our calls or emails. If they do, they will take days. When they finally reach you, there's a chance they didn't answer all or any of your queries. Then, when you send in another message, you repeat the whole process. Photo by Priscilla Du Preez on Unsplash 'Hi, this is Wei Xiang. I'm looking for Christina'. 'Yes, please hold on. I'll transfer you to her'. I waited. The line got cut off. I called again. 'Hi, this is Wei Xiang. I called just now but we got cut off. I'm looking for Christina'. I waited. The line got cut off. I called again. 'Hi, this is Wei Xiang, again'. 'Yes, I believe Christina is not in. Can you try again later'? Case in point. I eventually got to Christina the next day. That day, I demanded that they give my number to her and ensure that she got back to me. It is odd, isn't it? That somehow, in the corporate world, we can't seem to contact the person we want to. And for some arbitrary reason, it's not that person who decides if they want to talk to you. In short, some random middleman gets to decide if you two get to talk. Here's another ludicrous example: 'Hi, is this KHY health centre'? 'Yes it is, how can we help'? 'My brother (me) is terribly ill. He needs to see the doctor. Is she in'? 'Yes, but it's 3:30 and we close at 4. Can you come tomorrow instead'? 'We can make it before 4. Please, can you check with the doctor'? 'I'm sorry, please come tomorrow'. Guess I'll die then. Neither the doctor nor the patient gets to decide if we can see each other. The front desk did. We call these middlemen by many names: agents, dealers, assistants, clerks, retailers, etc. They're hired for various reasons. Agents and retailers distribute products and services. The front desk directs queries to relevant departments. In today's capitalist world, we need middlemen to ensure products and services remain competitive. They are vital to ensure that these products and services are aimed at the right consumers. They also ensure that our demands are met with the right source of supply. They bring us convenience, and we pay good money for said convenience. They're also a pain to talk to. Why? After talking to so many middlemen, I think the answer is simple. They don't know everything. They can't. No one can. If you work for Nike, I wouldn't expect you to know the fine details behind all of Nike's shoes. But the reason behind the pain and frustration of talking to these middlemen is that they either assume they know everything or don't admit to not knowing. When I called the NGO asking if I could volunteer at their place, human resources didn't know if the role I wished to volunteer for was available. They were in charge of hiring and recruitment, but they didn't know if there was even a spot for volunteering as a teaching assistant. Yet they didn't admit to not knowing. They made promises to get back to me. When the time came, walking away from someone silently is easier than apologising. When I emailed immigration, they didn't know the guidelines and processes behind evaluating applications. The front desk forwarded the email, yes, but whoever received the forwarded email didn't know. Instead of saying so, she'd rather send a ridiculous response. Playing dumb is easier than admitting ignorance. When Madam Tan promised to call me back after making her supposed phone calls, she didn't know that her phone calls would give disheartening news.
She didn’t know she couldn’t help me. When the time came, remaining silent is easier than admitting you’re powerless. When I called for Christina, the reception didn’t know if she was in. After repeatedly getting cut off, he ‘believed’ Christina isn’t in. He thought he knew. When my brother called the clinic, the front desk didn’t know if the doctor could see us or if she wanted to see us. She thought she knew everything. Even what’s best for the patient and the doctor. So, this is the middlemen conundrum. When they do their work, our world operates. We get responses from the correct person, we get the products and services we so desire, and we get the convenience we paid them for. In a world where information, products and services are so readily available, we cannot do without the middlemen. But when we try to talk to them as a human being, appealing to their empathy and vulnerability, they suck at it. They suck real bad.
https://medium.com/illumination/the-moment-we-stopped-talking-as-human-beings-d86ccce14a9a
['Wei Xiang']
2020-12-14 12:23:24.173000+00:00
['Self-awareness', 'Personal', 'Talking To Strangers', 'Thinking', 'Self Improvement']
Guest Posting on devconnected
Before submitting your article Before submitting your article, make sure that the following points are met: The subject of your article is related to at least one of the points above. Your article is well written, without too many misspellings. You agree not to remove your article once published. Are those three points met? Then it is time to publish your article on devconnected. Submitting your article Are you ready to publish your masterpiece? We made the process easy by creating this three-question form for you. Once you have submitted your article to us: You will be added as a writer to devconnected. You will add your story to the publication (using the following tutorial). Publication time! Your story will be visible on devconnected. Thank you! Antoine
https://medium.com/schkn/guest-posting-on-devconnected-1607602ef960
['Antoine Solnichkin']
2019-06-05 18:59:37.154000+00:00
['DevOps', 'Programming', 'Technical Writing', 'Blog', 'Writing']
Where are we in the Crypto Bubble?
Where are we in the Crypto Bubble? A 1996 conversation with Steve Jobs about the pending Internet boom just might give us a hint. The January 1996 cover of Red Herring magazine. Eleven months later, in a stunning move, Apple announced it would purchase NeXT Software for $400 million and bring Steve Jobs back to the company he cofounded. Author's note: People assume my relentless campaign in 1999 to warn people that the Internet stock bubble was poised to burst was my best call as an innovation industry observer. But the call I am most proud of was our January 1996 Red Herring cover with Steve Jobs sitting poised in front of his I.M. Pei staircase at NeXT's headquarters in Redwood City. This legendary interview, conducted just days before Pixar's IPO, was later cited in Walter Isaacson's authorized, self-titled biography of Steve Jobs. Can we compare Mr. Jobs's state of knowledge about the Internet opportunity in 1996 with the knowledge we have today about the blockchain boom? Let's give it a try. JOBS STORY By Anthony B. Perkins Red Herring, January 1996 Steve Jobs was happy to tell us that Pixar Animation Studios is "the only true digital studio in the world." Pixar gains this distinction, of course, by being the first digital effects house to actually produce a full-feature film entirely on a computer. Standing in front of a packed house at the San Francisco premiere of Pixar's new blockbuster, Toy Story, Mr. Jobs was at his best. He told a story of how his four-year-old son watches the Disney classic Snow White and the Seven Dwarfs over and over again. "The thought of participating in the production of a classic film such as Snow White, which our grandkids may watch in 30 years, is what's exciting for me." What is also special is that, on the heels of its first movie release, Pixar went public at an astounding market valuation, flying by the $1 billion mark by the end of its first day of trading. The preceding day, power journalist John Markoff of The New York Times wrote a story on the comeback of Mr. Jobs that landed on his paper's front page under the headline "Apple Computer Co-founder Reaps a Billion Dollars on Stock Issue." Mr. Jobs reached this billion-dollar zenith by buying Pixar from movie mogul George Lucas 10 years ago for $10 million, then pumping an additional $50 million into the company to keep it puffing until it finally showed a profit last year. Whether Pixar's whopping market valuation holds up or not, Mr. Jobs deserves all the success he can garner for hanging in there with his fledgling company. Few venture capitalists we know would have done the same. This magical success is neither the beginning, nor will it be the end of the Jobs story. In 1976, at the age of 20, Mr. Jobs co-founded Apple Computer. Over the following decade, Mr. Jobs designed the Apple II, and led the development, manufacturing, and marketing of the Macintosh and LaserWriter, two products that still make up the lion's share of Apple's $10 billion revenues. In 1985, Mr. Jobs left Apple to found NeXT Computer, a pioneer in object-oriented software. While the NeXT story is not nearly as glamorous as Pixar's, we think the company is going to have a hugely successful year in 1996, and may also go public soon. [The Herring got a full update on NeXT in our interview below with Mr. Jobs.] Mr. Jobs is betting NeXT on two new initiatives, by building software development frameworks for both Windows and the World Wide Web. When we met up with Mr.
Jobs for the interview, we noticed that some of his legendary idiosyncrasies still exist. The minute we showed up, he slipped out the back door “for a walk” — for 45 minutes. His PR people still dote on him. When our photographer tried to take photos during the interview, he snapped at her sarcastically and made her stop. Manipulation, selfishness, or downright rudeness, we couldn’t figure out the motivation behind his madness. But he gave us a great interview, so we don’t really care. What we do care about is that Steve Jobs is an original. He’s a guy who loves “people who work their butts off on a concept…and bring it to market.” It is very clear that he isn’t in business for the money, but for the opportunity to create, as he used to say, “insanely great” products. Over the course of the couple of hours we talked with him, he spoke proudly of the Mac, he described several of NeXT’s new technologies as “far-and-away the best in the world,” and his eyes sparkled when we asked about Toy Story. There is a slight difference in Mr. Jobs’ demeanor these days, however. He has ventured out of the infamous “Steve Jobs reality distortion field.” He understands now that the best technologies don’t always win. He thinks that the biggest issues that will determine the fate of the Web are not technical issues, but geopolitical. He also made several references to being “older” (he just turned 40), and even ironically commented that although Netscape co-founder Marc Andreessen is very smart, he is still “very young.” British psychologist and author Anthony Storr points out in his book Solitude — A Return to the Self that there is a fine line between the madman and the genius. Both require a rich fantasy life; to be the latter requires that you keep at least your big toe on the ground. Over the years, Steve Jobs has been both. But today he seems more firmly grounded in reality, and it is beginning to pay off. No matter what his plans are for the future, we hope he keeps pursuing his imagination. All you have to do is see Toy Story, or see a demo of WebObjects, and you’ll know what we mean. WHAT’S NEXT? While other CEOs are talking about ideas, Steve Jobs, CEO of NeXT Computer, says he’s talking about reality. In 1976, at the age of 20, Steve Jobs co-founded Apple Computer, where he not only built the Apple II, but helped develop and manufacture the Macintosh and LaserWriter. Since ’85 his entrepreneurial energies have been focused on running NeXT Computer, a leader in the object-oriented software market. Mr. Jobs spent an afternoon with Red Herring philosophizing about Netscape versus Microsoft, the Internet promise, and the pros and cons of youthfulness. Perkins: Beginning when Jim Clark met Marc Andreessen and founded Netscape, every technology executive seems to have had their own “road to Damascus” experience regarding their newfound faith in the Internet. How long have you been thinking about it? Jobs: For probably seven or eight years. I don’t know if you have tried the NeXT e-mail system, but it is really the best in the world. So we’ve been using the Internet for a long time to send mail to people. They never did that at Apple. The more relevant question, I think, is when did we start recognizing the value of the World Wide Web. NeXT has had a long association with the Web. Tim Berners-Lee, the European physicist who led the team that developed the original foundation for the Web, used NEXTSTEP. So we were somewhat exposed to it from the very beginning. 
But I don’t think we quite got it until maybe two years ago. That’s when we started to see that the Web was going to be phenomenal, and it was going to change the way people think of computing. The World Wide Web was invented by English scientist Tim Berners-Lee in 1989. Perkins: How so? Jobs: The old way to look at computing was as a straight line between the desktop and the enterprise, with the primary focus on improving desktop productivity. That world, as we all know, is owned by Microsoft. But the Web is changing all of that. One way to view the Web is as the ultimate direct-to-customer distribution channel. At least that’s how NeXT looks at it. Now who cares about that? Businesses! Suppliers! They are the people who can best leverage the Web by using it to conduct business and make money. So the Web completes the computing loop by providing businesses with a new way to interact with their customers. Perkins: Perhaps ironically one could say it’s like the old mainframe computing model, but with all of your customers hooked into your network, too. Jobs: Exactly! The browser is just a 3270 terminal [IBM workstation] on multimedia steroids. Right? Perkins: What do you think about Netscape’s vision that someday soon we will all be automatically hooked to the Net when we boot up our computers, and their Navigator platform will be our primary interface to the world? Jobs: I wish the world could work that easily, but it doesn’t. You are talking about ideas, I am talking about reality. Look, I love Marc Andreessen, he’s a great guy. But he’s young, and he’s got Microsoft to deal with. Perkins: But Netscape does have 10 million customers using the Navigator, and that is reality. Jobs: Yeah, but they give it away. They have probably made $20 million off their browser business. Do you think they will make a lot of money on 2.0? It just ain’t gonna happen. They don’t have 10 million customers anyway, they have four million. Perkins: We haven’t personally checked the numbers, but Netscape claims to have a system in Mountain View that identifies users every time they fire up Navigator, and so they can verify those numbers. Jobs: Okay. So maybe they do. And I think that’s wonderful. But, by the way, I couldn’t give a shit about the browser. We are not going to make any money by selling browsers, and I personally don’t think they are going to make any money from it either. If you can get a browser from Microsoft for free, why are you going to pay $39 to Netscape? Perkins: But if Navigator is platform-independent, and… Jobs: But everybody uses Windows. Come on — 90%+ of the people use Windows, so 90%+ of the people are going to hook into the Internet using Microsoft. Now, you know me, I love the Mac too, but I am trying to be really objective here. Plus Microsoft is… Perkins: …busy making all its apps Web-friendly. Jobs: Microsoft is busy trying to kill Netscape. And it has a certain track record of being successful at those kind of things. So I wouldn’t write off Microsoft right now. But all I am trying to say is that no one is going to make money by selling browsers. I do think a lot of people are going to make money off the pipes, but that ain’t us. The pipe is going to be owned by the RBOCs. Pac Bell and all those guys are going to provide cheap ISDN lines into the home that come with a little box that turns it into Ethernet, and they are going to be impossible to compete with. 
But, as we’ve been talking about, the new Web set-up is just like the mainframe computing model, where all the apps will run off the server, and these will mostly be custom apps. Perkins: Enter NeXT Computer. Jobs: Well, as it turns out, the businesses that can best use the Web are the exact same people we have been talking to for several years about NEXTSTEP and Enterprise Objects. Those customers now have a real need to build custom apps on the Web so they can vend products, information, and services to their customers. As we started to think about it, we came up with four categories of things these customers are going to want to do with the Web. [Mr. Jobs gets up and starts drawing on his whiteboard.] One, they are going to do static publishing. That’s where somebody makes a Web page and vends it. Anybody can look at it at the same time as 3,000 other people, and it doesn’t change until someone goes in there and changes it by hand. The second thing people will be able to do, which is going to be a lot more exciting, is what we call dynamic publishing. There are already a few examples of people doing this — like the Federal Express package-tracking Web page. You give it a number, it goes into four or five different databases and finds the information you need, and then presents it to you so you can browse it. Now there isn’t a little gnome in there that makes up this page for you, the computer makes the page for you. It’s a custom page answering your custom request, dynamically created on the fly. And this is just the tip of the iceberg. Have you seen our Chrysler demo? Perkins: Only when you demonstrated it on CNN. Jobs: I will show it to you in more detail in a few minutes, so you can have a better idea of what I’m talking about. The third big application for the Web will be commerce. The security issue here is the red herring, so to speak. And frankly, it is going to take Visa and MasterCard to solve this problem. Netscape can’t solve this problem; we can’t solve it. Before you start sending your credit card number all over the Web, you want someone to guarantee that if there is fraud, you aren’t going to be held responsible. And who can guarantee that? Not Netscape. Only Visa can say that, and it will solve that problem. The real issue here, however, is that if you are going to sell something over the Web, and you’re a medium-to-big company, you’ve got to have an order management system. But guess what? You already have an order management system that you’ve been running your company on! So, to be efficient, you need to tie the Web into your existing order management system. It becomes multi-platform that way, right? Perkins: Right. Jobs: Now you are constrained with the UI [user interface], but a lot of apps can be written in constrained UI. Look at the number of 3270 apps that have been written in the world and are still used. So, if you can constrain yourself to the existing UI today and write your app, not only will you get multi-platform capability, but you can roll your apps out to the contracted agents working for your company, and eventually out to your end-customers. For example, Merrill Lynch works with over 10,000 people who do not work directly for them, but help Merrill Lynch sell its products and services. Perkins: How long do you think it’ll be until this Web-centered world fully comes into play? Jobs: Static publishing is already happening today. Dynamic publishing is just beginning to happen, but is really going to be the big thing in 1996. 
Web-based commerce should also start kicking in 1996, and, in my opinion, building internal apps for the Web won’t really get going until 1997. When we looked at these developments, we realized that the final three require custom software. And that’s what we do here at NeXT, Custom ‘R’ Us, right? So we created this thing called WebObjects to help make it easier for people to build custom apps for the Web. For example, it took Federal Express four months to build its Web site — using WebOjects, you could build that same site in four hours. [Mr. Jobs then showed us the Chrysler Corporation Web site that the NeXT team built with WebObjects and an Oracle 7 database. During this demonstration, Mr. Jobs searched the site for several specific models of cars, at different price ranges, in different colors, and sorted in different ways, and each time he was instantly presented with a Web page that included all the cars he had requested. He also showed how the NeXT team had built a custom function into the site that allows customers to calculate their own financing options and identify which dealers have the exact models they are looking for.] Perkins: That’s pretty cool. How did you do that? Jobs: It takes your request, parses it in WebObjects, grabs all the data, and dynamically builds it into a Web page for you to browse. The way we set up the car financing feature is that it actually sends an OLE call to another Windows computer that launches an Excel spreadsheet that does the calculation for you, and then OLE messages the information back and shoves it onto the Web page. You can’t possibly do this in a static environment. I would think that this site is, what, an order of magnitude or two more dynamic than any other Web site out there right now. Wouldn’t you agree? And we set up the whole Oracle database, we built the whole app, we scanned in all the pictures — everything, in about 48 hours with four people. And, to reinforce something I talked about earlier, the site I just showed you will help Chrysler sell cars, because it distributes information to customers far better than Chrysler’s dealers can. Perkins: And pretty soon, with 3D immersion on the Web, you’ll be able to get in the car and test drive it. Jobs: I personally don’t think that will happen for a long time. But what will come soon, when we have MPEG decoder chips in every computer, is the ability to download a high-quality video so you can watch the car drive around. Perkins: Does WebObjects work across all platforms? Jobs: It’s very portable. It can run on our Mach operating system, it runs on Solaris, H-P UX, Digital UNIX, and it now runs on Windows NT. It’s also fully distributed, so you can have objects on different machines and one object can send a message to another object without even knowing where it is. In fact, you can move an object from one machine to another without ever changing the app, it just automatically works itself out. Perkins: It also seems to work pretty seamlessly with the Oracle database. Jobs: We discovered over the years that almost all mission-critical apps make extensive use of databases. So we tried to figure out the coolest way to integrate data sources with objects, and we came up with this thing called the Enterprise Objects Framework [EOF]. 
In essence, EOF allows you to graphically connect the data-structures in your objects with any SQL database, and it will automatically — automatically — make the data in your object persistent and coherent with that data in your database without any programming. And it is exceptionally powerful. EOF is far-and-away the most aggressive database technology out there for objects. It’s really slick. With EOF, you don’t have to know about SQL. It has full TCP/IP communications built in, so you don’t have to know about that. It also has Sybase and Oracle client libraries in it, so you don’t have go out and buy those. You literally just point it at the database on the network and it works! So when we wrote the WebObjects framework, we based it on our experience with EOF. What that means is that your object doesn’t have to know anything about the Web — it literally doesn’t have to know anything about its UI, it doesn’t have to know about HTML, it doesn’t have to know about URLs. And, like EOF, it doesn’t have to know about the database or the connectivity. Everything is taken care of automatically. So WebObjects is far ahead of anything anybody else is doing out there. Perkins: Why do you think you have built up such an advantage? Jobs: The reason we are ahead, I think, is because our understanding of the fundamental business model of the Web is more advanced than Netscape’s, or that of anybody else we’ve talked to. We’ve spent eight or nine years developing PDO [Portable Distributed Objects, NeXT’s object model], we’ve spent four years developing EOF, and we have just leveraged that by spending about 1.5 years developing the WebObjects framework. The other guys haven’t even gotten started yet. Perkins: When will WebObjects ship? Jobs: WebObjects is in alpha right now, it will go into beta by the end of this year, and we are shipping in production in the first quarter of 1996 — my guess is by February. Perkins: Do you worry about Microsoft? Jobs: My goal over the next few years is to stay far ahead of Microsoft, until the Web is so ubiquitous, that even Microsoft can’t own it. Perkins: Could a Windows-compatible-only Web strategy become Microsoft’s Achilles’ heel? Jobs: I gotta tell you, multi-platform compatibility ain’t what it used to be. Windows has won. It beat the Mac unfortunately, it beat UNIX, it beat OS/2. Perkins: But it took 10 years. [Laughs] Jobs: We can all laugh at how long it took, but then we can all cry about the fact that it did happen. An inferior product won, but it won. And there is no changing that. I still think multi-platform is important, but not as much as it used to be. Perkins: Netscape’s vision is that multi-platform capability is important, and… Jobs: Wait a minute, let’s zoom back for a moment. It’s not Netscape’s vision, it’s Tim Berners-Lee’s vision. His original idea was that the Web would become the circulation system connecting us all together. Netscape embraced this vision, and it has done a better job than any other company in doing so. But as we all know, Microsoft has embraced the same vision Netscape did two years ago. So to state that Microsoft and Netscape have diametrically-opposed views would be foolish. They have both bought into the Tim Berners-Lee vision, and Microsoft is going to be a force on the Web, whether you like it or not. Look, I remember the day when Microsoft entered the application business for the first time — its first programs ran on the Mac, not the PC. It was back in 1984, when we launched the Mac. 
Today, half of Microsoft’s revenues come from application software, and it is the leader in that business. Now I am not a cheerleader for Microsoft, but I think it would be stupid to think it isn’t going to be a big player with the Web. And don’t get me wrong. I take my hat off to Netscape. I love Netscape. Perkins: Why do you love Netscape? Jobs: I love Netscape because I love any group of people willing to work their butts off for 18 months to get something done and take a new concept to market. I love that! Perkins: How about Sun’s Java software? Jobs: My view is that putting a programming language like Java in the client will slow the Web down, and allow Microsoft to catch up. So while NeXT thinks Java is a fine language, and eventually it would be great to see it in the client, I actually feel that for the good of the Web, and for the good of the industry, the Web ain’t broke, so let’s not fix it. The most important thing right now is to let the Web accumulate users and establish ubiquity, until it’s so entrenched that even Microsoft can’t own it, and then let’s add in all the cool stuff. Now I am not denying that the UI on the Web strains its use, but I am a little worried that in the microcosmic lust for perfection, macrocosmically we will give Microsoft the time it needs to own the Web. I hope that doesn’t happen. Perkins: We suppose that’s one way to look at it. Jobs: As an example, I predict that by the end of this year, Microsoft will announce that it has a Visual Basic variant or deviant that it proposes as the Web-client language. And Sun and Microsoft will have a war. And Microsoft will put everything it has into that war, because if it can win, it will have killed Netscape along the way. Netscape will put everything it has into that war, because if it loses, it is in trouble. So I ask you, who will win that war? Probably Microsoft. I hate to say it, but it has a lot more resources. So, in a way, Java may be the undoing of some very good things that are happening with the Web right now. I want to emphatically say that I like Java, but I am looking at it from a geopolitical perspective, not a technology perspective. So having said all of that, WebObjects works perfectly with Java. [Laughs] Perkins: In our interview with Jim Clark, he said that Java “has a facility to protect you against the transmission of viruses, and a cryptography envelope that can wrap around and protect programs delivered over the Internet.” Jobs: Well, that is just not a true statement. We know a lot about cryptography here. We have invented far-and-away the best public key encryption technology in the world outside of what’s inside the NSA. It blows RSA away. We have been told by people who know. Therefore we know that any language in the client is going to be susceptible to viruses. Perkins: Any thoughts on @Home? You’ve already stated that the RBOCs will own the pipeline. Jobs: I think several of the RBOCs will make a lot of money selling unlimited use of ISDN and an Internet account for $20 to $25. That’s reality, that’s product you can have in your home in January, and @Home is talking about a cable modem product that I may be able to get in a year or two. I mean, fine, asymmetric cable modems are very interesting. But all I’m saying is that you have these multi-billion RBOCs who already have customers in every home in their territory, and they have trucks with people who can install new products, and they have bought a zillion servers and have set up Internet farms, and they are ready to roll. 
I’m just mentioning that that is a fact. Perkins: What about Apple? Jobs: Well, I love Apple. I hope they make it. Perkins: You sound concerned. Jobs: [Shrugs] Perkins: So what is NeXT’s growth strategy? Jobs: We have three major new things coming out in the next nine months. One is D’OLE, the distributed OLE product which will be out by the end of the year. We have WebObjects shipping in the first quarter, which takes us into the Web market in a very big way. And we have OpenStep for Windows shipping in the second quarter. Our big initiatives then are really the Web and Windows. With the introduction of our Windows product, we are really going from having 10% of the seats available to us, to having over 90% of the seats available to us. And we think WebObjects should become pretty big, because we think the Web is going to be pretty big. Perkins: Is an IPO in the near future for NeXT? Jobs: We don’t have to go public, but there are other issues such as employee liquidity and credibility with our customers. So I see an IPO sometime down the road. Perkins: Closing comments? Jobs: The Web is great because it breaks down two big barriers. It breaks down the platform barrier, because it is multi-platform, and it breaks down the internal/external barrier. Small to medium-sized customers will be able to share information seamlessly across the Web with their customers, and that should increase everybody’s productivity. But, again, I think the biggest issues that are going to determine the fate of the Web are not technical issues, they are business and political issues. Maybe I am getting too old, but that’s what I think.
https://tonyperkins.medium.com/steve-jobs-hes-back-1182c2eaf15f
['Anthony Perkins']
2018-07-31 16:10:49.571000+00:00
['Red Herring', 'Pixar', 'Apple', 'Next', 'Steve Jobs']
Royalty and Literature in ‘The Uncommon Reader’
Royalty and Literature in 'The Uncommon Reader' The Queen of England falls in love with reading in Alan Bennett's charming novella Photo Credit: Shadesofwords.com © Alan Bennett is most famous for his Olivier & Tony Award-winning play, The History Boys, a dramedy on the trials of high school boys heading to college. With The Uncommon Reader he veers into lightweight literary territory, distinct in his charm but low on gravitas. In The Uncommon Reader, we enter the royal household of England, where the Queen has just discovered the joy of reading on a visit to a mobile library. To accept a plot based on the premise that a monarch and diplomat, with all her fancy education, has not discovered literature requires a significant level of credulity. It is tempting to give up on the flippant story in the early pages of this short novella, but from one reader to another, there is nothing more riveting than to watch someone else fall in love with books. The Queen's foray into reading is aided and abetted by a kitchen boy, Norman, who acts as her personal curator for all things literary. She discovers that "Reading is untidy, discursive and perpetually inviting", a sentiment I cannot disagree with. Photo by Annie Spratt on Unsplash As we track her journey through a diverse selection of literary volumes, what drives the plot forward is everyone's reaction to her reading. A seemingly harmless hobby is less than welcome. The Prime Minister and her personal advisers are distinctly uncomfortable with this unproductive habit. They are worried that the Queen's increasing intention of reading books out at diplomatic events will cause some political scandal. Since they seem to have little control over the content she reads, they fear her reading tastes may be misconstrued as political campaigning or criticism. It's the twenty-first century, and people still fear the power of books. "It was reading, and love it she did, there were times when she wished she had never opened a book and entered into other lives. It had spoiled her. Or spoiled her for this, anyway" — Alan Bennett The Queen suffers from another side effect of reading, one that many others have suffered from since the beginning of time. She loses interest in any other worldly pursuits and is seized with the feeling that there isn't enough time to get through all the literature worth reading. And like most readers before her, she goes through the inevitable transition from being a reader to a writer, when she realizes that 'Reading is not doing…' and she needs to find her own voice. Photo by Priscilla Du Preez on Unsplash While the novella doesn't pretend to be anything more than what's in black and white, it's ironic that the reading journey of the uncommon reader is rather a common one. Once you fall for books, it's not so much the choices of what you read and what it means to you, but a universal satisfaction that comes with reading. It's that kindred spirit that unites all readers in the world. The same spirit that unites the kitchen boy and the queen in an unusual friendship, the plebeian and the royal. Is there any pursuit in the world that opens up the doors of the mind like reading? I wonder if Alan Bennett could have created the same charm if the Queen were to have discovered and loved knitting. Would it throw the Prime Minister into the same confusion if the Queen wanted to talk about purls and stitches on national television, as she wanted to talk about Dickens? I don't think so. The real protagonists of this story are the books.
Check out other bookish articles by Vipula Gupta
https://medium.com/books-are-our-superpower/royalty-and-literature-in-the-uncommon-reader-8fb341091389
['Vipula Gupta']
2020-12-06 14:56:01.581000+00:00
['Books', 'Reading', 'Literature', 'Culture', 'Book Review']
Python Financial Stock analysis (Algo Trading)
In this article we will dive into financial stock analysis using the Python programming language and the Yahoo Finance Python library. This tutorial covers fetching of stock data, creation of stock charts, and stock analysis using stock data normalization. The implementation will take place within a Jupyter Notebook, which we will install using the Anaconda data science platform. So without further introduction, let's get into the actual implementation. The following implementation is divided into three parts. If you get stuck at some point or just want to view this tutorial in another format, a YouTube link is attached to each section, where you can see the actual implementation being executed. Installing Anaconda and Yahoo Finance YouTube: https://youtu.be/8gGV6eGp9IQ If you already have Anaconda installed on your system then you can skip this part. Navigate to https://www.anaconda.com/products/individual and download the latest version for your operating system (Windows, Linux or Mac OS). Follow the installation tutorial; it requires around 500 MB of free space. Once installed, open Anaconda Navigator and ensure you can access Jupyter. Anaconda Navigator Next, open the Anaconda Prompt. Anaconda Prompt Execute the following to install the Yahoo Finance library. pip install yfinance To ensure that you have installed Yahoo Finance, you can execute the following command. pip show yfinance Fetch Stock data YouTube: https://youtu.be/8gGV6eGp9IQ Once Anaconda and Yahoo Finance have been installed, we're ready to create our first Notebook. It's in our Notebook that we will write and execute our Python code. To create a Notebook, navigate to Jupyter and click New to create a new one. In a Jupyter Notebook it's possible to write Python code directly in between plain text. You execute a block of Python code by clicking the following button. Jupyter Run Button Next we will import the following libraries, which are to be used throughout this guide. import pandas as pd import yfinance as yf import matplotlib.pyplot as plt Following the import of libraries, a list of companies is defined based on the ones that we wish to fetch stock data from. For this guide we will fetch data from Apple and Google. stocks = ["AAPL", "GOOGL"] You can stick with these or select stocks from other companies. This is done by navigating to Yahoo Finance or Google and searching for the company. There you will see their ticker symbols, e.g. Apple Stock code Once you have defined a list of companies, the Yahoo Finance library will be used to fetch data for a given period; this is done using the following line: data = yf.download(stocks, start = "2020-01-01", end = "2020-11-28") When using yf.download you need to provide the following three parameters: the stock list, the start of the period, and the end of the period. If you wish to look at the data fetched you can write: data.head() Fetched Stock data displayed using data.head() Creating Stock Charts YouTube: https://youtu.be/MwBtuEdFeBI Now that we have fetched stock data based on a list of companies in a given period, it's time to visualize the data by creating a chart as illustrated below: Stock Chart The first step in the creation of a chart is to define the parameters that the chart is to include. If you've tried to print the fetched stock data using data.head(), you will notice that our dataset consists of the following: Adj Close, Close, High, Low, and Open prices. In this series we will focus solely on the closing price on a given trading day.
To do so we need to create a new variable where we copy the closing data of our original fetched data. This is done using the following line: closedStocks = data.loc[:, "Close"].copy() With our closing price dataset in place, it's time to specify the parameters of the chart that is to be used. We will define the size of the chart, the font size, and the style. Feel free to adjust these parameters as they are purely cosmetic. closedStocks.plot(figsize = (18, 8), fontsize = 15) plt.style.use("seaborn") With the styling parameters set, we're left with the last and final line that will present our closing price stock data visually. plt.show() Normalize Stock Data YouTube: https://youtu.be/COwS2JcZ2tM If we look solely at our stock chart, it might seem that Apple is doing far worse than Google in terms of performance, but the stock chart can be misleading as it does not account for the different price ranges in the period, e.g. Apple starting at 7.6 US dollars and Google at 313 US dollars. What we want to do is normalize the data in order to get a better overview of the price performance over time. To normalize the data we divide the series by its first element and then multiply by 100; this will make all stocks in our chart start at 100 and display the percentage change each day from the start to the end of our chosen period. The normalization is achieved using the following line: norm = closedStocks.div(closedStocks.iloc[0]).mul(100) Following this we can either choose to reuse our previously defined chart or create a new one to illustrate the difference. For the sake of this example we will define a new plot chart. norm.plot(figsize = (18, 8), fontsize = 15) plt.style.use("seaborn") plt.show() Normalized Stock Chart If we compare the two charts, it's clear that the Apple stock is actually performing much better than the Google stock when looking at the price increase over time. Using this approach we can start to compare and analyze the performance of different stocks.
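Putting the pieces together, a consolidated version of the whole notebook might look like the sketch below. It simply chains the steps above; note that on recent matplotlib versions the style is registered as "seaborn-v0_8" rather than "seaborn", so adjust if needed.

# Consolidated sketch of the workflow above: fetch, copy closing prices,
# normalize, and plot. Tickers and dates mirror the article.
import matplotlib.pyplot as plt
import yfinance as yf

stocks = ["AAPL", "GOOGL"]
data = yf.download(stocks, start="2020-01-01", end="2020-11-28")

closedStocks = data.loc[:, "Close"].copy()

# Divide every row by the first row and multiply by 100 so each stock
# starts at 100 and the chart shows percentage change over the period.
norm = closedStocks.div(closedStocks.iloc[0]).mul(100)

plt.style.use("seaborn")  # use "seaborn-v0_8" on matplotlib >= 3.6
norm.plot(figsize=(18, 8), fontsize=15)
plt.show()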
https://medium.com/vinsloev-academy/python-financial-stock-analysis-algo-trading-4d5304d07416
[]
2020-12-17 07:09:16.657000+00:00
['Data Science', 'Python', 'Finance', 'Stock Market', 'Algorithmic Trading']
A Perfect Sleep…
Falling asleep between your arms with my cheek against your chest, Enveloped in the warmth of your love, Perfect dreamless sleep…
https://medium.com/3-lines-story/a-perfect-sleep-2ec019988ab3
[]
2017-09-01 14:06:56.208000+00:00
['Storytelling', 'Relationships', 'Love', 'Poetry', 'Dating']
Screening + Drug Treatment = Increase in Veteran Suicides
By Robert Whitaker & Derek Blumke The alarm over suicides of military veterans has been regularly sounded over the past 15 years, prompting the U.S. Department of Veterans Affairs to declare that “preventing suicide among Veterans is the VA’s top clinical priority.” The VA’s 2019 report on suicide provides reason to sound the alarm again, for it tells of a suicide rate that has continued to climb, particularly for younger veterans who have served since 9/11. Indeed, a close review of VA data provides reason to conclude that the rise in suicide is being driven, at least in part, by the VA’s suicide prevention efforts. Its screening protocols have ushered an ever greater number of veterans into psychiatric care, where treatment with antidepressants and other psychiatric drugs is regularly prescribed. Suicide rates have increased in lockstep with the increased exposure among veterans to such medications. According to the 2019 report, the age-adjusted suicide rates for all veterans rose from 25.5 per 100,000 population in 2005 to 35.8 per 100,000 in 2015, a 40% increase. The rate for veterans using Veterans Health Administration (VHA) facilities rose from 29.6 per 100,000 in 2005 to 40.1 in 2017, a 35% increase. The increase in suicide was even more pronounced for veterans 18 to 34 years old. The suicide rate for this group increased from 25.3 per 100,000 in 2005 to 44 per 100,000 in 2017, a 74% increase. For this age group using VHA facilities, the suicide rate rose from 20.9 per 100,000 in 2005 to 51 per 100,000 in 2017, a 144% increase. In a previous MIA Report, we made the case that “suicide prevention efforts,” which are based on the premise that suicide is a “public health” problem that can best be addressed by getting more people diagnosed and into treatment, have fueled a steady rise in suicide rates in the United States since 2000. An investigation into the rise in suicide among veterans can be seen as a “replication” study. In the VA’s data, is it possible to identify the same treatment-related forces at work? There is, buried in the VA’s 2016 suicide report, data on the suicide rates for treated and untreated patients that make this a question of particular urgency. Suicide in the Civilian Population The initiation of suicide prevention efforts in the United States began in the late 1980s, shortly after Prozac-the first “selective serotonin reuptake inhibitor” antidepressant-came to market. The American Foundation for Suicide Prevention was established then, and although it promoted a “public health” message, it was a non-profit that was funded, to a significant extent, by pharmaceutical companies, which understood that a “suicide prevention” campaign would boost sales of their drugs. Academic psychiatrists with financial ties to the makers of antidepressants led its scientific advisory board and served terms as directors of the foundation. The American Foundation for Suicide Prevention promoted a very simple message to the American public. Suicide was said to be a “public health” problem that was “under-recognized.” Many people with mood disorders went “untreated” and this was the population at particularly high risk of suicide. People who were feeling depressed or suicidal were urged to seek help from mental health professionals. The Foundation pushed screening programs as a way to get more people into treatment. Its advisory board and presidents touted antidepressants as “anti-suicide” pills. 
“Use of antidepressants to treat major depressive episode is the single most effective suicide prevention measure in Western Countries,” said Columbia University psychiatrist John Mann, who had long been a fixture on the Foundation’s scientific advisory board, and served for a time as its president. The American Psychiatric Association, the National Alliance on Mental Illness, and the pharmaceutical companies that sold antidepressants all helped promote this message to the public. In 1997, their efforts prompted both houses of Congress to declare suicide a “national problem.” Two years later, U.S. Surgeon General David Satcher issued a “Call to Action to Prevent Suicide,” and the U.S. Department of Health and Human Services formed a task force, composed of individuals and organizations from the private and public sectors, to develop a “National Strategy for Suicide Prevention.” The task force published its recommendations in 2001, which doubled down on the “public health” approach that had been promoted by the American Foundation for Suicide Prevention. Government agencies at all levels-federal, state, and local-launched suicide prevention efforts. Crisis call centers were established; depression screening programs were initiated; checklists for assessing the risk of suicide in depressed patients were developed; and medical professionals were trained to recognize the “warning signs” of suicide. The goal was to get more people struggling with mood disorders into treatment, with antidepressants recommended as a first-line therapy. These efforts have been successful in that regard: the prescribing of antidepressants has increased steadily since 2000. Yet, since 2000, the age-adjusted suicide rate for the American population, rather than decrease, has risen steadily, from 10.4 per 100,000 to 14.0 per 100,000 in 2017. The failure of this approach to suicide prevention, which emphasizes getting people into treatment, is not a uniquely American phenomenon. In the 1990s, the World Health Organization urged countries around the world to develop national mental health policies and to improve their mental health services, which included providing their citizens with better access to psychiatric medications. The belief was that this would lead to better mental health outcomes, which would become visible in the form of reduced suicide rates. Researchers have now conducted three studies of whether such efforts have affected suicide rates, and all came to the same conclusion: improved access to psychiatric services and psychiatric drugs was associated with an increase in national suicide rates. In a similar vein, a large Danish study of suicides in Denmark from 1996 to 2009 came to the disturbing conclusion that the risk of suicide escalated dramatically upon entry into psychiatric care.
They found that, in comparison to a matched control group who hadn’t had any involvement with psychiatric care during the previous year, the risk of suicide was: 5.8 times higher for people receiving psychiatric medication (but no other care) 8.2 times higher for people having outpatient contact with a mental health professional 27.9 times higher for people having contact with a psychiatric emergency room 44.3 times higher for people admitted to a psychiatric hospital Although it could be expected that the suicide risk would increase for each step up the “treatment” ladder, what surprised the researchers was that it occurred even in those who were married, and in those with higher incomes or higher levels of education and no prior history of attempted suicide. In an accompanying editorial, two Australian experts in suicide research wrote that the findings “raise the disturbing possibility that psychiatric care might, at least in part, cause suicide.” The Impact of Antidepressants Although the promoters of suicide prevention programs have touted antidepressants as a treatment that reduces the risk of suicide, evidence that the drugs might have the opposite effect arose in the first clinical trials of Prozac. A significant percentage of patients given Prozac suddenly developed suicidal thoughts, and after Prozac came to market, there were numerous case reports of patients prescribed the drug committing suicide. By the early 1990s, with Paxil and other SSRIs entering the marketplace, the FDA was forced to convene a public hearing on this concern. The controversy has circulated within research circles-and in the public mind-ever since. There is clear evidence that SSRIs and SNRI antidepressants can provoke suicidal impulses and acts in some users, and the reason is well known. SSRIs and other antidepressants can stir extreme restlessness, agitation, insomnia, severe anxiety, mania and psychotic episodes. The agitation and anxiety, which is clinically described as akathisia, may reach “unbearable” levels, and akathisia is known to be associated with suicide and acts of violence, including homicide. At the same time, there are many people who tell of how SSRIs or some other antidepressant saved their lives, as their suicidal impulses waned after going on the drugs. Individual responses to antidepressants may vary greatly, and so the public health question is about the net effect of these drugs on suicide rates. On the whole, do they increase or decrease the risk of suicide in people so treated? There are several types of studies that provide an “evidence base” for answering that question. Randomized Clinical Trials (RCTs) RCTs are seen as the “gold standard” for assessing the merits of a drug treatment, at least over the short term, as the trials typically last only six weeks or so. Thus, the trials might be expected to identify whether SSRI and SNRI antidepressants trigger any increase in suicidal behavior when patients are first put on the medication. The FDA, in its review of industry-funded RCTs of SSRIs and other antidepressants, concluded that these drugs increase the risk of suicidal thinking for those under 25 years old; have a neutral effect on those 25 to 64 years old; and are protective against suicidal thinking for those over 64. However, it is well known that the pharmaceutical companies sought to downplay-or hide-suicidal risks in the trial reports they sent to the FDA. 
In 2003, UK psychiatrist David Healy and a team of Canadian researchers conducted an exhaustive meta-analysis of all RCTs of SSRIs, which incorporated findings from trials that weren’t funded by pharmaceutical companies, and they found that suicide attempts were 2.28 times higher for those treated with an SSRI compared to placebo. Two recent analyses of clinical trial data have led to similar findings. First, after reviewing 64,381 pages of clinical study reports filed with the European Medicines Agency, Peter Gøtzsche and colleagues concluded that antidepressants doubled the risk that patients would experience akathisia, a known risk factor for suicide. Second, European researchers Michael Hengartner and Martin Plöderl conducted an exhaustive review of the trial data submitted to the FDA for all antidepressants approved from 1991 to 2013, a database that numbered more than 40,000 patients, and they determined that suicide attempts were 2.5 times higher for those taking antidepressants compared to placebo. Observational Studies in Primary-Care Patients During the past 25 years, there have been hundreds of studies of different types-observational studies, epidemiological studies, and so forth-that have sought to assess whether antidepressants are protective against suicide, or conversely increase suicide rates. Those studies have produced a welter of conflicting findings. However, the studies that are of most relevance to suicide prevention efforts are those that look at patients diagnosed with depression in a primary care setting, and then chart suicide rates for those who take an antidepressant and those who avoid doing so. That initial choice serves as a fork-in-the-road moment, the patients heading down either a medicated path or a non-medicated one. There are at least three studies that provide a comparison of suicide rates for these two paths following a diagnosis of depression in primary care. In a 1998 study of 35,436 people in the Puget Sound area of Washington, the risk of suicide was 43 per 100,000 person years for those treated with an antidepressant in primary care, compared to zero per 100,000 person years for those treated in primary care without antidepressants. In a 2003 analysis of UK patients treated in primary care for depression (or for other “affective disorder”), Healy and Chris Whitaker concluded that the suicide rate for those taking an SSRI was 3.4 times greater than for those who chose “non-treatment.” In a study of 238,963 UK patients who experienced a first episode of depression between 2000 and 2011, researchers reported that there was an increased risk of suicide for those treated with SSRI antidepressants during the first month of treatment, and again in the first month after quitting the drug. This last study reveals why observational studies that simply assess medication use at the time of a suicide may present skewed results. The high-risk period that occurs when someone is withdrawing from an SSRI antidepressant is a risk that comes from going on the drug in the first place, yet, in most observational studies, the suicide is chalked up to the “off medication” column.
There are other classes of drugs that may contribute to an increased suicide risk, most notably benzodiazepines. Polypharmacy does so as well. The focus on antidepressants is because this class of drugs is regularly touted, in suicide prevention programs, as protective against suicide. Suicide Prevention at the VA The VA launched a focused suicide prevention effort in 2006, when it appointed a National Suicide Prevention Coordinator. The following year it established a toll-free Veterans Crisis line, and since then it has steadily increased the resources devoted to this effort. One such effort was its Make the Connection campaign, which spent nearly $20 million to market the VHA’s services to veterans. Like the federal government, the VA conceives of “suicide” as a “public health” problem, and thus its bottom-line goal is to increase the identification of mental health problems in the veteran community, and get those who are given a mental health diagnosis into treatment, which is expected to reduce the suicide risk. About two-thirds of the 20 million veterans in the United States do not use Veterans Health Administration (VHA) facilities, and the VA, since it launched its toll-free crisis line in 2007, has regularly conducted mental health awareness campaigns to encourage veterans who are not users of VHA care to seek medical help for such problems. In 2018, it partnered with a pharmaceutical company, Johnson & Johnson, to promote a No Veteran Left Behind campaign, which featured Tom Hanks and urged “each and every American” to reach out to veterans in crisis and help get them “mental health services.” The communications plan was “led by Johnson & Johnson.” Within its own VHA facilities, the VA has introduced mandatory screening for all vets. Every vet entering the system is screened for depression, PTSD, and substance abuse, and all those who are diagnosed with a psychiatric disorder are assessed for “suicidal intent.” Such screening continues to be a regular feature of VHA care, as screening of some type is part of every patient appointment. Every vet is screened annually for depression, and all of those diagnosed with PTSD upon entry into VHA care are rescreened for this disorder for the next five years. With this regular screening in place, the percentage of VHA patients with a mental health diagnosis has increased steadily. In 2014, 41% of VHA-using veterans had a mental health or substance abuse diagnosis (SUD), with depression the most common diagnosis. Although the VA’s most recent suicide reports haven’t detailed this diagnostic information, it can be safely assumed that this steady increase in diagnosis has continued, and that perhaps 44% of VHA-using veterans had a mental health/SUD diagnosis in 2017. Treatment in the VA The VA’s clinical care guidelines for treating depression and PTSD, which are the two most commonly diagnosed psychiatric disorders, recommend SSRI and SNRI antidepressants as first-line therapies. However, the guidelines also provide recommendations for CBT and other psychotherapies, and particularly in primary care settings, these treatments may be offered as stand-alone therapies. Even so, the prescribing of antidepressants to those diagnosed with depression or PTSD is routine. According to a 2015 report by the U.S. General Accounting Office (GAO), 94% of all VHA patients diagnosed with depression from 2009 to 2013 were prescribed an antidepressant.
Studies of VHA patients diagnosed with PTSD have reported that about 80% are prescribed a psychiatric medication, with antidepressants the drug class of choice. A 2019 GAO report that sampled prescribing practices at a small number of VHA facilities found that only 15% of diagnosed patients were offered psychotherapy in lieu of drug treatment. Forty-four percent were offered a combination of both, and the remaining 41% drug treatment only. Polypharmacy is also common in VA settings, particularly since half of those diagnosed are said to be co-morbid for two or more mental disorders. The 2019 GAO report found that among those diagnosed with depression who were treated in “primary care and specialty mental health care” settings, 35 percent were taking two classes of psychiatric drugs, and 15% three classes of drugs. The polypharmacy was even more pronounced for those diagnosed with PTSD: 36% were taking two classes of psychiatric drugs and 25% were taking three or more classes. Suicide Rates For VHA Patients As noted above, the suicide rate among veterans has been steadily rising since 2005. The VA, in a 2016 report, divided the VHA-using patients into four subgroups: Undiagnosed and untreated (for a mental health or substance abuse disorder) Undiagnosed and treated (with either a psychiatric drug or non-pharmacologic treatment) Diagnosed and untreated Diagnosed and treated If antidepressants are effective in reducing suicide risk, it is easy to see how the suicide rate for the four groups should stack up. Given that the VHA regularly screens for mental health disorders, the “undiagnosed” patients apparently do not show the symptoms of depression, PTSD or any other psychiatric disorder that, during the screening process, would generate a diagnosis. These patients should be at a low risk of suicide, and thus any treatment to patients in this “undiagnosed” category should-at least in theory-further reduce this risk since it is being prescribed as a balm for whatever ailment is bothering the patients. As such, the undiagnosed/treated patients could be expected to have a lower suicide rate than the undiagnosed/untreated group. The “diagnosed” patients should be at a higher risk of suicide. Suicide prevention efforts focus on getting these patients into treatment, with antidepressants seen as a first-line therapy that can lower the risk of suicide. Thus, if suicide prevention efforts are helpful, the suicide rate for the diagnosed patients who are treated should be lower than for diagnosed patients who, for whatever reason, shun treatment. The 2019 GAO report found that 18% of diagnosed patients did not get treatment. Here are the results. First, those without a diagnosis who got MH treatment were more likely to die by suicide than those without a diagnosis who did not access such treatment. In 2014, those who got treatment died at twice the rate of the “untreated” group. Second, those with a mental health or substance abuse diagnosis who got mental health treatment were roughly twice as likely to die by suicide as those who had a diagnosis but did not access mental health treatment. This difference in suicide rates for the treated and untreated groups is consistent over time, year after year. Moreover, these findings are based on the VA’s review of millions of patient records.
The VA published them in a 2016 report that it touted as the “most comprehensive analysis of Veteran suicide in our nation’s history.” There is a third comparison that can be made that leads to a particularly troubling finding. It could be expected that in any comparison between the less severely ill (the undiagnosed) and the more severely ill (the diagnosed), the less severely ill group would have a lower suicide rate, regardless of any treatment effect. Yet, from 2010 to 2014, those without a diagnosis who got MH treatment were more likely to die by suicide than those with a diagnosis who were “untreated.” In sum, the data for the four subgroups tells of a suicide risk in 2014 that increased in this steplike fashion: Undiagnosed/untreated: 24.8 per 100,000 Diagnosed/untreated: 34.4 per 100,000 Undiagnosed/treated: 47.6 per 100,000 Diagnosed/treated: 68.2 per 100,000 This is a finding that runs directly counter to the variable suicide rates that would be expected if “treatment” for a mental health condition were effective. However, it is a finding consistent with RCT data showing that antidepressants double the risk of suicide compared to placebo. In the comparison for “diagnosed” patients, that is precisely the increase seen in the treated group. The Perils of Diagnosis The suicide rates for those diagnosed with a mental health or substance abuse diagnosis have remained stable since 2005. Year after year they have hovered around 70 per 100,000 population. The reason that the suicide rates for VHA patients have been rising is that the VA’s suicide prevention efforts-the outreach campaigns and the mandatory screening-have led to a steady increase in the number of veterans diagnosed and treated for those disorders, and as the VA’s subgroup data shows, this moves patients into a category that has the highest suicide rate. This gets to the mathematical heart of why the VA suicide prevention efforts have failed. The effort is based on the premise that screening will get patients into treatment that will lower their suicide risk, thus lowering the overall suicide rates. But the subgroup data shows that treatment is elevating the risk of suicide, which leads to this tragic equation: screening + drug treatment = increase in veteran suicides. The rise in suicide among younger veterans reveals, with great clarity, the perils of screening that leads to treatment with psychiatric drugs. The VA, as part of its suicide prevention strategy, has sought to bring younger veterans newly discharged from the military into VHA care so that an initial screening for depression, PTSD, and other psychiatric disorders can be done, and their suicide risk assessed. There were 1.2 million veterans who entered VHA care from 2002 to 2015, and 58% of those veterans were given a psychiatric diagnosis. With this screening protocol and outreach in place, the number of 18-to-34-year-old veterans who received VHA care grew rapidly from 2005 to 2017, from 344,938 to 687,936. Given that 58% of veterans entering care during this period were given a psychiatric diagnosis, it is reasonable to conclude that a high percentage of this population were prescribed antidepressants and other psychiatric medications, which means they were ushered into the “diagnosed and treated” subgroup with the highest risk of suicide. The suicide numbers tell the tragic result. The suicide rate for VHA-using veterans 18-to-34-years-old rose from 20.9 per 100,000 in 2005 to 51 per 100,000 in 2017, a 144% increase.
Moreover, because the number of VHA patients in this age category grew as well, the total number of suicides by younger veterans in VHA care jumped from 68 in 2005 to 351 in 2017, a five-fold increase. Veteran Suicides Outside VHA Care The VHA is understood to provide care to veterans with greater social and medical difficulties than those veterans who receive care outside the VHA. Yet, in non-VHA settings there are similar pressures to screen for depression and prescribe antidepressants to those who are so diagnosed. If such pressures are driving up suicide rates for VHA-using veterans, then it could be expected that they would do so in general medical settings as well. The VA’s 2019 report documents that this is so. The suicide rate for all non-VHA using veterans rose from 24.3 per 100,000 in 2005, to 33.9 per 100,000 in 2017, a 40% increase. And just like in the case of young VHA-using veterans, the increase in the suicide rate among veterans ages 18 to 34 has been particularly pronounced in non-VHA settings, rising from 26.2 per 100,000 in 2005 to 41.1 per 100,000 in 2017, a 57% increase. There is one other noteworthy element in this last comparison of suicide rates in the two settings. At the start of the VA’s suicide prevention efforts, the suicide rate was lower in VHA settings than in non-VHA settings for the 18–34 age group. Yet, since 2005, the increase in the suicide rate for young veterans in VHA facilities, where screening is imposed with a military rigor, has been much greater than in non-VHA settings. In 2017, the rate was 24% higher for VHA-using young veterans. This may be one more data point telling of the harm done from the VA’s suicide prevention efforts. The Puzzle Pieces Fit Together The rising suicide rate among veterans since 2005, which is when the VA launched its focused suicide prevention efforts, simply serves as a starting point for an investigation: is there evidence that could explain why such efforts would drive suicide rates upwards? In this case, there is evidence of many types that fit together into a coherent whole. Specifically: Efforts to improve mental health programs and increase access to psychiatric drugs in global settings have led to increased suicide rates. RCTs have found that antidepressants increase the risk of suicide. Observational studies of depressed patients treated in primary care have found that those who take antidepressants have a higher rate of suicide during follow-up periods The VA reports tell of an increasing number of veterans exposed to antidepressants and other psychiatric drugs. In the VA’s subgroup findings, treated patients in both the diagnosed and undiagnosed comparisons have much higher suicide rates than the untreated patients. In the VA’s subgroup findings, from 2010 forward, treated patients without a diagnosis had a higher suicide rate than untreated patients with a diagnosis. As can be seen, there is in this review a collection of evidence that tells of psychiatric treatment that increases the risk of suicide, with the RCT data for antidepressants at the center of this “evidence base.” The VA’s suicide reports tell of an increasing number of veterans diagnosed and treated with antidepressants, which could be expected to produce a rising suicide rate, and that is precisely what has occurred since 2005, when the VA initiated its focused suicide prevention efforts. 
The VA’s Explanations for the Rising Suicide Rates The VA reports tell of how suicide rates have risen for all veterans, both for those who use VHA care and those who do not. The VA offers several explanations for this rise in suicide in both settings. First, it notes that this rise has occurred against a backdrop of rising suicide rates in the general public, and thus is not limited to the VA. Second, the VA explains that the suicide rate is higher for VHA-using veterans because, as a group, they suffer from a broader range of ills than the non-VHA-using veterans, such as physical illness, combat injuries, fewer financial resources, and homelessness. Third, the reports cite research comparing the two medical systems that found that “engagement in VHA mental services was associated with decreased rates of suicidal ideation and suicide attempts.” All of that may be true. But none of these explanations address the basic question investigated in this MIA Report: Do suicide prevention efforts that focus on screening for depression and other mental health disorders, with those so diagnosed then regularly treated with antidepressants, increase suicide rates? If so, they could be expected to increase the suicide rate in the general public, in veterans that use non-VHA medical facilities, and in VHA-using veterans. And that is precisely what has occurred since suicide prevention efforts have been introduced into general medical settings, and into VHA care. Harm Done There have been more than 70,000 suicides by veterans since 2006, when the VA launched its suicide prevention efforts. It is impossible to calculate the extent of the harm done that is evident in this review of the rising suicide rates in this community of 20 million veterans. However, it is easy to conclude that if suicide rates had remained stable at the rate they were in 2006, there would have been 10,000 to 15,000 fewer suicides among our veterans. That is a number greater than the total of all combat deaths since 9/11. All of this begs for further investigation by the U.S. Department of Veterans Affairs.
https://medium.com/mad-in-america/screening-drug-treatment-increase-in-veteran-suicides-39f7599036c4
['Mad In America']
2019-12-23 22:30:20.676000+00:00
['Suicide', 'Mental Health', 'Psychiatric Drugs', 'Investigative Journalism', 'Veterans']
Adding support for Arabic and Hebrew languages on Airbnb
What makes this noteworthy? Airbnb’s mission is to create a world where anyone can belong anywhere, and with this release, we enable more than 300 million Arabic- and Hebrew-speakers globally to belong on Airbnb. Arabic and Hebrew are both languages that use a right-to-left (RTL) writing direction. The start of the page in an RTL layout is the right-hand side. All text and visual elements are aligned to the right and flow towards the left. On a Right-to-left layout, the information flows from right to the left (Figure 1) We begin our process in the same way as adding any other language: translating all of our strings, defining date formats, number formats, and linguistic rules (e.g. pluralization). These are localization sensitivities. But as soon as we introduce a right-to-left layout, the scope of our work increases drastically. These are visual, functional, and behavioral sensitivities. Visual and naming considerations Let’s look at some examples. It’s unclear which star is the first star in the following interactive star rating component. Interactive star rating component (Figure 2) For us to know which star is first, we must know the page direction. For languages that are written left-to-right, the first element is the leftmost; for languages that are written right-to-left, the first element is the rightmost. Left-to-right: Left-to-right, 1 star selected (Figure 3) Right-to-left: Right-to-left, 1 star selected (Figure 4) Individual icons must be aware of the layout direction too. Directional icons (Figure 5) The names for these arrow icons are iconArrowPrevious and iconArrowNext rather than iconArrowLeft and iconArrowRight. Throughout the codebase we avoid referencing a direction explicitly: in variable names, asset names, and style definitions. Instead, we use terminology that represents the direction relative to the content. For example, the icon alongside a button label is the leading icon rather than the left icon. When swiping to advance a photo carousel, one swipes forwards rather than to the right. A more concrete representation of these concepts is the UI component for a home listing card. (Figure 6) The photo carousel advances forwards as appropriate for each direction, which is hinted by the indicator dots. The alignment of the text and of the “save to wish list” heart is mirrored. As we saw earlier, star ratings also flow with the direction of the content. Left-to-right and Right-to-left listing cards, side by side (Figure 6) Developer experience For us to ensure that these design and localization changes would be adopted across all product teams, we automated as much as possible to make it easy and seamless for every page to have RTL support. On web clients, all the layout flipping was integrated into our core UI components and build pipeline so that anytime those components are used to create a page, RTL support is included for free. We also created a way for engineers to easily test the RTL layout. An RTL preview mode was added to our component library explorer, as well as a preview environment for all pages, screens, and templates. We created automatic visual diffs for every UI code change. What’s next Creating an integrated and easy-to-use developer experience is not without its challenges. In subsequent posts, we’ll detail the journey to RTL language support on Web, Android, and iOS. If you have specific questions you’d like us to answer, feel free to comment here and let us know!
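The direction-relative naming described in this post (leading and trailing instead of left and right) can be illustrated with a small sketch. This is purely illustrative and not Airbnb’s actual code: a tiny helper that resolves a content-relative side to a physical side based on the layout direction of the current language.

```python
# Illustrative sketch only, not Airbnb's codebase: resolving logical sides
# ("leading"/"trailing") to physical sides ("left"/"right") per layout direction.

RTL_LANGUAGES = {"ar", "he"}  # Arabic, Hebrew

def layout_direction(language_code: str) -> str:
    """Return 'rtl' for right-to-left languages, otherwise 'ltr'."""
    return "rtl" if language_code in RTL_LANGUAGES else "ltr"

def resolve_side(logical_side: str, direction: str) -> str:
    """Map a content-relative side to a physical side for the given direction."""
    mapping = {
        "ltr": {"leading": "left", "trailing": "right"},
        "rtl": {"leading": "right", "trailing": "left"},
    }
    return mapping[direction][logical_side]

# The leading icon sits on the left in English, but on the right in Arabic
print(resolve_side("leading", layout_direction("en")))  # left
print(resolve_side("leading", layout_direction("ar")))  # right
```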
You can try out Arabic or Hebrew as your default language by visiting https://ar.airbnb.com or https://he.airbnb.com. On new versions of our iOS and Android apps, you can select Arabic and Hebrew as your system language. Welcome أهلا بكم ברוכים הבאים Mati Bot (Lead Android & iOS Engineer) & Yaniv Zimet (Lead Web Engineer) Thanks to: Hyelim Chang (Localization Manager), John Furlong (Localization Program Manager), Salvo Giammarresi (Director of Localization), Dylan Hurd (Blog Post Editor, Software Engineer), Marcella Iulo (Photo Producer), Diego Lagunas (Engineering Manager), Rebecca Lanthorne (Creative Production Lead), Taido Nakajima (Globalization Design), Michael “Byph” Singh (QA Analyst), Sergio Pelino (Localization Operations Manager) and Maja Wichrowska (Software Engineer) Former Colleagues: Pablo Caro (Designer) and Jason Katz-Brown (Software Engineer) In collaboration with PLUS QA: Ido Avraham, Katie Heynderickx, Joud Kashgari, Hikmet Luleci, Michael Manca and Nataliya Pirumova
https://medium.com/airbnb-engineering/adding-support-for-arabic-and-hebrew-languages-on-airbnb-355f35a4e6b7
['Mati Bot']
2019-05-23 19:06:30.128000+00:00
['Mobile', 'Web', 'Localization', 'Design', 'Product']
Exploratory Data Analysis: Baby Steps
Data science is often thought to consist of advanced statistical and machine learning techniques. However, another key component to any data science endeavor is often undervalued or forgotten: exploratory data analysis (EDA). It is a classical and under-utilized approach that helps you quickly build a relationship with the new data. It is always better to explore each data set using multiple exploratory techniques and compare the results. This step aims to understand the dataset, identify the missing values and outliers, if any, using visual and quantitative methods, and get a sense of the story it tells. It suggests the next logical steps, questions, or areas of research for your project. Steps in Data Exploration and Preprocessing: Identification of variables and data types Analyzing the basic metrics Non-Graphical Univariate Analysis Graphical Univariate Analysis Bivariate Analysis Variable transformations Missing value treatment Outlier treatment Correlation Analysis Dimensionality Reduction I will discuss the first 4 steps in this article and the rest in the upcoming articles. Dataset: To share my understanding and the techniques I know, I’ll take an example of a dataset from a recent competition on the Analytics Vidhya website — the Loan Default Challenge. Let’s try to catch hold of a few insights from the data set using EDA. I have used a subset of the original dataset for this analysis. You can download it here. The original dataset can be found here. The sample dataset contains 29 columns and 233155 rows. Variable identification: The very first step in exploratory data analysis is to identify the type of variables in the dataset. Variables are of two types — Numerical and Categorical. They can be further classified as follows: Classification of Variables Once the type of variables is identified, the next step is to identify the Predictor (Inputs) and Target (output) variables. In the above dataset, the numerical variables are: Unique ID, disbursed_amount, asset_cost, ltv, Current_pincode_ID, PERFORM_CNS.SCORE, PERFORM_CNS.SCORE.DESCRIPTION, PRI.NO.OF.ACCTS, PRI.ACTIVE.ACCTS, PRI.OVERDUE.ACCTS, PRI.CURRENT.BALANCE, PRI.SANCTIONED.AMOUNT, PRI.DISBURSED.AMOUNT, NO.OF_INQUIRIES And the categorical variables are: branch_id, supplier_id, manufacturer_id, Date.of.Birth, Employment.Type, DisbursalDate, State_ID, Employee_code_ID, MobileNo_Avl_Flag, Aadhar_flag, PAN_flag, VoterID_flag, Driving_flag, Passport_flag, loan_default The target value is loan_default, and the remaining 28 features can be assumed to be the predictor variables. The data dictionary with the description of all the variables in the dataset can be found here. Importing Libraries: #importing libraries import pandas as pd import numpy as np import matplotlib.pyplot as plt import seaborn as sns The Pandas library is a data analysis tool used for data manipulation, Numpy is for scientific computing, and Matplotlib and Seaborn are for data visualization. Importing Dataset: train = pd.read_csv("train.csv") Let’s import the dataset using the read_csv method and assign it to the variable ‘train.’ Identification of data types: The .dtypes method is used to identify the data type of the variables in the dataset. train.dtypes A snippet of output for the above code Both Date.of.Birth and DisbursalDate are of the object type. We have to convert them to the DateTime type during data cleaning. Size of the dataset: We can get the size of the dataset using the .shape method.
train.shape Statistical Summary of Numeric Variables: Pandas describe() is used to view some basic statistical details like count, percentiles, mean, std, and maximum value of a data frame or a series of numeric values. As it gives the count of each variable, we can identify the missing values using this method. train.describe() A snippet of output for the above code Non-Graphical Univariate Analysis: To get the count of unique values: The value_counts() method in Pandas returns a series containing the counts of all the unique values in a column. The output will be in descending order so that the first element is the most frequently-occurring element. Let’s apply value counts to the loan_default column. train['loan_default'].value_counts() To get the list & number of unique values: The nunique() function in Pandas returns the number of distinct observations in a column. train['branch_id'].nunique() Similarly, the unique() function of pandas returns the list of unique values in the column. train['branch_id'].unique() Filtering based on Conditions: Datasets can be filtered using different conditions, which can be implemented using comparison operators in Python, for example == (equal to), <= (less than or equal to), >= (greater than or equal to), etc. Let’s apply the same to our dataset and filter the rows where Employment.Type is "Salaried". train[(train['Employment.Type'] == "Salaried")] A snippet of output for the above code Now let’s filter the records based on two conditions using the AND (&) operator. train[(train['Employment.Type'] == "Salaried") & (train['branch_id'] == 100)] A snippet of output for the above code You can try out the same example using the OR operator (|) as well. Finding null values: When we import our dataset from a CSV file, many blank fields are imported as null values into the DataFrame, which can later create problems when operating on that data frame. The Pandas isnull() method is used to check and manage NULL values in a data frame. train.apply(lambda x: sum(x.isnull()),axis=0) A snippet of output for the above code We can see that there are 7661 missing records in the column 'Employment.Type'. These missing records should be either deleted or imputed in the data preprocessing stage. I will talk about different ways to handle missing values in detail in my next article. Data Type Conversion using to_datetime() and astype() methods: The Pandas astype() method is used to change the data type of a column. The to_datetime() method is used specifically to convert a column to the DateTime type. When the data frame is imported from a CSV file, the data type of the columns is set automatically, which is often not what it should be. For example, in the above dataset, Date.of.Birth and DisbursalDate are both set as object type, but they should be DateTime. Example of to_datetime(): train['Date.of.Birth'] = pd.to_datetime(train['Date.of.Birth']) Example of astype(): train['ltv'] = train['ltv'].astype('int64') Graphical Univariate Analysis: Histogram: Histograms are one of the most common graphs used to display numeric data. There are two important things we can learn from a histogram: the distribution of the data (whether it is normally distributed or skewed to the left or right) and the presence of outliers (extremely low or high values that do not fall near any other data points). Let’s plot a histogram for the 'ltv' feature in our dataset. train['ltv'].hist(bins=25) Here, the distribution is skewed to the left.
train['asset_cost'].hist(bins=200) The above one is roughly a normal distribution with a few outliers at the right end. Box Plots: A Box Plot is the visual representation of the statistical summary of a given data set. The Summary includes: Minimum First Quartile Median (Second Quartile) Third Quartile Maximum It is also used to identify the outliers in the dataset. Example: print(train.boxplot(column='disbursed_amount')) Here we can see that the median is around 50000. There are also a few outliers at 60000 and 1000000, which should be treated in the preprocessing stage. train.boxplot(column='disbursed_amount', by='Employment.Type') sns.boxplot(x=train['asset_cost']) Count Plots: A count plot can be thought of as a histogram across a categorical, instead of numeric, variable. It is used to find the frequency of each category. sns.countplot(train.loan_default) sns.countplot(train.manufacturer_id) Here we can see that category “86” is dominating over the other categories. These are the basic, initial steps in exploratory data analysis. I wish to cover the rest of the steps in the next few articles. I hope you found this short article helpful.
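As a quick recap, the checks walked through in this article can be strung together into one short, runnable script. The file name and column names are the ones used above; the missing-value count uses isnull().sum(), which is equivalent to the apply/lambda version shown earlier, and newer seaborn versions expect the x= keyword in countplot.

```python
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

# Load the sample of the loan-default dataset (path as used in this article)
train = pd.read_csv("train.csv")

# Basic structure: dimensions, data types and summary statistics
print(train.shape)
print(train.dtypes)
print(train.describe())

# Missing values per column
print(train.isnull().sum())

# Non-graphical univariate analysis on a few columns
print(train["loan_default"].value_counts())
print(train["branch_id"].nunique())

# Type conversions flagged earlier in the article
train["Date.of.Birth"] = pd.to_datetime(train["Date.of.Birth"])
train["ltv"] = train["ltv"].astype("int64")

# A couple of quick plots
train["ltv"].hist(bins=25)
plt.show()
sns.countplot(x=train["loan_default"])
plt.show()
```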
https://medium.com/towards-artificial-intelligence/exploratory-data-analysis-in-python-ebdf643a33f6
['Swetha Lakshmanan']
2020-11-18 14:54:48.930000+00:00
['Machine Learning', 'Data Science', 'Data Preprocessing', 'Exploratory Data Analysis', 'Data Cleaning']
SQL Window Function — Part I. Explanation with an example using the…
SQL Window Function — Part I Introduction & Overview Types of Window Function in SQL. Image by Author, inspired by Toptal.com Introduction: A window function performs a data analysis calculation across a set of table rows that are somehow related to the current row. A comparable type of calculation can be done with an aggregate function, which returns a single row, or one row per group when combined with a GROUP BY condition (refer to Figure 1). A window function does not cause rows to become grouped into a single output row. The rows retain their separate identities, and the function is able to access more than just the current row of the query result (refer to Figure 1). Figure 1 — Difference between Aggregated and Window function The database used to explain the concepts below is Postgres, and the dataset is available at Github: Order_Table.csv Window Function Syntax: Window_Function([All] expression) OVER( [PARTITION BY expression_list] [ORDER BY order_list Row_or_Range clause] ) Each part of the syntax is explained as follows: Window_Function: There are 3 major types of window functions. 1. Window Aggregated Function: Consists of one of the supported aggregate functions, i.e. AVG(), COUNT(), MIN(), MAX(), SUM(). Window Aggregated Function 2. Window Ranking Aggregated Function: Consists of one of the supported ranking functions, i.e. RANK(), DENSE_RANK(), ROW_NUMBER(). Window Ranking Aggregated Function 3. Window Analytical Function: Consists of one of the supported analytical functions, i.e. FIRST_VALUE(), LAST_VALUE(), NTH_VALUE().
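To make the syntax concrete, here is a small illustrative query against the order table. The column names (order_id, customer_id, order_amount) are assumed for the sake of the example and may differ from the actual headers in Order_Table.csv; the query combines a window aggregate and a window ranking function in a single SELECT.

```sql
-- Illustrative example only; column names are assumed, not taken from the dataset.
SELECT
    order_id,
    customer_id,
    order_amount,
    -- Window aggregate: per-customer total repeated on every row (no row collapsing)
    SUM(order_amount) OVER (PARTITION BY customer_id) AS customer_total,
    -- Window ranking: position of each order within its customer, largest amount first
    RANK() OVER (PARTITION BY customer_id ORDER BY order_amount DESC) AS amount_rank
FROM order_table;
```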
https://medium.com/analytics-vidhya/sql-windows-function-5dc5cbdb1aa
['Ganesh Dhasade']
2020-11-11 12:46:41.347000+00:00
['Sql', 'Data Analysis', 'Data Engineering', 'Data Science', 'Machine Learning']
2020 Year in Review — Apple’s Best Year Ever?
At long last, here we are at the very end of this exceptional and historic year, 2020. We’re not out of the woods yet but Apple is done with 2020. Here are the ten best things from Apple in 2020. 1. The start of the transition to Apple Silicon The M1 chip Without a doubt, Apple’s transition to its own chips for its Mac product line is the most significant moment of 2020 for Apple. Called the M1, this powerful and power-efficient CPU is transformative not only for the Mac but probably for the industry. The performance per watt ratio level achieved by the M1 has never been so profoundly liberating. Looking at the upcoming year is certainly promising for the transition still unfolding on the Mac. As they say, Apple is only getting started with this transition. 2. One More Thing Special Event The M1 chip visual spec summary The “One More Thing” special event was the fourth of a series in 2020 that started with the WWDC conference. It wasn’t the first time Apple used that tag line for a special event. Not all announcements made under the « one more thing » label were as transformative as this year’s announcements. A whole keynote was mandatory for what Apple had to show us. The announcements made will be remembered for years to come. Apple started its transition to its own Apple Silicon with a big bang. With the M1 chip, Apple steered the vessel in a direction where it will be hard for competing platforms to catch up. Good luck to Intel, Microsoft, Qualcomm et al. 3. WWDC 2020 special event WWDC 2020 invite As the pandemic became official and the first wave came in, we knew we were bound to have an unexpected year. By going digital-only, WWDC started a new trend for Apple and probably for the tech industry. As I wrote in “WWDC2020 — My Observations and a Survey of Comments”, WWDC was a great special event marking the start of a new direction for Apple’s Mac product line. The transition to Apple Silicon got official but without much detail. Only later in the year, we got all the details. 4. “Hi, Speed” special event Hi-speed-invite The third digital special event was all about the iPhone 12 lineup release. In my article “The iPhone 12 Moment”, I wrote about the lack of surprises, but the production quality of the event was showing Apple at the top of the hill. Apple Park makes a great filming set and I cannot get enough of it. With this tag line, Apple could have used this event to announce its first crop of Apple Silicon Macs. Instead, the speed references, and there were many of them in the event, was for the 5G network. 5. Product launches and software releases The 2020 Mac product line Many were expecting a challenging year for Apple but they managed to release one of their best product rollouts in recent years. Besides a delayed iPhone 12 launch, COVID-19 didn’t have much apparent impact on the release schedule of products, be it software or hardware. All in all, Apple released meaningful updates to all of its product lines. Some announcements were made via simple press releases; others got their special moments. iOS 14 brings back stability but the iPad will have to wait to get more specific features. Tim Cook’s Apple never ceases to impress when it comes to operational efficiency and how quickly they can adapt to external market conditions. Consider me impressed. 6. Time Flies special event Apple Watch Series 6 Apple started its fall products launch with the Apple Watch annual update (my thoughts about the events). 
It wasn’t a breakthrough update, yet I decided to upgrade from my Series 4. Of note are the surprise release of iOS 14 and an updated iPad Air. The fourth-generation iPad Air gathered rave reviews for being a powerful iPad at a much lower price than the 11-inch iPad Pro. 7. iOS 13.4 + Mouse Support + Magic Keyboard iPad Magic Keyboard Who knew a “dot release” of iPadOS could be more transformative than a major new revision like iOS 14? It’s precisely what iPadOS 13.4 was for the iPad. At long last, Apple seems determined to push the iPad as a real portable replacement. With iPadOS 13.4 and mouse support, Apple made it possible to add new interaction models without transforming the original iPad design. It’s a tour-de-force. The Magic Keyboard for the iPad was an instant buy for me, and I wrote my thoughts on this transformative accessory. 8. Environmental actions, privacy protection promotion and social measures Apple Store Entrance There are so many reasons why I love Apple as a company. Not only do they release great physical products, but I also share many of their corporate values. Their stance on privacy protection is mandatory to stand against Facebook. Their actions to limit their footprint on the environment are examples for others to follow. Their policies on social inclusion are a stark contrast to American society in general. Finally, I’d like to highlight the COVID-19 measures the company put forward to protect people in general, including not only their employees but also their customers. We never feel hesitation from Apple when it comes to putting humans first, business second. It is remarkable. 9. iPad Air with A14 Bionic chip 2020 iPad Air This year’s iPad Air update may look strange when comparing it to the current iPad Pro line. I’m happy to see iPad Pro design language come down in the iPad product line. I’m even more delighted to see that this update should be seen as a signal that Apple has much more in store for the iPad Pro in the future. Otherwise, the iPad Pro line as it is standing now doesn’t make sense. Next year’s release of the iPad Pro should prove to be very interesting. 10. Apple One Launch Apple One Bundle With the Apple One service bundle, Apple’s vision of a highly integrated ecosystem comes to fruition. Only a tight integration between hardware, software and services makes it all possible. The best exemplification of this is their Fitness+ service, which depends on the Apple Watch for it to work. I didn’t hesitate to upgrade to the Apple One services bundle as it saves me a few hundred dollars a year in return for Apple Arcade and Fitness+, two services I wasn’t subscribing to before. Speaking of Fitness+, I gave it a try. And I loved it. Honourable mentions for iPhone 12 and the HomePod mini. Make no mistake, the iPhone 12 lineup is massive but it’s mostly everything that was expected. That being said, the introduction of ProRAW that came with it is a game changer for professional photographers who lean more and more toward the iPhone for their needs. Yes, this is happening, slowly but surely. Ask Canon, Nikon et al. and have a look at their financial statements; you’ll see what I mean. The HomePod mini was a small, oops, a mini surprise. Apple may have finally found a way to move the needle with its HomePod. All-in customers in Apple’s ecosystem will rejoice for sure. I bought two. I’m happy but I’m also being realistic. The HomePod mini is still a Siri speaker. Siri is still… Siri. Are you getting it?
Because of that, I don’t think the HomePod mini is a notable addition in 2020 compared to the rest of the crop. Yet, I love them. 🤷🏻‍♂️
https://medium.com/macoclock/2020-year-in-review-apples-best-year-ever-d181ba7ed931
['Numeric Citizen']
2020-12-23 07:18:13.419000+00:00
['Gadgets', 'Technology', 'Tech', '2020', 'Apple']
First, Pluto Was Downgraded, and Now Saturn Is Gone.
First, Pluto Was Downgraded, and Now Saturn Is Gone. A Brief History of Consumer Electronics, Not an Astronomy Lecture. All pictures in this article without source information are from CANVA PRO. My dear reader, I admit I have been a bit clickbaity with the title. I am 100% guilty of the clickbait crime. I am not talking about the planet Saturn, although the title might suggest that. I led you astray. I am talking about the electronic store chain Saturn. The trademark Saturn vanished from Austria. It merged with the brand MediaMarkt. What commercially makes sense in a world of Amazon dominating commerce still leaves me a little bit sentimental. Join me on my journey through tech history from a young boy’s consumer perspective from the 70s, growing into a middle-aged man in 2020. I hate talking about me as a middle-aged man. But that is what 46 years old is — an old f…. Picture from https://kurier.at/wirtschaft/marke-saturn-geht-in-mediamarkt-auf/400970021 What is Saturn? A planet, and what else? For the readers who might live outside Austria or even Europe, a bit of an explanation of what Saturn is. On the one hand, it is the sixth planet of our solar system. On the other hand, it is an electronics retail chain founded in 1961 in Hamburg. The first store was set up to sell records. It claimed itself as “the biggest record show in the world.” Yes, records. It was 1961. The days when music was pressed on Vinyl and selling records was a sound business. Like the Virgin stores of Richard Branson. Over time it evolved into an electronic store chain with more than 200 shops in Europe — mainly in German-speaking countries and their neighbors. Saturn belongs to the MediaMarktSaturnRetail group, with more than 1,000 stores in 15 countries employing more than 65,000 people. How the world changes In October, I wanted to go on Saturn’s website to check out the new Apple iPhone’s possible delivery dates. The store I used to go to was just a 5-minute bus ride from my apartment. I got a fix for my electronics addiction at Saturn Stores for more than 25 years. So I wanted to see when the new iPhone would be available to order and pick up. What a surprise. When I opened the website Saturn.at, I was redirected to MediaMarkt. The simple message: Saturn and MediaMarkt merged — the brand vanished. Oh boy. It really hit me emotionally, and my mind started creating sentimental stories about past events. It inspired me to write this article to overcome my emotional pain of losing a beloved one. Come with me on a journey through my past, and let’s take 10 minutes to revive the crucial milestones in the development of technology nowadays. Picture from https://nat.museum-digital.de/index.php?t=objekt&oges=209576&cachesLoaded=true The 80s — VC 20, C64 and Amigas plus CDs I was drawn to technology early on. A neighbor owned a Commodore VC 20. I saw it and was hooked. The first computer game I played was on this computer. Frogger was most likely its name. Where did we get the computer and games from? I grew up in the Austrian mountains. Traveling to bigger cities didn’t happen very often. Locally we had some larger retail stores. Owned by locals. The retail stores had tiny nerd corners. And sold a bit of tech stuff. This is where we got our games and devices — games on cassettes. Of course, everybody bought it, and nobody copied those games. We were honest kids. The Commodore C64 replaced the VC 20. Hardware and Software were added to the sales mix of regular retail stores.
Usually, the employees had no idea what they could be used for. Computers back then didn’t solve any of our everyday life’s problems. There were no specialized computer shops. I got my first own C64 in 1986—what fun. One of the first games I bought was Little Computer People. It is based on the story that intelligent life lives inside Computers, and via the game, players can interact with those people — the computer owners need to care for them. It was a predecessor of SIMS. The 80s was also when the Compact Discs (CDs) revolutionized the way data was stored. The first time I saw a compact disc was in the late 80s when a friend showed it off in school. We normal low-income people in the mountains were still dwelling on cassettes and records. The sales process for all tech was straightforward. Go into a small locally owned retail store — buy — leave — no questions asked. No large electronics retail chains existed. The early 90s — Windows-Gates Things changed a bit with the inventions and the spread of Windows and Microsoft Office. Personal Computers — Intel Inside and Microsoft on the software side — were a bit more complex. Commodore Computers and Apple Computers were closed architecture systems. The user couldn’t change much. Personal Computers were different. Basically, the user could put together the computer in any way he wanted from the hardware side. Sales personnel needed to be more proficient in selling an open architecture than with closed systems. Thus, I saw for the first time stores that completely focused on selling computers. In the end, they ruined the computer business of privately-owned retail stores. The reason was quite simple. Nerds like me got proper advice from the experts in the specialized stores. The retail people didn’t know anything. The open architecture was the right thing for us nerds and young jacks-know-it-all. We wanted to play, putting things together and tearing them apart. Having fun when something worked out well and crying giant tears when something broke. We invested every single dime in those adventures. Let’s just remember that before 1993 the Internet did not exist. Doing research was basically buying books, magazines and talking to experts in the shops and friends. Shopping back then was not going into a shop buying and leaving. My friends and I often met at such computer shops to discuss the latest innovation, and sometimes the owner showed us secretly the latest not yet “officially” released tech gadget. Well, at least that was what they were telling us. A friend of mine owned such a shop. It was a meet and greet point for the nerds in town. My friend meanwhile works for Microsoft rolling out Cloud services in the Middle East and Asia. Those were wonderful days. And Bill Gates dominated the market with Microsoft and Windows — Intel inside. The Mid 90s — University — the Internet and Amazon, DVDs and mobile phones Then three things that absolutely changed the world happened. The first gamechanger: I decided to study economics at the University of Graz. It really changed my world. At the University, I had the pleasure of getting my own eMail Account. And it was the first time I logged into the internet. Quickly I realized — I need to have internet access at home because I can connect to the whole world. The second gamechanger: The birth of the internet happened in 1991 with a message by Tim Berners-Lee. He was working at CERN. In 1993 CERN announced that the WorldWideWeb — Web 1.0 — should be publicly available free of charge. A real gamechanger.
And I do remember those days. Trained at a commercial school, I thought: Why should they do that? How will the internet survive if its owner gives it away for free? Little did I know. The third gamechanger: Jeff Bezos quickly realized the potential of the internet. In 1994, he decided to found an online bookstore: Amazon.com. Given that the WWW had only just been made publicly available in 1993, Jeff Bezos was THE first mover of commercial ideas on the internet. At least in my opinion. My world back then consisted basically of Austria; for a low-income family, life was more local than global. The internet changed that completely. It opened the whole world to me. Initially, the internet was mostly used for emails and chatting, as we did on ICQ — basically like WhatsApp, just way earlier. And for the first time, computer games went online. I remember the first online RPGs like Meridian 59 or EverQuest. Mobile phones In parallel, communication changed due to the internet and the rise of the mobile phone industry. Up to the early 90s, innovation was rather slow-moving. Computers in the 80s. Television saw a bit of innovation but nothing game-changing. The mobile industry was the second big hit in the second half of the 90s. Before, we had phones on a cord at home. Then, the mobile industry rolled out mobile phones to everybody. Those were glorious days. For the first time, I got phones for free. I remember getting phones and one-year contracts for free for the entire family — 10 people: me, my partner, our parents, and our siblings. The phones were awkward, not coming close to what we have today. But we could text and talk wherever we were. DVDs-Playstation and the death of cartridges With the internet and the improvement of computer chips, better data storage became a pressing need. Then DVDs were introduced — basically the same as CDs but with more storage space. It was the rise of gaming systems. In the 80s, I used Commodore. Gaming on personal Windows-based computers was a bit awkward. The PlayStation finally won the game in my house. It used CDs and later DVDs and Blu-rays for storage and was optimized for gaming. PCs were merely for work, and games went to the PlayStation. Saturn — my love All these innovations (the internet, mobile phones, better storage) entirely changed the way we communicated and did business. The first change I noticed was that the local computer stores vanished. Larger retail chains like MediaMarkt and Saturn (finally, I made it there) took over to fulfill the growing demand for computer equipment. I remember well the weekend shopping trips with my long-term partner in the second half of the 90s. She was into fashion; I was into tech. The spread of larger shopping centers in Austria made shopping a Saturday event for the whole family. Fashion stores next to toy shops and consumer electronics retail chains like Saturn. Yet the nerdy community didn't meet up in the malls anymore. It was more or less getting the needed equipment, paying, and going, like in the old days in the 80s. But it was great. It was when I saw you for the first time and fell in love with you: Saturn. You had everything I ever wanted when it comes to consumer electronics. In the 90s, Amazon was just a bookstore. So seeing new gadgets, testing them, buying music CDs, movie DVDs, games, modems, mice, monitors, computers, PlayStations — it all happened at the large specialized consumer electronics chains located in shopping malls. The world was perfect and good.
And then… The late 90s In the late 90s and the start of a new millennium, Dell changed the game, Napster destroyed the music industry, and Steve Jobs saved the planet. Dell destroyed it all. Napster did the same. Steve Jobs picked up the bits and pieces and put it all back together in a new way. It was the first time I remember the power of the internet and electronics completely overturning entire value chains. The Dell case We were used to buying computers in shops. The open architecture made the buying process more complicated. Should I get an NVIDIA graphics card or an AMD one? Which processor is the best? And what monitor is optimized for that? The research took ages. The internet was still young, and no Google was around. Search engines were more or less alphabetical lists of links. Information was consumed from paper magazines and talks with rare experts. And still, we couldn't find the perfect system. The sales personnel at Saturn? They didn't know jack shit. But they had all the bits and pieces needed for putting a sound gaming system together. Dell changed that. Everything I needed was on their website. I could configure my computer the way I wanted it. When I exceeded my budget, no salesperson gave me an angry look when I put back the 3,000 EUR equipment I couldn't afford after stealing hours of his life and purchasing only a few hundred bucks' worth. The website was patient. And it was cheaper than buying at Saturn. So many of my nerdy friends went to buy online from Dell, which cannibalized Saturn's business. A first hit. And I got to know the business term cannibalization. Before that, I had been trained at commercial school in how to maintain a business. In the second half of the 90s, the innovation train left the station. Innovate, cannibalize yourself, or die. The Napster case Another nice invention was the MP3 player. I believe every runner knows what I am talking about. In the 80s, we had the Walkman — a cassette player with a battery slot and the ability to play a cassette for 20 minutes. The device was huge and dangled at your side. And it was heavy. The Discman replaced it. Same story. But you couldn't run with it. Why? Because the laser beam lost track. So we active people needed something new — something reliable. And someone invented the MP3 player. What was that? A simple device with a flash drive. Enough to store 10 to 20 songs. Well, at least that is what I remember without asking Dr. Google. The only thing was: Where did you get the songs for the player? The usual process was to buy a CD and convert it to the MP3 format, which needed a computer, a piece of software, and the expertise to use these things. It was not for the ordinary bloke from around the corner. It took at least a master's degree in computer science and a pocket deep enough to buy an MP3 player and CDs and convert them. Well, OK, I went a bit off track again. I remember copying CDs to MP3 for my long-term partner around 2000 when we moved to Vienna. Then came Napster. Napster was a peer-to-peer network for sharing MP3s. The idea was: you buy a CD and convert it to MP3; when you are online, you share your library and get access to other users' libraries. You give and receive. Napster was another big game-changer by accident. The hidden asset was making people work together and share their possessions for free. A twist in the economy that hardly anybody talks about. A business model that needed some work but paved the way to subscriptions.
The downside was that the founders of Napster destroyed the music industry's value chain — willingly or by accident, who knows. The license terms back then stated that the owner of a CD might only use it for private purposes and make a limited number of copies if the original got destroyed, but for personal use only. The terms didn't include giving away the music for free via Napster. Napster's model was borderline. Although nobody made money with it, it destroyed the commercial model of the music industry. So the industry quickly went after the Napster guys, and Napster went down the outlaw route. Steve Jobs was needed to save the day. But the idea of changing the industry was laid out. Steve Jobs was always good at giving inventions a final cut, making the already great excellent. Apple, in my eyes, is not very innovative. I can't remember an invention that Apple made by itself, but I remember many inventions that became famous through Apple's polishing. MP3 players and peer-to-peer sharing were the first polishing acts of Apple in the new millennium. Apple created the iPod. Before that, MP3 players could hold 10–20 songs. That meant a lot of work for the user, constantly deleting and uploading songs to the player. What was the unique twist Steve Jobs gave it? Apple iPod — 1,000 songs in your pocket. What a unique selling proposition. Do you want to see the grandmaster of putting Zen into business at work? That's a sound unique selling proposition. Gaming — one of my addictions The gaming industry discovered the internet in the 90s. Shooters and role-playing games went online: Ghost Recon, Rainbow Six, Meridian 59, EverQuest. The gaming industry connected people all over the world. The community back then was unique. Accessing the internet was expensive and technically complex; the language was mostly English, and it needed a stable internet connection. A stable connection was not a given back in the day, and it cost a solid amount of money per month. In Austria, the cable television providers were the first to open their fiber networks to private customers. It was expensive but a necessary addition for avid gamers like me. It provided the necessary bandwidth. I remember well the nights I spent online playing with Korean and American students. Chatting happened via text — a unique chance to practice English daily, and to practice escapism. Saturn and I In the early days of the new millennium, Saturn — my personal electronics dealer — had it all: compact discs for music, and Blu-rays. Yes, even though Napster was around, I stuck to buying music and movies. Why? Finding a good-quality MP3 file took hours and hours. I had also finished my master's degree in economics and started working, first in research, then in M&A at big corporates. Buying movies and music supported the artists and made sure I had great-quality recordings at hand. This was an additional reason that kept me from buying an Apple iPod. Converting CDs to MP3 was work. It took time and felt like stealing. So I kept my habit of listening to music on the computer via the disc drive. Saturn provided everything I needed: computers, mobile phones, CDs, games. My relationship with Saturn was thriving. Every weekend a new gadget — and yes, I burned money and time. But it was fun. 2005 and beyond — the Apple decade Between 2005 and 2010, a lot of things happened again that changed our world forever. And luckily, for the better. Steve Jobs did it again — the iPod Nano.
One impressive thing — Steve Jobs never stopped improving inventions. In September 2005, he introduced the iPod Nano to the world. And this product really got me. It was not the big and heavy iPod from 2001. It was a small device with huge storage capacity. But it didn't solve the problem of getting MP3s. This was when I discovered the second invention of Apple. Steve Jobs had taken over the entire music industry value chain. How? The problem I was facing was still converting CDs into MP3s. On top of that, I hated having to buy an entire disc when mostly only 1 or 2 songs appealed to me. When I saw the iPod Nano arrive at Saturn, I discovered the iTunes Store at the same time. In 2003, Steve Jobs had already introduced the option of buying single songs from an album for 0.99 ct. And together with the iPod Nano, it was a real gamechanger in my world. Why? I loved sports and mostly wanted to listen to music while doing sports, for distraction and motivation. Apple solved this problem — the iPod Nano and the 0.99 ct per song option at the iTunes Store. Steve hit a second time — smartphones. And he did it again. The mobile industry was thriving. Since I had gotten my first mobile in 1997, Austria's industry had given away so many mobiles for free that every person owned 2–3 devices on average. My apartment was full of old phones. I couldn't sell them on eBay. Due to the high innovation speed and the "for free" model, they became worthless the minute I got one. Well, I hadn't paid for anything in the first place. The mobile industry tried to figure out how to take the "for free" model back. It was burning money. Most customers didn't generate sufficient revenues via their monthly subscriptions to justify giving away phones for free. Additionally, traditional mobile phones all had one problem: they came with a closed software architecture. Every time I got a new phone, I had to spend a day learning how it was set up — transferring the phonebook from the old phone to the new one. This often didn't work because the systems were not compatible, and the SIM cards couldn't store many contacts. External storage? Well, not back then. So a lot of things were still unsolved. In that market, with exactly these kinds of problems, Master Steve did something amazing. He provided the next solution: The iPhone 1.0 I really didn't get how much potential this innovation had. I remember it well when I ordered the first iPhone. Back in 2007, I was running the financial department of a life science company. The CEO, Rodger Novak (who later became CEO of CRISPR Therapeutics), wanted me to order an iPhone. I grinned and said: "Why do you need an iPod with a telephone function? Ordering a regular mobile phone and an iPod is much cheaper than going for an iPhone." Well, Rodger was the CEO, so of course, I obeyed with a smile on my face. It took a few years until I really understood the value of the iPhone. The first one came without 3G. 3G was back then what 5G is in 2020. I got hooked on iPhones with the iPhone 3G. It gave me the ability to browse the web while commuting. By subway, of course — not in the car. And the gamechanger was the absolute power of the user to personalize the iPhone 100%. And it was transferable to new iPhones. Gone were the days of being dependent on the technicians designing a phone. Today, in 2020, when I buy a new iPhone, it is configured exactly the same way as my old iPhone. And I don't have to do anything. Thank you, Steve. May God bless you in the next world. You did a great job for society.
Netflix went online On the eve of one of the biggest crashes in stock market history, Netflix put its business online. Before 2007, Netflix rented out DVDs via postal service. In 2007, they pivoted to the internet, distributing movies online. A bold move, as bandwidth in 2007 was still narrow. There was a constant traffic jam on the data highway. Netflix did it right. In 2010, they acquired five-year streaming rights for films from Paramount and MGM, exactly at the time when nobody else was interested in such rights. But more on that later. Gaming Gaming was still more or less offline when it came to distributing content. Although the PlayStation and other gaming devices went online, the player still needed to buy a DVD or Blu-ray to get the game started. The game on the PlayStation started from a disc, and then the player went online to enjoy playing with the community of gamers. Compared to the 90s, the great thing was that patches to the software (the name for apps before Apple introduced the app economy) were delivered digitally. No more pushing discs or CDs around. The minute a game was installed, it got the latest bug fixes. But no big innovation right away. Me and Saturn. 2005 to 2010 was the first time since our long-term relationship started in the 90s that I wasn't stopping by that frequently. We were together for more than a decade, and I started betraying you. Yes, I am guilty of adultery. Saturn served me well. It was me who started trying out other sales channels. Apple's iPod Nano/iTunes single-song combination got me. I stopped buying music CDs at Saturn and put all my money for music into the iTunes Store. I really didn't understand that it was the beginning of the end of my love affair with Saturn. It was just a bit of flirting with other channels. Nothing serious. 2010 onwards - the tech train gained speed Oh, my dear reader. If you really think the innovation speed had been high since 1974, you are wrong. The innovation speed from 2010 onwards put the world into overdrive, until the abrupt halt in 2020 with SARS-CoV-2. Let me tell you how and why. Netflix — not a Jobs story. Streaming was something that Microsoft and Apple initially missed entirely. I thought that the two godfathers of innovation had lost their edge and started aging. Also, Jeff Bezos, with Amazon, was extremely late to the party. The Netflix crew nailed it. I always thought: if I could buy music via the internet, why not movies? And then I discovered Netflix. I believe I got my first subscription back in 2012/13 or so. I am in Austria, Europe. No, we have no kangaroos. They are in Australia. Before Netflix, movies were delivered in a complex value chain. First, they appeared in cinemas; then, they went on Blu-ray for purchase and a bit later for rent. Ultimately, they made their way to cable TV. Sky was one of the first to show the movies. Sky's business model was a high monthly subscription fee. A few months later, movies were distributed to cable TV, and a few years down the road, to free TV. That was the good old movie industry value chain. And then came Netflix. Nobody believed in streaming back in the days before 2010. The internet was laggy, crowded, and slow. Streaming movies needed a stable broadband connection. 3G was not built for that; 4G was still young and not entirely rolled out. And fiber networks — well, the internet's backbone was not built for such a massive amount of data.
Thus, Netflix was extremely adventurous when they got into a billion-dollar deal for the streaming rights to Paramount and MGM movies. And the rest is history. Netflix succeeded. The world's movie library went online and became accessible as a subscription-based service. I remember the fun I had canceling my Sky subscription of 50 EUR/month for a Netflix subscription of about 6 EUR monthly. Sky called me and wanted to convince me to stay. I said to the lady on the phone: "If you give me the Sky subscription for 6 EUR, I'll stay." She asked, "Why?" I explained, "Netflix is available for 6 EUR/month." She laughed and replied, "But Netflix doesn't have the latest movies. We do." "Yes, true. But I use movies mainly for learning English, and Netflix is sufficient for that purpose. And even if I then buy 1 or 2 movies on Blu-ray, I am still saving money. I don't watch more than 1 or 2 movies in a quarter anyway. So, do I get Sky for 6 EUR/month?" No. I didn't get it. But it didn't matter at all. Goodbye Steve Steve Jobs died in 2011, one of the greatest men in the history of tech. He shaped our society with his spirit. The 2020 lockdown was bearable because of the internet (not a Jobs invention), smartphones, and tablets: the devices that deliver apps like Netflix to users in a way that doesn't require a Ph.D. in engineering to access the content. I believe it was Steve Jobs who brought this twist to our society and built the right high-performing team at Apple. In my opinion, Steve Jobs was the person who made the internet accessible to all human beings with a simple-to-use device: the iPhone. I believed that when Steve Jobs left Apple, the company would go down. With Apple, I was always wrong. I didn't understand the iPod or the smartphone, nor did I understand that Steve Jobs had built a great company with thoughtful leadership selection. Tim Cook took over and made Apple the most valuable company of this decade. He kept Apple's innovation train running without end. The Apple Watch was one of his products that changed the world in a similar way to his predecessor Steve Jobs's inventions. Gaming The gaming industry hadn't picked up the subscription model yet. But it did what Steve Jobs did to the music industry: it delivered directly to the player via the internet. I really can't remember when I stopped buying games on Blu-ray. I got a PS4 in 2014. Did I buy games online on the PS3? I can't remember. Most likely. As Netflix caused heavy traffic on the internet, the internet providers improved bandwidth and speed. That made downloading 60–100 gigabytes feasible in a short period of time. Hours, not days or weeks. Unbelievable speed — coming from the early days of the internet with 56 kbit/s modems. Sony went online, and so did Microsoft. Not only could games be purchased online, but also every piece of software needed. Many apps (the new name for software) became available for monthly subscription fees. And a novel industry was born: cloud-based services and data storage became an entirely new sector. Software went Apps I remember the old MS Office. For decades, I purchased the CDs/DVDs/Blu-rays, waited for weeks until the update process was finished, and finally could use the new software. Today, I have a monthly subscription. The software is updated with small downloads almost daily, without causing much friction in the user experience. Adobe did the same thing, and so did many other apps. Amazon — the biggest retail store in the world Jeff Bezos started with an online bookstore.
In 2020, almost every legally available product can be purchased via Amazon. What began as an online bookshop in the 90s evolved into the biggest retail store in the world. No shops, and everything delivered to the doorstep. It just takes a fingertip, and a few days later you get what you always wanted delivered. Amazon also started Amazon Web Services, which is still a big growth driver for Amazon in 2020. Me and Saturn. Our relationship, which had developed over more than a decade, took a serious hit. I was attracted to younger sales channels. And well, I admit I started betraying my Saturn regularly. With the rise of Netflix, I stopped buying movies on Blu-ray altogether. I believe the last one I got must have been about 10 years ago. The movies I couldn't find on Netflix, I started buying in the Apple Store. Apple also had an online service: no subscription fees, but for about 5 EUR, I could rent a movie for 48 hours. Amazon has the same service. All three together took me away from renting movies at local rental services or purchasing them at Saturn. The same with the gaming industry. The minute Sony started selling games online, I just bought them online without a second thought. It was not a decision based on a lower price — just convenience. Click — buy — download — play. Done. No waiting at the checkout, no frustration because the game is currently sold out. Digital content is always available when you want it. The last serious purchase I made at Saturn was in 2013. I got a new laptop for work for about EUR 1,700. Since then — nothing. Well, I obviously slowly killed my traditional sales channel without even realizing what I was doing. And most likely, millions of other consumers did the same. Digitalization was simply the more attractive offer. 2015 — new digital ecosystems evolved It was the rise of FAANG, which changed the entire world. Facebook, Amazon, Apple, Netflix, and Alphabet (formerly known as Google) delivered a mesmerizing return on investment over the last five years. Why? Because they evolved their business models into entire ecosystems; these innovation engines bring value to the customer in multiple ways. Let's get a bit of insight. Apple The biggest criticism of the Apple story on the stock market was that Apple is only a one-product company: it only sells the iPhone. And a tablet, which was viewed as an iPhone with a bigger screen. Many analysts warned that declining iPhone sales might hurt the company long term and that it could easily lose its entire value. Besides the iPhone, Apple had nothing else to offer before 2015. Tim Cook changed this story. Let's just think about the Apple startup — AirPods. Tim Cook went into the crowded earbud space. I love sports. One of my biggest problems was buying reliable headsets; they should be durable and lightweight. Tim Cook brought the solution to the market — cordless Apple AirPods. A masterpiece from version 1 onwards. No matter what kind of sport I did — running or working out at the gym — they never fell out. A couple of weeks ago, I calculated the value of the Apple AirPods business and came to 84 billion dollars. Just imagine you start a company, and 3 years later, it has a value close to 100 billion. Tim Cook brought multiple devices to the consumer that put Apple in the hands (iPhone and tablet), on the wrist (Apple Watch), and in the ears (AirPods). To customize the gadgets, consumers need to buy apps. And the amazing thing is — Apple gets a 30% commission from every sale.
As the subscription models gain traction, you can easily imagine what that means for Apple's revenue. A customer doesn't just buy one iPhone every 4–5 years; no, in 2020, she regularly purchases apps at the App Store and delivers cash to Apple. Apple has about 1.5 billion devices spread out all over the world. Just assume less than a third belong to active users. That is still more than the population of the United States, all regularly buying something via the iPhone — with Apple making its commission. No wonder Apple piles up cash, and Warren Buffett has his largest holding in Apple. Amazon The next big ecosystem that evolved out of the internet economy. Amazon became the world's biggest warehouse and keeps expanding. Like Apple, Jeff Bezos's company stays innovative and keeps supporting its customers' needs. As of July 2020, a Google search shows that Amazon has more than 1,000,000 employees. Jeff Bezos's net worth exceeds 200 billion USD thanks to the uptick in valuation on the stock market. Microsoft A few years back, I believed Microsoft had missed the tech train. It looked like the company was stuck with its old model of selling Windows and Microsoft Office. The direction changed when Microsoft got a new CEO in 2014. Microsoft Azure cloud services in particular contributed to the growth of the company. Yet its flagship products are still Windows and Office. The latter is available via a subscription service, which delivers recurring revenues for the company and automatic updates for the user. Microsoft will most likely continue to grow. These three companies significantly changed the way we do business and how we interact. Apple provides the devices, Microsoft the software, and Amazon drives commerce. Saturn and I broke up. Our relationship had lost its edge 10 years ago already, and it didn't get better after that. First, I started purchasing songs via the internet, then movies and games. The devices I got delivered by Amazon — very conveniently, to my home or workplace. I didn't need to go anywhere. It didn't matter anyway, because the employees at Saturn were mostly no experts in the field, and the variety of tech goods was extremely narrow compared to Amazon's marketplace. The research needed for making a purchase decision I did via the internet anyway. What started as a hot and intense love affair in the 90s, with weekly visits lasting many hours, withered to one or two visits per year. The reason why I didn't give up visiting? Sometimes a dishwasher, fridge, vacuum cleaner, or washing machine broke, and I took advantage of special offers. Also, convenience goods like light bulbs or batteries I still purchased at Saturn. No larger purchases, and not very frequently. From leaving a few thousand euros at Saturn annually down to 30–40 euros, at most a few hundred. I tend to save my money on white goods purchases. 2020 - the pandemic hit My consumer behavior has changed dramatically since the 80s: from socializing at tech stores, to buy-and-run at the larger retail chains, to purchasing everything online — physical and digital goods. In the first quarter of 2020, the pandemic changed the world entirely. Europe and the United States went into lockdown, driving more and more consumers to online purchases. This drove the ecosystems of Amazon, Apple, and Microsoft to unbelievable valuations on the stock market. No wonder traditional brand owners give up on brand names like Saturn. Consumers in the new decade are buying more and more online.
The political decisions around the pandemic pushed consumer behavior further in this direction. Saturn and I Yes, I am sentimental. I had great times at Saturn. Often I stood like an adult kid before novel technology — computers, gaming consoles, television sets, and mobile phones mesmerized me. But let's face it: the old brands lost their connection to the consumer a long time ago. They gave up their unique position to the up-and-coming services of Amazon, Apple, and Microsoft. What was their unique position? Direct, face-to-face access to the consumer.
https://medium.com/datadriveninvestor/first-pluto-was-downgraded-and-now-saturn-is-gone-231ccaecdd0e
['Christian Soschner']
2020-11-22 08:50:32.403000+00:00
['Microsoft', 'Saturn', 'Consumer Electronics', 'Life Lessons', 'Apple']
Where the Roll Unravels
Photo by Frida Bredesen on Unsplash Hey there, fearsome followers of Out of Ideas, Out of Time. The title of our little pub has never been more fitting. We've left Stark hanging like an empty cardboard tube. Can anyone step up and hasten the inevitable end to this tale? If you'd like a shot at fame and notoriety, there's no need to feed your mate to a hungry tiger; just sign up for one more chapter of The Toilet Paper Caper. But I understand if you're so stressed out by the current crisis that all you can manage is lying in bed watching TikTok videos. In that case, please take a moment to comment with your suggestions for the story here — Where the Roll Unravels — and I'll polish up the last chapter, including every prompt. No suggestion is too silly; here is the chance to see your ideas come to life with less effort than it takes to order grocery delivery from Amazon. As a refresher, you can find all the chapters here — Toilet Paper Caper
https://medium.com/out-of-ideas-out-of-time/where-the-roll-unravels-a6fa3fc37fb8
['Terrye Turpin']
2020-04-17 22:29:41.525000+00:00
['Writing', 'Fiction', 'Toilet Paper Caper', 'Writing Prompts']
A Guide to a More Meaningful Life
A Guide to a More Meaningful Life Forget happiness; the path to satisfaction is meaning. Photo by Joshua Rawson-Harris on Unsplash Despite life getting increasingly better by every conceivable standard, here in 2020, more people feel alone, depressed, and hopeless than ever before. To figure out why, we need to ask ourselves: what constitutes a good life? And what makes your life worthwhile? Some of the world's greatest thinkers have been grappling with this question for millennia: In 340 BC, Aristotle identified the good life with living virtuously. And around 290 BC, Epicurus identified the good life with finding happiness and inner tranquillity. Most of us intuitively agree with Epicurus — treating the good life as synonymous with a happy life. In fact, our culture is obsessed with happiness, and we do everything in our power to increase our own. Somewhat like utilitarians, we measure how well our life has gone by how happy we are. But, as it turns out, what determines how well we feel our lives have gone has nothing to do with happiness. In fact, according to psychologists, what determines feelings of despair isn't a lack of happiness; rather, it's a lack of meaning and purpose in life. It's clear, then, that pursuing purpose and meaning over happiness is a more fulfilling path. Research even shows that people who have a sense of purpose are more resilient, do better at work, and live longer. So, how can each of us find meaning?
https://jon-hawkins.medium.com/a-guide-to-a-more-meaningful-life-85ece35eaf51
['Jon Hawkins']
2020-06-26 11:01:11.193000+00:00
['Self', 'Life Lessons', 'Psychology', 'Life', 'Self Improvement']
Enable External Access to Confluent Kafka on Kubernetes — Step by Step
Enable External Access to Confluent Kafka on Kubernetes — Step by Step Sumant Rana Follow May 9 · 4 min read This article describes the configuration required to access Kafka on Kubernetes from outside the Kubernetes cluster. Pre-requisites: Kafka installed on GKE (or any other K8s cluster) via cp-helm-charts using installation name <installation-name> The cp-helm-charts repository has been cloned locally and is available at the path <cp-helm-charts>. Kafkacat or any other Kafka client is available for testing and validating the changes in configuration and access to Kafka on the cluster. Instructions for downloading and installing Kafkacat can be found here. Default configuration: If we install Kafka using the default configuration provided by cp-helm-charts and try to connect using Kafkacat, we will not find an entry point to the cluster. This is because, by default, there is no external IP address exposed on which we can connect to the broker. This is a sample output of the get svc command run on a cluster with Kafka installed: The output of the 'kubectl get svc' command on the default namespace As we see here, all the default services are of type ClusterIP and the External-IP is set to <none> for each, so there is no way to reach any of these services from outside the cluster. Configuration Changes: There are 2 configuration changes required in order to successfully connect to the Kafka broker running inside the Kubernetes cluster. For the sake of explanation, I have done the changes in 2 steps, but they can be done together to save time: Enable the Kafka broker to be contacted from outside the cluster Switch to the Kafka charts directory under the <cp-helm-charts> directory cd <cp-helm-charts>/charts/cp-kafka Edit the file values.yaml and change the value of nodeport -> enabled to true from the original false . nodeport: enabled: true Leave the other 2 properties servicePort and firstListenerPort as-is. servicePort: 19092 firstListenerPort: 31090 Apply these changes to the existing Kafka installation on the cluster using helm upgrade cd <cp-helm-charts> helm upgrade <installation-name> . When applied successfully, this configuration creates a NodePort service on the cluster that clients can use to connect to the broker. This is a sample output of the get svc command run on the same cluster after making this configuration change: The output of the 'kubectl get svc' command on the default namespace As we see in the output, there is a new NodePort service. In order to connect to that service, we can use the external IP of any of the nodes that form the cluster. Use the following command to get the external IPs of the nodes and choose any one IP address from the list. kubectl get nodes -o wide From now on, <ExternalIP> refers to the chosen external IP. If we try to connect to the broker using the Kafkacat client at <ExternalIP>:31090, it will show the following output: > kafkacat -L -b <ExternalIP>:31090 Metadata for all topics (from broker -1: <ExternalIP>:31090/bootstrap): 1 brokers: broker 0 at <InternalIP>:31090 (controller) 49 topics: .....(details of topics) According to the output, everything seems to be correct, and the Kafka broker seems to be responding with the details of the topics. But there is a catch here. The metadata that the broker returns to the clients (which will be used for future communication) indicates an IP address that is internal to the cluster (because we have not yet configured external listeners).
broker 0 at <InternalIP>:31090 (controller) So when the client sends a request to publish a message to the broker, it cannot communicate with this internal IP address, and publishing the message fails (even though we saw a successful connection to the broker). In order to fully enable external access, we need to make one more configuration change, as described in the next step. Configure the Kafka broker to return the correct metadata to enable further communication with the clients. Switch to the Kafka charts directory under the <cp-helm-charts> directory cd <cp-helm-charts>/charts/cp-kafka Edit the file values.yaml and enable external listeners "advertised.listeners": |- EXTERNAL://<ExternalIP>:31090 Make sure the indentation is correct (a single space before EXTERNAL); otherwise, the following error is shown when deploying the chart: Error: error unpacking cp-kafka in cp-helm-charts: cannot load values.yaml: error converting YAML to JSON: yaml: line 52: could not find expected ':' This will append an external listener to the list of internal listeners in the final configuration. Apply these changes to the existing Kafka installation on the cluster using helm upgrade cd <cp-helm-charts> helm upgrade <installation-name> . If we now try to connect to the broker using the Kafkacat client at <ExternalIP>:31090, it will show the following output: > kafkacat -L -b <ExternalIP>:31090 Metadata for all topics (from broker 0: <ExternalIP>:31090/0): 1 brokers: broker 0 at <ExternalIP>:31090 (controller) 49 topics: .....(details of topics) As we can see, the output now shows the controller address with the external IP rather than the initial internal IP. Once done, the clients can not only successfully fetch metadata from the broker in the cluster, but they can also publish and subscribe to messages.
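As a final sanity check, a quick end-to-end smoke test confirms that publishing and subscribing really work from outside the cluster. The commands below are only a sketch under the assumptions used throughout this article: the broker is reachable at <ExternalIP>:31090, and the topic name test-topic is purely an illustration (use any existing topic, or rely on topic auto-creation if it is enabled on the broker).

# Produce a single message to the broker from outside the cluster
echo "hello from outside the cluster" | kafkacat -P -b <ExternalIP>:31090 -t test-topic

# Consume it back, reading the topic from the beginning and exiting once the end is reached
kafkacat -C -b <ExternalIP>:31090 -t test-topic -o beginning -e

If the produced message comes back in the consumer output, both the NodePort service and the advertised.listeners change are working as intended.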
https://medium.com/swlh/enable-external-access-to-confluent-kafka-on-kubernetes-step-by-step-e4647ca7a927
['Sumant Rana']
2020-06-07 15:33:09.934000+00:00
['Kubernetes', 'Helm', 'Kafka']
Best Websites to Make Money in 2021
Best Websites to Make Money in 2021 If you want to earn extra cash, turn to these money-making websites. The internet is the best place for people to make money: you don't need a degree or experience, and the best thing is there is no age limit. Whether you are a student, a housewife, or even jobless, the internet has opened doors for all of you. You just need a laptop or a smartphone. To earn passive income, you have to be patient and passionate. In the beginning, you will face some obstacles, but if you don't give up, you can end up making thousands of dollars per month. Photo by iaroslavbrylov on freepik As you know, 2021 is almost here. Everyone is trying to find legit ways to earn money online. Here are four legit ways, or websites, with which you can generate passive income. 1. YouTube (Ad Revenue): If you can create unique content around entertainment, gaming, comedy skits, teaching, reviewing, and much more, then you can be the next self-made star or celebrity who earns thousands of dollars. You just need a creative mind and unique content uploaded to your channel. To generate money on YouTube, you have to take a look at the eligibility criteria to get paid by YouTube. Eligibility criteria: Your channel must have 1,000 subscribers. Your channel should have 4,000 public watch hours in the past 12 months. Photo by freepik on freepik After crossing these two thresholds, you have to apply for the YouTube Partner Program (YPP). The YouTube team will then review your channel. If your channel complies with YouTube's monetization policies, YouTube will turn on monetization. Now, YouTube will place ads on your uploaded videos, and you will get paid when viewers view those ads. There is not only one way to earn money on YouTube; there are many ways you can earn passive income from it. 2. ClickBank (Affiliate Marketing): You must be aware of the term "affiliate marketing." It is a process by which an affiliate earns a commission for marketing another person's or a company's product. It is a favorite income stream of YouTubers and bloggers for generating passive income. What is ClickBank? ClickBank is one of the best affiliate marketing systems and a leading global retailer, founded in 1998. Reportedly, ClickBank has over 6 million digital products in over 20 categories like e-books, videos, and software. If you are a vendor, you can create your product and set it up using ClickBank's system. ClickBank reaches approximately 200 million people. Among affiliate marketers, ClickBank is a popular choice. Screenshot by Author What about commissions and payouts? If we talk about commissions, ClickBank offers very high commissions, as high as 75%. Many online users love to work with ClickBank because you can withdraw your commission regularly. You can take money out whenever you want. Is ClickBank free to use? Yes, anyone can join their affiliate marketing system. Whenever you find a digital product that you want to promote, you can get a unique link from ClickBank that you can use to send customers directly to the product page. If they like the product, they will buy it, and you will get a high commission. 3. Upwork (Freelancing): Upwork is one of the top-ranked freelancing platforms. What is Upwork? Upwork is a global platform where businesses and freelancers from all over the world can connect and collaborate on jobs. Screenshot by Author How does Upwork work?
Clients post jobs that they want to outsource, and freelancers can bid on them and complete the work. After completing the work, freelancers get paid. Millions of jobs are posted on Upwork every year, and more and more people are joining every day. Upwork is open to all firms, agencies, professionals, and freelancers. So, a virtual assistant living in Canada can provide services to a company in Australia. Similarly, a web developer from the USA can create a website for an individual in India. How can you make money on Upwork? If you are an expert in content creation, writing, website development, graphic design, video editing, or translation, or you can provide SEO services or digital and social media marketing services, then you can earn big bucks, hard cash, or real money. 4. Amazon (Selling): Sell the products you no longer need and earn passive income. What is Amazon Seller Central? Amazon Seller Central is the backend tool for individual Amazon sellers to list products and monitor their profits on Amazon. Screenshot by Author How can you sell your stuff on Amazon? To sell your products on Amazon, you have to sign up for Amazon Seller Central. You can sell anything you want through Amazon Seller Central. The best way to get started is to sell the stuff you no longer need. You can sell used books, home appliances, or even old collectibles. You can also earn credit through Amazon Trade-In by trading in unwanted products. Where can you find products to sell through Amazon Seller Central? There are many ways to find products to sell on Amazon. Some of them are,
https://medium.com/business-teaching/best-websites-to-make-money-in-2021-6a2fc876e949
['Aliyan Shaikh']
2020-12-29 09:57:19.684000+00:00
['Freelancing', 'Earn Money Online', 'Amazon', 'YouTube', 'Earn Money From Home']
Numb
Numb Prose Photo by Konstantine Trundayev on Unsplash You know you've hit rock bottom when the people around you start to vanish. One by one they leave, and as your social circle becomes smaller by the second, even your tears leave you too. What's left is someone brimming with sorrow yet with no tears to cry, wallowing waist-deep in negativity yet trying so hard to fight on. I always believe that NO ONE in this world is weak or strong. We all fight our share of battles, but some of us grow tired of fighting wars we know deep down we'll never win. So we let them get the best of us. And like that, we grow tired. We grow tired of societal expectations. We grow tired of the piles of workload. We grow tired of donning fake smiles. We grow tired of making an effort to dress and look presentable. And when everything around us makes us tired, we fall into the worst stage of all: being numb, where emotions are left unfelt and people are left uncared for. We become encompassed by more negativity, maybe start to bring harm to ourselves; we neglect everything, everyone, and most importantly, ourselves. We look into the mirror and see not us but someone rotten of emotion, numb to life. The worst part? Nobody knows about us. Nobody knows our anguish, nor do they check in on us. And when we attempt to share our troubles with them, we are only met with "dramatic" or "exaggerating." Depression is scarily widespread today. It is heart-wrenching to know that CHILDREN are now already falling into the abyss of depression. And when they reach out to parents and friends with fixed mindsets and heedless thoughts, they question everything: they question whether they are loved or not, they question favoritism, and whether the earth truly has a space for them. And as they jump from balconies or harm themselves in secret, who are we to deny that the upcoming generations might only grow up to be sadder? Depression is named the silent killer for a reason, and none of us should neglect that, or anyone suffering from it. The stigma surrounding mental illness should be eradicated as soon as possible. Because I love you. We love you. If you decide to let depression get the best of you, who would take care of your kitten back home? Who would comfort the grieving parents who go to bed crying every night? Who would be here to look at the beauty of the world, to taste the tastiest of waffles, see the brightest of days, to see the person who actually loves you? Because mark my words: we all love you. You are never alone, and we've got your back. Hang in there, please! You may go to bed sad every night, may not be loved widely, but at least you're loved DEEPLY. By me.
https://medium.com/know-thyself-heal-thyself/numb-ad34d9cb9f9d
['Daniel A. Teo']
2020-12-10 17:58:13.161000+00:00
['Prose', 'Mental Health', 'Happiness', 'Depression', 'Self Improvement']
Weight Loss Update: Accountability Is the Key to Staying on Track
When you're honest, vulnerable, and willing to share your weaknesses, it turns out that’s the best thing you can do for yourself. Actually, it’s all in the sharing of said weaknesses that’s best. Once you open your mouth and share with others about what you’re struggling with and how you plan to improve your situation, as I did here when I committed to experimenting with new ways to lose weight, you become strongly committed to following through, because no one wants to be called out, or pegged as the one who is always complaining yet doing nothing about changing things up. (Don’t be that person.) I definitely re-learned this over the last three weeks, and in case you’ve been following along on my weight loss experimentation, here are my updates on what I’ve learned, and what has served me well so that perhaps you may begin your own experiments. 1. You guys are incredible and you don’t even know it! That’s because I now feel accountable to all of you — and even though it’s a mystery as to who actually reads my dribble, it’s enough to keep me honest — even if it’s one person. For example, over the weekend (which I crushed by the way!) I was sitting on the sofa contemplating what to eat next — I was well within my intermittent fasting hours of eating so that’s all on the up and up — and I hadn’t worked out yet. Now, to be honest, I was close to bagging the workout because it was an overcast, drizzly kind of day, and what better way to spend that time than curled up on the sofa watching football, or a good movie? Typical move, total no-brainer. Usually. But then I thought of you — yes ‘you’ who is reading this now — and I knew I shared with you my plan, plus I committed to getting those darn star stickers over the weekend, which meant I had to workout. I simply had to. So, with an imaginary nudge from you, I got up and went to workout. And you know what? Of course you know — I felt amazing afterward. I was so damn proud of my little self that I marched right up to my office and put a sticker on my calendar (I swear, it’s the little things that make life great)! Holding yourself accountable to others can be the motivation and the discipline you need that finally gets you doing what needs to be done. Find yourself an accountability partner and woman up! 2. I’ve realized this one true, indisputable fact: You have to want to change more than you want to stay the same. Staying the same is easy — but that doesn’t mean it’s better — and change is oh so hard! It requires daily reinforcement to stick to the path you’d, at some point, want to jump from. (This is where accountability is ginormously beneficial. GI-normously!) If you don’t have an accountability buddy, you’re missing out on quality guilt, which you need if you’ve been struggling. However, and this is super important, you really do have to be fed up with your situation. You have to want to live a better life and you have to know it’s possible — because it is. If you don’t have that fire in your belly, you’re not going to work as hard, and you’ll let the excuses reign supreme. People will line up for a better life, but once they realize the work that they need to put in and the daily discipline that’s required, they back out. They stay the same. They go back to wishing, not doing. You have to really want it. Otherwise, it’s not going to happen. 3. I am loving intermittent fasting. The mere fact that I don’t feel like a waddling penguin anymore is reasoning enough for me to keep this up for-eva! 
If you struggle with late-night eating, this is your White Knight upon his fiery steed. Even when my husband pulls out the chips, or makes nachos, it’s not a struggle to say no because if it’s after 7:30, my brain knows. Once you make a decision, there is no decision left to make, and that eliminates the struggle. Experiment, and keep it simple. I’m working out harder than before — to the point of sweat dripping off my face — and I’m not loading up on late-night calories. That’s it. Simple. And it’s working. And for that, I thank you, oh mystery reader, for giving me the opportunity to share my weaknesses and allow me the opportunity to right my ship. By the way, I crushed the weekend…
https://medium.com/project-slim-waistline/weight-loss-update-accountability-is-the-key-to-staying-on-track-e0b62442e174
['Am Costanzo']
2020-10-29 13:43:19.801000+00:00
['Self', 'Weight Loss', 'Wellness', 'Fitness', 'Life']
Dear Balance, I Need Your Help with My Emotions
Dear Balance, I hope this message finds you with a smile. If I know you, you’re probably down south learning to surf and making new friends or maybe you found a secret hideaway stacked with books. I’m not sure if you heard, but I wrote a letter to Anger. I know I wanted to eliminate it from my life after the way they behaved. But you were smart and blocked me with your calm demeanor, the natural opposite of my fury at the time. With your arms crossed you shook your head while swatting down each of my arguments with your annoying logic. Don’t think I didn’t notice the laughter behind your eyes. You always find humor in my tantrums. As I’m sure you know, you were right. After you returned to your life of adventure and quiet contemplation, I sat down to write. I’ll have you know, I loathed every word I typed — at first anyway. Then Love came in to help and by the end, I felt a weight lift from my chest. Just like you said I would.
https://medium.com/inspired-writer/dear-balance-i-need-your-help-with-my-emotions-e8a17e31c834
['Katrina Paulson']
2020-11-15 21:03:07.857000+00:00
['Self-awareness', 'Self', 'Inspriation', 'Emotional Intelligence', 'Creative Non Fiction']
The new kid on the statistics-in-Python block: pingouin
Photo by Ian Parker on Unsplash The new kid on the statistics-in-Python block: pingouin A quick tour of the library and how it stands out from the old guard Python has a few very well developed and mature libraries used for statistical analysis, with the biggest two being statsmodels and scipy . These two contain a lot (and I mean a LOT) of statistical functions and classes that will cover all your use cases 99% of the time. So why are new libraries still being released? The newcomers often try to fill a niche or to provide something extra that the established competition does not have. Recently, I stumbled upon a relatively new library called pingouin . Some key features of the library include: The library is written in Python 3 and is based mostly on pandas and numpy . Operating directly on DataFrames is something that can definitely come in handy and simplify the workflow. pingouin tries to strike a balance between complexity and simplicity, both in terms of coding and the generated output. In some cases, the output of statsmodels can be overwhelming (especially for new data scientists), while scipy can be a bit too concise (for example, in the case of the t-test, it reports only the t-statistic and the p-value). Many of pingouin 's implementations are direct ports from popular R packages for statistical analysis. The library provides a few new functionalities not found in the other libraries, such as calculating different effect sizes and converting between them, pairwise t-tests and correlations, circular statistics, and more! In this article, I provide a brief introduction to some of the most popular functionalities available in pingouin and compare them to the already established and mature equivalents. Setup To start, we need to install the library by running pip install pingouin . Then, we import all the libraries that we will use in this article. Statistical functionalities In this part, we explore a selection of functionalities available in pingouin , while highlighting the differences compared to the other libraries. t-test Probably the most popular use case of the statistical libraries (or at least on par with linear regression) is the t-test, which is most often used for hypothesis testing when running A/B tests. Starting with scipy , we calculate the results of the t-test by running: ttest_ind(x, y) This generates the following output: In the case of pingouin , we use the following syntax: pg.ttest(x, y) And receive: scipy reports only the t-statistic and the p-value, while pingouin additionally reports the following: degrees of freedom ( dof ), 95% confidence intervals ( CI95% ), the effect size measured by Cohen's d ( cohen-d ), the Bayes factor, which indicates the strength of evidence in favor of the considered hypothesis ( BF10 ), and the statistical power ( power ).
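To make the comparison concrete, here is a minimal, self-contained sketch of the two calls side by side. The data is simulated purely for illustration; in practice, x and y would be whatever two groups you are comparing, and the exact column names in pingouin's output may vary slightly between versions.

import numpy as np
from scipy.stats import ttest_ind
import pingouin as pg

# Simulated example data: two independent samples with slightly different means
np.random.seed(42)
x = np.random.normal(loc=0.0, scale=1.0, size=100)
y = np.random.normal(loc=0.3, scale=1.0, size=100)

# scipy: returns only the t-statistic and the p-value
t_stat, p_val = ttest_ind(x, y)
print(f"scipy: t = {t_stat:.3f}, p = {p_val:.4f}")

# pingouin: returns a one-row DataFrame with dof, CI95%, Cohen's d, BF10 and power as well
print(pg.ttest(x, y).round(3))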
Note: statsmodels also contains a class for calculating the t-test ( statsmodels.stats.weightstats.ttest_ind ), which is essentially a wrapper around scipy 's ttest_ind , with a few modifications to the input parameters. pingouin also contains more variants/extensions of the standard t-test, such as: pairwise t-tests, the Mann-Whitney U test (a non-parametric version of the independent t-test), and the Wilcoxon signed-rank test (a non-parametric version of the paired t-test). Power analysis Another very popular application of statistical libraries is to calculate the required sample size for an A/B test. To do so, we use power analysis. Please refer to my previous article if you are interested in more details. In this example, we will focus on calculating the required sample size for a t-test. However, you can easily adjust the code to calculate any of the other components (significance level, power, effect size). For simplicity, we fix the other 3 parameters to some standard values. Running the code generates the following output: Required sample size (statsmodels): 64 Required sample size (pingouin): 64 To be honest, there is not much difference in terms of power analysis for a standard t-test. We can additionally customize the type of the alternative hypothesis (whether it's a one- or two-sided test). What is worth mentioning is that pingouin enables us to run power analysis for a few tests that are not available in other libraries, such as the balanced one-way repeated measures ANOVA or the correlation test. Plotting pingouin contains a selection of really nicely implemented visualizations; however, most of them are pretty domain-specific and might not be that interesting to the general reader (I do encourage you to take a look at them in the documentation). However, one of the plots can definitely come in handy. In one of my previous articles, I described how to create QQ-plots in Python. pingouin definitely simplifies the process, as we can create a really nice QQ-plot with just one line of code. np.random.seed(42) x = np.random.normal(size=100) ax = pg.qqplot(x, dist='norm') What's more, pingouin automatically handles how to display the reference line, while in the case of statsmodels , we need to do the same thing by providing an argument to the qqplot method of the ProbPlot class. A clear example of simplifying the task! Quoting pg.qqplot 's documentation: In addition, the function also plots a best-fit line (linear regression) for the data and annotates the plot with the coefficient of determination. ANOVA We will now investigate how to run ANOVA (analysis of variance). To do so, we will employ one of the built-in datasets, describing the pain threshold per hair color (interesting idea!). First, we load and slightly transform the dataset. We dropped one unnecessary column and replaced the spaces in the column names with underscores (this will make implementing ANOVA in statsmodels easier). First, we present the pingouin approach, which generates the following output: Before moving further, we should mention a potential benefit of using pingouin : it adds an extra method ( anova ) directly to the pd.DataFrame , so we can skip calling pg.anova and specifying the data argument. For comparison's sake, we also carry out ANOVA using statsmodels . Using this library, it is a two-step process: first, we need to fit an OLS regression, and only then carry out ANOVA.
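Since the article's snippets are embedded as images, here is a rough sketch of how both approaches typically look in code. It is only an illustration under a couple of assumptions on my part: the pain threshold data is loaded via pingouin's read_dataset (the dataset name "anova" and the original column names are assumed), and the renaming mirrors the transformation described above.

import pingouin as pg
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Load the pain threshold per hair color dataset shipped with pingouin (name assumed)
df = pg.read_dataset("anova")
df = df.drop(columns=["Subject"])  # drop the unnecessary column (column name assumed)
df.columns = [col.replace(" ", "_") for col in df.columns]  # e.g. 'Hair color' -> 'Hair_color'

# pingouin: a single call, operating directly on the DataFrame
aov_pg = df.anova(dv="Pain_threshold", between="Hair_color", detailed=True)
print(aov_pg)

# statsmodels: step 1 - fit an OLS model, step 2 - run ANOVA on the fitted model
lm = smf.ols("Pain_threshold ~ Hair_color", data=df).fit()
aov_sm = sm.stats.anova_lm(lm, typ=2)
print(aov_sm)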
The outputs are very similar, although the one from pingouin contains an additional column, which is the effect size measure as partial eta-squared. Similarly to the t-test, there are different variants of ANOVA included in pingouin . Linear Regression As the last example, we inspect one of the most basic machine learning models — the linear regression. To do so, we first load the famous Boston Housing dataset from scikit-learn : from sklearn.datasets import load_boston X, y = load_boston(return_X_y=True) Note: scikit-learn also contains the class to train linear regression, however, the output is the most basic of all the known to me Python libraries. In some cases, that is perfectly fine. However, coming from R, I prefer a more detailed output, which we will see soon enough. To fit a linear regression model in pingouin , we need to run the following line of code. lm = pg.linear_regression(X, y) lm It can’t get simpler than that! And the output looks as follows: The table is quite big and detailed. Personally, I don’t think including R2 and the adjusted variant makes a lot of sense here, as it results in a lot of repetition. My guess is that it was done to keep the output in the form of a DataFrame . Alternatively, we can turn that behavior off by setting as_dataframe=False , what results in creating a dictionary instead. This way, we additionally get the residuals of the model. One extra thing about pingouin ‘s implementation is that we can extract a measure of feature importance, which is expressed as “partitioning of the total 𝑅2 of the model into individual 𝑅2 contribution”. To display them, we need to set relimp to True . It is time to move forward to the statsmodels implementation. The code is a bit lengthier, as we also need to manually add the constant (a column of ones) to the DataFrame containing the independent variables. Please note that this step is not necessary when using the functional syntax (as used in the ANOVA example), however, in that case, the features and the target need to be in one object. The following image presents the output of running the summary method on the fitted object: Anyone coming from R will recognize this format of the summary :) For me, the extra two lines of code are definitely worth it, as the output is much more comprehensive and often saves us the trouble of running a few extra statistical tests or calculating a few measures of the goodness of fit. Other interesting features In this article, I only presented a selection of the functionalities of the pingouin library. Some of the interesting features available are: A wide range of functions for circular statistics, Pairwise post-hocs tests, Different Bayes Factors, A selection of different measures of effect size and a function for converting between them. And more! Conclusions In this article, I presented a brief overview of pingouin , a new library for statistical analysis. I do like the approach taken by the authors, in which they try to simplify the process as much as possible (also by making some things happen automatically in the background, like choosing the best reference line for a QQ-plot or applying corrections to the t-test), while at the same time keeping the output as thorough and complete as possible. While I am still a fan of the statsmodels ’s approach to summarizing the output of linear regression, I do find pingouin a nice tool that can save us some time and trouble in day-to-day data science tasks. I am looking forward to seeing how the library develops with time! 
You can find the code used for this article on my GitHub. As always, any constructive feedback is welcome. You can reach out to me on Twitter or in the comments.
https://towardsdatascience.com/the-new-kid-on-the-statistics-in-python-block-pingouin-6b353a1db57c
['Eryk Lewinson']
2020-06-05 11:54:11.067000+00:00
['Machine Learning', 'Data Science', 'Python', 'Education', 'Statistics']
A TypeScript Project Structure Guide
General tsconfig.json Options Option “extends” Option “extends” specifies another configuration file to inherit from, and may probably be a good starting point of your configuration. There are community maintained base configurations tuned to particular runtime environments that you can install and inherit in your projects. See project @tsconfig/bases. Option “rootDir” “rootDir” specifies the root that is expected to contain all Implementation Files. “rootDir” is expected to contain all Implementation Files, i.e. source files that will be compiled. By default “rootDir” is inferred as “the longest common path of all non-declaration input files”. This means if all your input source files are under “./src”, “./src” is inferred as the “rootDir”. If your source files are under both “./src” and “./test”, then “./” would be the “rootDir”. Don’t confuse “rootDir” with “rootDirs”. They are different. Output Options These options set what the emitted JavaScript is like (so that they are compatible with different runtime environments), what other products are produced, and where are they outputted. Option “outDir” If specified, the output .js (as well as .d.ts , .js.map , etc.) files will be emitted into this directory. In most TypeScript projects, “outDir” should be set. Otherwise, outputs will be emitted besides the sources, polluting the source folders. The directory structure under “rootDir” will be preserved in “outDir”. That means the JavaScript compiled from “rootDir/pathA/pathB/moduleC.ts” will be emitted to “outDir/pathA/pathB/moduleC.js”. Altogether, the rules are: “rootDir” contains all Implementation Files to be compiled, and “outDir” is where the outputs are emitted. The same directory structure under “rootDir” is preserved in “outDir”. Some options discussed in this article allow a very flexible directory structure. However, under no circumstance this rule changes. Option “target” Meant to be set according to the runtime environment that the emitted JS code will be executed in. This option determines what features in the emitted JS are downleveled, i.e. what features implemented in a newer version of the Javascript language should not be emitted in the output Javascript file, but rather be implemented with the syntax and features of an older version, so as to be compatible with an older runtime. Option “target” changes the default values of options “lib”, “module”, and through option “module”, option “moduleResolution”. This means if “target” is set correctly, these options may probably be left to use the default values. Option “module” This option determines the module import & export codes in the emitted JavaScript. It affects the emitted JavaScript at runtime, not the module resolution of TypeScript at compile time. The default value depends on the option “target”. “CommonJS” is what you very likely want. Option “declaration” If this option is set to “true”, the compiler generates type declaration “.d.ts” files. If the project is to be used by other codes, this option should be turned on. Option “declarationMap” If “declaration” is turned on, option “declarationMap” makes the compiler emit source maps for the type declarations (.d.ts), mapping definitions back to the original “.ts” source files. The effect is when the user clicks a definition, editors (such as VS Code) can go to the original .ts file when using features like “Go to Definition”. 
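To tie the options above together, here is a sketch of a tsconfig.json for a hypothetical package whose Implementation Files live under ./src; the @tsconfig/node14 base is just one example of a community base configuration and is assumed to be installed:

{
  // Inherit a community base tuned to the target runtime (assumed installed)
  "extends": "@tsconfig/node14/tsconfig.json",
  "compilerOptions": {
    // All Implementation Files live under ./src
    "rootDir": "./src",
    // Outputs are emitted here, preserving ./src's directory structure
    "outDir": "./dist",
    // Downlevel emitted JavaScript to match the runtime environment
    "target": "ES2020",
    // Emit CommonJS import/export code
    "module": "CommonJS",
    // Generate .d.ts files for consumers of the package...
    "declaration": true,
    // ...and declaration maps so "Go to Definition" lands in the .ts sources
    "declarationMap": true
  },
  "include": ["src"]
}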
“package.json” Options These options define the exposed interface of the package through exports of types, functions, values, objects, as well as the entry point that is executed when the package is imported. Option “types”, “typings” Indicates the package’s type declaration file (.d.ts). “typings” is synonymous with “types” and both can be used. If “index.d.ts” lives at the root of the package, “types” can be omitted, although still recommended. If the package is to be used by external codes, “types” should be indicated. Option “main” This is the primary JavaScript entry point to the package. Commonly, “types” points to the declaration file, and “main” points to the .js emitted from the TypeScript entry point in the “outDir”. Option “exports” As an alternative to the “main” option, this option supports subpath exports and conditional exports. As of today under Node.js v15.0.1, both features are still experimental, therefore will not be further discussed.
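Putting these fields together, a minimal package.json for a hypothetical package could look like the sketch below; the package name and paths are made up for illustration, and the experimental "exports" field is left out for the reasons noted above:

{
  "name": "my-ts-library",
  "version": "1.0.0",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
  "files": ["dist"],
  "scripts": {
    "build": "tsc -p tsconfig.json"
  }
}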
https://medium.com/javascript-in-plain-english/typescript-project-directory-structure-module-resolution-and-related-configuration-options-1d8b87ffec88
['Bing Ren']
2020-11-30 22:17:38.104000+00:00
['Typescript', 'Coding', 'JavaScript', 'Programming', 'Web Development']
We Have All Been Mulla
Photo by Tatiana Rodriguez on Unsplash We Have All Been Mulla Storytelling Thursday I wish to make it clear I have not to this point of my life ever hidden in an open grave. Although I have doubts that I ever would, I do believe that given the only option for saving my life, I would consider it. I do know that, because of my imagination or perception of a situation, I have made changes in what I did. I think all of us have done, to a lesser degree, what Mulla did. I know many women avoid walking alone at night because of what might happen. I know many times when I am alone walking on the street that women look away or down when they get near. They do not know that I am not a threat. I know that I at times feel nervous when I, again, am walking alone and see a group of males all together. If they are all in business suits, I am not as worried but if they are teens or twenty-somethings, I too might avoid eye contact. I might also give a confident hello to let it seem that I am totally comfortable in the situation. How many of us have seen a suspected vicious dog and avoided it only to find out that it was a friendly loving adult puppy? How many of us have been afraid to ask a question of a gruff looking official person only to find them extremely helpful? How many of us have avoided doing something only to find it was fun and enjoyable? How many of us hesitated bringing up a concern about something only to find the people involved helpful and understanding? How many of us have found valuable relationships with people we were initially intimidated by? I am a sports fan. I am a fan of The Buffalo Bills American football team. I am a fan of the New York Yankees baseball team. I have been to games at their home stadiums and have always dressed in jerseys and clothes that indicate I am a fan. I actively cheer for them. I have seen fans of our team, usually drunk at the time, make fans of the other team wearing their colors feel very uncomfortable and even threatened. I have been in other stadiums when wearing my teams jersey been made to feel unsafe. Enough so that people I was with would not let anyone go even to the restroom by themselves. I have had amazing times with fans from other teams at stadiums and in bars and such. When I was in University we used to cheer loudly and passionately for our team. Enough so we would have the other team make mistakes because of how distracted we made them. I also remember having players come up to us after the game and tell us how much they enjoyed playing at our university because of that. They played harder and enjoyed our loyalty. I think to a certain point we are all like Mulla. I also think that most of us are like the horsemen. Each year I go to auto race with a few friends. One of the people is Dave. Dave is a gentle giant. His hands are so big that mine literally disappears when he shakes it. One year there was a disturbance and the Police came by our campsite. Dave was asleep in his tent with his feet sticking out the opening. The people still up, drunk, had made too much noise and the police were there to stop that. One of the policemen saw Dave’s feet and decided he must be hiding and part of the problem. He loudly made him wake and asked him to get out of his tent. Dave woke up and made his way out of the tent. The officer, first seeing that he had obviously been sleeping and then seeing his size looked at him and said: “Oh my sorry, you need to go back in there.” All of us besides Dave broke into laughter. 
Dave just shook his head and crawled back in. As the police were leaving, one of the drunks remarked loudly, "Even the smartest policeman should know better than to wake a sleeping giant." So you see, Dave was there because the policemen were there. The policemen thought they were there because Dave was there. We were there to laugh at the situation.
https://medium.com/know-thyself-heal-thyself/we-have-all-been-mulla-73c75ecdd716
[]
2020-12-11 06:32:23.639000+00:00
['Misunderstanding', 'Energy', 'Short Story', 'Storytelling', 'Awkward Situations']
I Lost My Job A Week Before Thanksgiving
Thanksgiving 2015 is one holiday I will never forget. A week before the holiday, I lost my job. It did not come as a suprise. I was the target of a bully. The bully was my boss, and for eight months the boss had made my life a living hell. The firing almost felt like a relief. Still, it was a sucker punch that sent me reeling. I worried about was my family, and how this would impact them. I worried about myself and how I would find a job again. It was the saddest period of my life. The Blackest Friday It was a hard holiday season. I ate my feelings and all the carbs on Thanksgiving. I justified it by thinking that eventually I wouldn’t be able to afford food so I better eat for survival. During dinner, I listened to family members talk about the deals they were going to get on Black Friday. I couldn’t enjoy any of it. I couldn’t afford any of it. I couldn’t stand any of it. I went home and cried myself to sleep. I woke up on Black Friday. I dealt with the reality that I was jobless, with no prospects in sight. All set against the backdrop of holly jolly holidays. I felt sad and alone. In November and December, hiring for jobs slows down. Employers often focus on wrapping up tasks for the year instead of hiring new employees. I started to panic, knowing that it would likely be months before I could land a job. I started to really sink in to my despair and depression. I stayed in bed all day, claiming to be sick from the Thanksgiving meal. I felt like I couldn’t move, that if I stayed in bed, I wouldn’t have to deal with anything. I drifted in and out of a restless sleep the rest of the day. Saturday finally came. I laid in bed, staring at the ceiling. I couldn’t go through another day like Black Friday. I needed a plan. I got out of bed, grabbed my journal and started to write. Taking back control on that Saturday after Thanksgiving was the best thing I could do for myself. My plan was simple: I concentrated on small things I could do as I began to search for work. These are all the things I did to begin my health process from job loss. Taking positive action was the key to surviving the holidays. The Plan Be there for family. I knew that the only thing worse than job loss at the holidays was ruining the holidays completely with my sadness. For the sake of my family, I decided to buck up and make the most of things. I concentrated on cleaning, decorating, and do all I could to by a Who in Whoville against the Grinchi-ness of being without a job. Turns out, giving into the holiday spirit helped me as much as it helped them. Find friends. Obvious, right? But it is your nearest and dearest who will get you through these moments. I really leaned on my friends. These are the times that define friendship. Volunteer. I had extra time on my hands, so I found a non-profit that needed help with writing, cleaning, and all kinds of odd jobs. It allowed me to stay busy, contribute, and network all at the same time. It gave me a purpose. Make. Bake. Rake. Take. I was low on cash but I wanted to show people I cared for them. So, I made homemade gifts like bookmarks and knitted scarves. I baked treats, using up ingredient in my pantry. I raked and did odd household jobs to help out. I volunteered to take friends, family, kids on errands, or just hung out with them. I gave all the time I had. Refresh. I looked at all the stuff I had, and realized I already had more than I needed. I donated stuff to charity. I cleaned, and re-arraned my furniture. 
I went through my closet and tried on everything before either selling it or keeping it. My kitchen was re-organized. Refreshing made me look at the things I took for granted. Go the distance. I set mini goals for my health and fitness. I went for walks and explored new places. I started to practice holding the plank position and worked my way up to two minutes straight. I played in the snow. Next to the love of family and friends, my health became my most valuable possession. Books. And more books. I read each day. All the books I had stacked around the house were finally getting attention. I even renewed my library card. Filling my head with beautiful words and ideas was a joy. Wrote my way out. The song "Wrote My Way Out" from the Hamilton Mixtape album is a favorite of mine. The song is all about having nothing except words, and using those words to write the way out of difficult circumstances. When I could feel myself becoming depressed, I started writing. The writing helped me through the darkest times. Fix it. Repair instead of despair after job loss became my mantra. Repair the stuff at home that stopped working months ago. Repair relationships. I reached out and told people I was thinking of them. I said I was sorry for past mistakes. Fixing became a new form of freedom. Routine. One of the hardest things when you are suddenly jobless is establishing a routine. I had to re-establish purpose in my life. I forced myself to wake up, exercise, make my bed, and shower every day, no excuses. I took baby steps at first, and it eventually became easier. Routine created a new rhythm for my life. Happy New Year When I got to December 31st, I took a deep breath and made a plan for the new year. My plan was built on the positive things I did during those difficult weeks. With that mindset, I landed a new job by February. It wasn't as much money as I had been making before, but I was grateful for the opportunity. Since I had been living with less, the transition to a new role was easy. For anyone facing job loss, just know that every day you get out of bed and meet the world, it's a win. Look forward to 2020 knowing that action and adventure await, and you will be ready to take that first step. I hope these steps help someone who is facing this difficult reality. These tough moments in life do pass. You will emerge from them stronger and wiser. Just don't give up.
https://medium.com/rejectrevolution/i-lost-my-job-a-week-before-thanksgiving-3a03171d1c16
['Ree Jackson']
2019-11-24 22:32:14.794000+00:00
['Thanksgiving', 'Work', 'Mental Health', 'Jobs', 'Life']
What Does Subtracting a Char From a Char Mean?
Actually, they kind of are variables However, for this example, each char is kind of a variable. What I mean by that is that b and a in this example actually can be represented by ASCII number values summarized by the below chart: Public domain. If you look on the right-hand side there, you’ll see that b corresponds to a decimal value of 98 and a corresponds to a decimal value of 97 . When you subtract the characters a from b , then, you get a result of 1 ! With this, the below code block (written in Java, but the concept applies to many other languages) makes sense: In that case, the output would be Wow! The value of charSubtractionResult and integerSubtractionResult are both 1! . Uppercase and lowercase characters are different Something to keep in mind is that the uppercase and lowercase values for an English letter are different. a , which is lowercase, has the aforementioned value of 97 . Its uppercase counterpart, A , though, has a decimal value of 65 ! When you’re doing math like this, remember that it’s case-sensitive. …Why would you ever need to do this? That’s a fair question. Aside from “impressing” others — emphasis on the quotation marks there — this is actually something I’ve seen in code where developers are trying to keep track of how many characters are in a character array. It’s probably best I give an example to illustrate this: So basically, I created an array of integers of length 26 . All the integers’ values are set to 0 by default. The purpose of this array is to keep a count of how many instances of each letter in the English alphabet there are in the string I’ve defined, favoriteFood , which I’ve set the value of to pasta . In the loop, I want to go through the string and then populate the array with the count for each character. So, for pasta , the array I want would look like this: [2,0,0,0,0,0,0,0,0,0,0,0,0,0,0,1,0,0,1,1,0,0,0,0,0,0] // a,b,c,d,e,f,g,h,i,j,k,l,m,n,o,p,q,r,s,t,u,v,w,x,y,z Basically, in the string pasta , there is one instance of the character p , two of the character a , one of the character s , and finally one of the character t . However, the way that I’m trying to access the value to increment is wrong, and it’s because the favoriteFood.charAt(i) simply returns the character at the position. If the program tries to access the index of a character, then it’ll fail — rather, it needs to access the index of an integer. So subtract a char from a char to get an integer! You read my mind: that’s where the ASCII character mapping comes into play. Here’s the correct code: This is actually a clever way of getting the index of the character’s count in the array. Let’s go through the string pasta to illustrate what I mean. First, we have the letter p , which from the ASCII table above we know has a character value of 112 . When we subtract a , or 97 , we have a value of 112-97=15 . When the program accesses the 15th index of the charCounts array, that is literally the count of how many p characters there are in the string since p is the 16th character of the alphabet! This is a little tricky because remember that array indexes always start at 0. Thus, there are sixteen indices from 0 to 15. Let’s keep going to drive this point home. a has a value of 97 , and when we subtract a , or 97 , from it, we’re left with 0 . charCounts[0] correctly corresponds with the number of instances of a in the string.
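The embedded code listings from the article do not survive here, so below is a hedged reconstruction of the counting loop it describes; the names charCounts and favoriteFood follow the article, while the surrounding class and printing loop are assumptions added to make the sketch runnable:

public class CharCounter {
    public static void main(String[] args) {
        String favoriteFood = "pasta";

        // One slot per lowercase letter, all initialized to 0 by default
        int[] charCounts = new int[26];

        for (int i = 0; i < favoriteFood.length(); i++) {
            // Subtracting 'a' maps 'a' -> 0, 'b' -> 1, ..., 'z' -> 25,
            // turning the char into a valid integer array index
            charCounts[favoriteFood.charAt(i) - 'a']++;
        }

        // Prints 2 for 'a' and 1 each for 'p', 's' and 't'
        for (int i = 0; i < charCounts.length; i++) {
            if (charCounts[i] > 0) {
                System.out.println((char) ('a' + i) + ": " + charCounts[i]);
            }
        }
    }
}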
https://medium.com/swlh/what-does-subtracting-a-char-from-a-char-mean-79da714c1b3b
['Tremaine Eto']
2020-12-24 14:16:14.550000+00:00
['Software Development', 'Coding', 'Technology', 'Software Engineering', 'Programming']
How to Make a Custom Slider Component in Vue
How to Make a Custom Slider Component in Vue Creating a custom slider component can be tricky, especially if you want to create a lean standalone Vue component. The post How to Make a Custom Slider Component in Vue first appeared on Qvault. Creating a custom slider component can be tricky, especially if you want to create a lean standalone Vue component. In this quick article, you’ll learn how to build a fully customizable slider component in Vue. Feel free to swap out the majority of the CSS to get the styling you want, but I’ll give you a good jumping-off point. In fact, the component we’ll be building is the exact same component that we use in production, and you can see it in action in the signup workflow for our coding courses. You can see a full demo on codesandbox here. If you’re like me, you prefer to build your own lightweight UI components, rather than import a bloaty library that you don’t have the ability to modify and change easily. The HTML <template> <div> <div class="slider-component"> <div class="slidecontainer"> <input ref="input" v-model="currentValue" type="range" :min="min" :max="max" class="slider" @input="onInput" > </div> </div> </div> </template> That wasn’t so bad right? We are building out the data model in such a way the in order to use the component we can sue the built-in v-model property. The JavaScript export default { props: { value: { type: Number, required: true }, min: { type: Number, required: true }, max: { type: Number, required: true } }, data(){ return { currentValue: this.value }; }, methods: { onInput() { // this.currentValue is a string because HTML is weird this.$emit('input', parseInt(this.currentValue)); } } }; Like I mentioned above, this sets up the use of v-model . We set the default currentValue to the this.value prop, and by emitting the current value with the @input hook, we are good to go. The CSS You may not be here for exactly my styling, but you’re probably here so that you can swap out the styling. Feel free to copypasta my CSS and swap it our for your own sutff! .slider-component .slidecontainer { width: 100%; } .slider-component .slidecontainer .slider { -webkit-appearance: none; appearance: none; width: 100%; height: 4px; border-radius: 2px; background: #c2c2c2; outline: none; opacity: 0.7; -webkit-transition: .2s; transition: opacity .2s; } .slider-component .slidecontainer .slider:hover { opacity: 1; } .slider-component .slidecontainer .slider::-webkit-slider-thumb { -webkit-appearance: none; appearance: none; width: 18px; height: 18px; background: #D8A22E; cursor: pointer; border-radius: 50%; } .slider-component .slidecontainer .slider::-moz-range-thumb { width: 18px; height: 18px; background: #D8A22E; cursor: pointer; border-radius: 50%; } The important thing to note here is that we’re overriding the browsers defaults and setting up all of our own stuff. 
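Before the full component listing below, here is a quick, hedged sketch of how a parent component can consume this slider with v-model, since the component takes a value prop and emits an input event (the Slider.vue file name and the volume field are assumptions for illustration):

<template>
  <div>
    <Slider v-model="volume" :min="0" :max="100" />
    <p>Current value: {{ volume }}</p>
  </div>
</template>

<script>
// Path and component name are assumptions for this example
import Slider from "./Slider.vue";

export default {
  components: { Slider },
  data() {
    return {
      // v-model keeps this number in sync with the slider's position
      volume: 50
    };
  }
};
</script>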
Full Component <template> <div> <div class="slider-component"> <div class="slidecontainer"> <input ref="input" v-model="currentValue" type="range" :min="min" :max="max" class="slider" @input="onInput" > </div> </div> </div> </template> <script> export default { props: { value: { type: Number, required: true }, min: { type: Number, required: true }, max: { type: Number, required: true } }, data(){ return { currentValue: this.value }; }, methods: { onInput() { // this.currentValue is a string because HTML is weird this.$emit('input', parseInt(this.currentValue)); } } }; </script> <style scoped> .slider-component .slidecontainer { width: 100%; } .slider-component .slidecontainer .slider { -webkit-appearance: none; appearance: none; width: 100%; height: 4px; border-radius: 2px; background: #c2c2c2; outline: none; opacity: 0.7; -webkit-transition: .2s; transition: opacity .2s; } .slider-component .slidecontainer .slider:hover { opacity: 1; } .slider-component .slidecontainer .slider::-webkit-slider-thumb { -webkit-appearance: none; appearance: none; width: 18px; height: 18px; background: #D8A22E; cursor: pointer; border-radius: 50%; } .slider-component .slidecontainer .slider::-moz-range-thumb { width: 18px; height: 18px; background: #D8A22E; cursor: pointer; border-radius: 50%; } </style> Related Custom Vue Components Thanks For Reading! Take computer science courses on our new platform Follow and hit us up on Twitter @q_vault if you have any questions or comments Subscribe to our Newsletter for more programming articles
https://medium.com/qvault/how-to-make-a-custom-slider-component-in-vue-579202388fa3
['Lane Wagner']
2020-11-25 15:28:59.796000+00:00
['Coding', 'Software Development', 'Vuejs', 'Front End Development', 'Vue']
A Beginner’s Guide to Hypothesis Testing
Significance level, P-value & Confidence Level The significance level is represented by the Greek letter alpha (α). The common values used for alpha are 0.1%, 1%, 5%, and 10%. A smaller alpha value, such as 1% or 0.1%, demands stronger evidence before the null hypothesis can be rejected. The hypothesis test returns a probability value known as the p-value. Using this value, we either reject the null hypothesis and accept the alternate hypothesis, or we fail to reject the null hypothesis. p-value = Probability(Data | Null Hypothesis) p-value <= alpha: Reject the null and accept the alternate hypothesis p-value > alpha: Fail to reject the null hypothesis Let us test the above hypothesis by flipping a single coin five times. Experiment Performed: After flipping the coin five times, we got five heads in a row (X = 5). Considering alpha = 0.05, the p-value is Probability(X = 5 | Ho). Result: Out of the 32 possible outcomes, only one has all five heads. So, P(X = 5 | Ho) = 1/32 ≈ 0.03. This means there is only about a 3% chance of getting five heads in a row, which is less than alpha. P(X = 5 | Ho) = 0.03 < alpha (0.05) Since the p-value is smaller than the significance level, the null hypothesis (Ho) is rejected and the alternate hypothesis is accepted. The confidence level is obtained by subtracting the significance level from 1. Confidence level = 1 - significance level (α)
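As a quick sketch, the coin-flip example above can be reproduced in Python with scipy's binomial distribution; the numbers simply mirror the worked example:

from scipy.stats import binom

alpha = 0.05
n_flips = 5
n_heads = 5

# Under the null hypothesis of a fair coin, P(X = 5) = (1/2)^5 = 1/32
p_value = binom.pmf(n_heads, n_flips, 0.5)
print(f"p-value = {p_value:.5f}")  # 0.03125

if p_value <= alpha:
    print("Reject the null hypothesis and accept the alternate hypothesis")
else:
    print("Fail to reject the null hypothesis")

print(f"Confidence level = {1 - alpha}")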
https://medium.com/analytics-vidhya/a-beginners-guide-to-hypothesis-testing-f34d6332c1cc
['Ashwini Atmakuri']
2020-11-08 17:10:23.972000+00:00
['Machine Learning', 'Data Science', 'Artificial Intelligence', 'Statistics', 'Hypothesis Testing']
NaNoWriMo 2020, Day 1
I did not wake up this morning motivated to write. I was sluggish after drinking wine last night with friends in my “pod.” But yesterday I declared on Twitter: So I got up, made coffee, opened my writing notebook, and inked my way through three pages of meandering thoughts including the never helpful, what am I doing with my life? Word count: 378. Not terrible. I updated my word count on the NaNoWriMo website, and saw I’d earned a badge: Spoiler alert: you’ll see this when you update today! Then I walked my dog, talked to a friend about all the things, and when I said it is National Novel Writing Month, he exclaimed, “I want to write a novel!” YES! yes YES yes YES yes YES! “Wonderful!” I said to my friend, “Go sign up on the site and let’s do this together!” My friend is a talented fine artist and hasn’t written a novel before. But that’s what November’s for — to take someday ambitions and turn them into today. That’s what November’s for — to take someday ambitions of writing a novel and turn them into today. At home, I refilled my coffee and hopped onto a Google Meet with my writing group friend, Kelly Dennehy. Since shelter-in-place we’ve met virtually every Sunday to talk about our novels and write for an hour. The possibly, probably rebellious secret sauce to my 8-year winning streak is that I count all the words I write in November. I’d count my grocery list if I needed to get to 50K. I would love to hear how Day One of your novel writing is going! Leave a response. Or write a post for the publication, then leave me a private note and I’ll add you as a writer to the Friends of NaNoWriMo Medium publication. Your novel matters! Julie 💜 ✍️ 💜 Word count: 2076 😌
https://medium.com/nanowrimo/nanowrimo-2020-day-1-8b602615e5df
['Julie Russell']
2020-11-02 13:52:50.578000+00:00
['NaNoWriMo', 'Nonprofit', 'Fiction', 'Writing']
Is the chain different for everyone?
This is what I asked myself on days like today, when I'm about to go to bed, exhausted and fighting to stay awake, yet suddenly remembering that I pledged to get back into writing every day just two days ago. I wonder if not breaking the chain means different things to everyone, if it's an absolute definition, or if it is just me looking for a way to justify my inclination towards falling asleep instead of fulfilling my commitment. There are people who write from Monday to Friday, and there are people who write every day; is one group better than the other? Is it a matter of preference? There are definitely success stories on both sides. I don't have an answer, and there's only one way to figure it out, but so far at least I know which side I'm on.
https://medium.com/thoughts-on-the-go-journal/is-the-chain-different-for-everyone-1058750d810b
['Joseph Emmi']
2018-10-27 23:18:44.182000+00:00
['Discipline', 'Commitment', 'Goals', 'Writing', 'Self Improvement']
Why is everyone so afraid of the Disney-Fox Merger?
Ryan Reynolds tweeted this image saying, “Feels like the first day of Pool,” on the day the merger was complete. It was announced back in December of 2017 that Disney had begun negotiations to acquire 20th Century Fox. This was exciting news for me, mainly because I’m a Marvel fan and this meant that the X-men would be coming home to Marvel and we would finally (hopefully) get to see Magneto tee-off against Iron-Man (because honestly, who wouldn’t love to see that). Unfortunately, I was surprised to see that so many of my fanboy colleagues were not quite as excited as I was. My colleagues feared that Disney acquiring the “darker” half of Marvel would mean that we would no longer get R-rated versions of Marvel properties like Logan or Deadpool. I tried to assure them that Fox would remain it’s own entity and it was good news because this meant that Fox would be sure to stay in business. Disney owning Fox was good news because it meant that in the face of mass media conglomeration and technological forces like Netflix and Amazon, Disney was beefing itself up in order to compete. I actually wrote a lengthy article about it here. Yet, in turning to the blogosphere, I sensed much the same trepidation from journalists, bloggers, commentators, talking heads, and so-called “professionals” as I did from my leigh friends. My initial reaction was to assume that most people must just not know how business works. Or possibly they don’t understand the looming threat of Netflix or Amazon or just streaming in general. Then today I opened up my daily NYT Dealbook Briefing and after scrolling through a bit I saw this: Credit: NYT Dealbook Briefing What people don’t seem to understand is that not so long ago, Disney very easily could have gone bankrupt. Their turnaround is a success that should be cheered, not a corporate takeover to be scorned. How quickly people forget. “Their turnaround is a success that should be cheered, not a corporate takeover to be scorned.” Flashback to 2007, Disney was in a position of speculatively selling off some of it’s assets. Amusement park numbers were down, they had literally a decade of lukewarm box office returns, and Disney Animation studios was in talks of being shutdown. In 2006, Disney purchased Pixar studios. Pixar was coming off a string of huge hits with Toy Story, Monster’s, Inc., and The Incredibles. Execs figured they didn’t need two animation studios, so they planned on shuttering Disney Animation Studios. But Ed Catmull and John Lasseter, not wanting Walt’s legacy to die, set on task to bring Disney Animation back from the dead. I’m assuming you know the rest; Wreck-it Ralph, Frozen, Zootopia, Moana, etc. At the same time, the newly acquired Marvel Studios set out to essentially launch an entirely new studio using Marvel’s class-B characters. Facing bankruptcy, Marvel sold off rights to their best characters like X-men, Fantastic Four, and Spider-man. We all know the story, it was a huge gamble that remarkably paid off. (Infinity War made over $2 billion worldwide.) Similarly, knowing that people craved spectacle, Disney acquired Lucasfilm. People tend to think that relaunching Star Wars was a shoo-in, but that’s not necessarily the case. The Force Awakens was a critical and commercial success, but if you look at Rogue One, The Last Jedi, and Solo; while they may have had commercial success, were largely scorned by critics. As it turns out, trying to please hardcore fans is a lot easier said than done. 
Lastly, and most importantly, consumer choice has exploded over the last few years. People are now able to watch what they want, when they want, and how they want. Studios are no longer competing against each other (if they ever were); they are now competing against new technologies and new business models. The average consumer sees approximately four movies a year (which is predicted to be about 246 million theater goers, or 1.32 billion tickets sold, as of 2018). Compare that to Netflix, which has 139 million subscribers, each paying $12 a month. And that doesn't even include Hulu or Amazon, or people who only watch TV and don't see movies at all. The point is that media conglomeration is an inevitability. Just look at the current stats on media M&A and you'll see my point. Disney is working to save the theater-going experience, not kill it like so many people seem to think. And to be clear, I love Netflix, I love Amazon, I love YouTube, and I love all the choice we have these days. Frankly, I think the competition is making movies better and better. This is a good thing for movie lovers. I just wish more people realized it.
https://chasebanderson.medium.com/why-is-everyone-so-afraid-of-the-disney-fox-merger-63da21c267ae
['Chase B Anderson']
2019-03-21 02:51:52.759000+00:00
['Movies', 'Netflix', 'Disney', 'Amazon', 'Mergers And Acquisitions']
Snapchat, Instagram, Facebook, and Twitter
Alarm sounds. 5am. My eyes open. I grab my phone and instantly start scrolling… There is no doubt in my mind that social media has changed the way individuals communicate with one another. We are constantly liking, commenting on, and creeping on everyone else’s shit. And it seems like we can’t get enough. The question is, how do we take advantage of this? Facebook currently has over 1.23 billion monthly active users and Snapchat reels in a respectable 200 million monthly active users. With numbers like this, it’s hard to disagree that social media is where the people are. And if you know anything about business, you know that you should be focusing on where the people are and will be, not where you think they are or where they used to be. Big corporations are just starting to realize the importance that social media has. Brands like Amazon, Mountain Dew, and Coca-Cola are just now hopping on the Snapchat and Instagram bandwagon, and they’re pleasantly surprised with the response. Now you may ask yourself, “Instagram…Okay I guesssssss, but why would Coke want to go on Snapchat?” A year ago… hell, maybe even 6 months ago I would have agreed with you 100%. But the numbers speak for themselves, 45% of 200 million (90,000,000) Snapchat users are aged between 18–24 years old, and I’d estimate that roughly 85% of the 90,000,000 millennials like to drink Coca-Cola are on Snapchat. So it would be a huge mistake to not advertise on a platform with over 76 million users in their target market. I also know that unless you’re a freshman in college, students save hundreds of dollars by buying books off Amazon, rather than getting ripped off by their school bookstore. And 90% of those students are also on Snapchat and Instagram. Companies that are willing to stick their necks out on the line and take a step back from a traditional marketing approach are currently REAPING in the benefits. These social media platforms are becoming more and more powerful. Snapchat, Instagram, Facebook, and Twitter for our generation are like CBS, NBC, CNN, and FOX for baby boomers. Young adults are spending more and more time in front of their phones and less time in front of their TVs (excluding Netflix). So how do you manage this transition correctly and leverage it to your advantage? 1. Concentrate on your users, audience, customers, and fans. Figure out what they want. Develop a deep connection with them. Build Trust. 2. Improve your skill, expertise, competence, and product. How can you get better? How can you make your product better for the people who use it? 3. Engage with your users Reach out to your users. Interact through comments. Reply to them. Make sure they know you’re message is genuine. By using these three steps, I was able to grow my Twitter following from 300 to over 1000 and my Instagram following from roughly 1000 users to over 4600 in just a 6 month period.
https://medium.com/n2-media/snapchat-instagram-facebook-and-twitter-8cbba882e65b
['Brandon Poplstein']
2016-10-24 17:04:36.332000+00:00
['Digital Marketing Tips', 'Social Media', 'Social Media Marketing', 'Marketing', 'Digital Marketing']
Add a Tab Bar and Navigation Bar with iOS style in your next Flutter app
If you come from iOS and you are used to working with Swift you probably know it is super easy in xCode to add a Tab Bar in your app using the storyboard: you just drag it to your screen and voila it’s done! The main benefit of a iOS Tab Bar is that it allows you multi tasking. In iOS as you select tabs and descend into multiple viewcontrollers the app saves your state. If you move in a new tab the app still keeps the state of the other tab alive and when you change tabs back to the first one you are still at that same descended view. Flutter makes a BottomNavigationBar class available to you however this comes with a typical Android single tasking behaviour: if you move from one tab to the other you basically lose the state in the other tab and if you go back to your original tab, you start from the begining. There are some solution you may want to explore that deal with Activities and Fragments and maintaining their states however these are all very complex solution Today I am going to show you how to easily add a iOS like Tab in your flutter app. We are going to achieve it by leveraging the CupertinoTabBar class Overview of Tab structures Before jumping to the code I would like to provide you a representation of how the various classes we are going to use interact between each other. Here are the classes we are going to use CupertinoApp CupertinoTabScaffold CupertinoTabBar BottomNavigationBarItem CupertinoTabView Structure of iOS like Tab Bar navigation Oh that’s a lot… you said this is easy!? Well it is small amount of code but technically not simple since here we are forcing a behaviour which is not an Android standard Lets jump into the code Adding bottom navigation bar First things to do is to have a CupertinoApp class which wrap all the other widgets. You can add a CupertinoThemeData theme to specify the attributes of your Navigation widgets, for example you can set the the colors of the icons in the bottom. You should also declare as many GlobalKey as many tabs you want to have in your app. We are going to use a CupertinoApp and we will differentiate the UI between iOS and Android, so it is important to ensure there is a Localizations widget ancestor for the Android widget. final GlobalKey<NavigatorState> firstTabNavKey = GlobalKey<NavigatorState>(); final GlobalKey<NavigatorState> secondTabNavKey = GlobalKey<NavigatorState>(); final GlobalKey<NavigatorState> thirdTabNavKey = GlobalKey<NavigatorState>(); void main() { runApp( new CupertinoApp( home: new HomeScreen(), localizationsDelegates: const <LocalizationsDelegate<dynamic>>[ DefaultMaterialLocalizations.delegate, DefaultWidgetsLocalizations.delegate, ], ), ); } After that you need declare your HomeScreen and return a CupertinoTabScaffold. This class has 2 main property: tabBar: this is visible bar at the bottom of your screen. You can add a CupertinoTabBar and inside your CupertinoTabBar you can add your BottomNavigationBarItem items which allow for a text and an icon above all tabBuilder: this is other visible part of the screen above the tabBar and where you will have most of the actions. Here you can return as many CupertinoTabView as you need, in our case we have three tabs therfore we need three CupertinoTabView. 
class HomeScreen extends StatefulWidget { @override _HomeScreenState createState() => _HomeScreenState(); } class _HomeScreenState extends State<HomeScreen> { @override Widget build(BuildContext context) { return CupertinoTabScaffold( tabBar: CupertinoTabBar( items: [ BottomNavigationBarItem( icon: Icon(Icons.home), title: Text('Tab 1'), ), BottomNavigationBarItem( icon: Icon(Icons.map), title: Text('Tab 2'), ), BottomNavigationBarItem( icon: Icon(Icons.account_circle), title: Text('Tab 3'), ), ], ), tabBuilder: (context, index) { if (index == 0) { return CupertinoTabView( navigatorKey: firstTabNavKey, builder: (BuildContext context) => MyFirstTab(), ); } else if (index == 1) { return CupertinoTabView( navigatorKey: secondTabNavKey, builder: (BuildContext context) => MySecondTab(), ); } else { return CupertinoTabView( navigatorKey: thirdTabNavKey, builder: (BuildContext context) => MyThirdTab(), ); } }, ); } } You obviously need to create as many builder as needed, in our case MyFirstTab(), MySecond etc. For the purpose of this tutorial i am just going to have a different color for each tab but of course in your real app you need to add the functionality of your app class MyFirstTab extends StatelessWidget { @override Widget build(BuildContext context) { return Container( color: Colors.redAccent, ); } } class MySecondTab extends StatelessWidget { @override Widget build(BuildContext context) { return Container( color: Colors.greenAccent, ); } } class MyThirdTab extends StatelessWidget { @override Widget build(BuildContext context) { return Container( color: Colors.blue, ); } } If you run the project this is what you are going to get Adding top navigation bar Many app use a top navigation bar to allow users to navigate through the app screen. This is something you can do by adding an AppBar or a CupertinoNavigationBar widget. Let’s ensure to have the right bar depeing if it is Apple or an Android device and let’s create a class and call it every time you need to add it to a tab. This will make the code cleaner and allow you to change the bar modifying only one class First detect if it is an Apple or an Android device import 'dart:io'; void initState() { Platform.isIOS ? isIOS = true : isIOS = false; } Then write a class to show up the relevant bar. Note: it is important that you define an unique heroTag for each iOS navigation bar. This is required for the navigation through the varios screen within the same tab. If you do not have an heroTag the app will not be able to determine which screen to go back to. class MyTopBar extends StatelessWidget { final String text; final TextStyle style; final String uniqueHeroTag; final Widget child; MyTopBar({ this.text, this.style, this.uniqueHeroTag, this.child, }); @override Widget build(BuildContext context) { if (!isIOS) { return Scaffold( appBar: AppBar( title: Text( text, style: style, ), ), body: child, ); } else { return CupertinoPageScaffold( navigationBar: CupertinoNavigationBar( heroTag: uniqueHeroTag, transitionBetweenRoutes: false, middle: Text( text, style: style, ), ), child: child, ); } } } Lastly add the widget in your tab. In order to test it all work lets create a button to move into a new screen to demonstrate the state is conserved while we navigate. 
As per top navigation bar, let’s add specific transition effect depening on the device, ie right to left in in iOS and pop up on Android class MyFirstTab extends StatelessWidget { @override Widget build(BuildContext context) { return MyTopBar( text: "Tab 1", uniqueHeroTag: 'tab1', child: Container( color: Colors.redAccent, child: Center( child: RaisedButton( onPressed: () { Navigator.push( context, isIOS ? CupertinoPageRoute( builder: (context) => PurplePage(), ) : MaterialPageRoute( builder: (context) => PurplePage(), )); }, child: Text('Go to test page', style: TextStyle(fontSize: 20)), ), ), ), ); } } class PurplePage extends StatelessWidget { @override Widget build(BuildContext context) { return MyTopBar( text: "Tab 1", uniqueHeroTag: 'purplePage', child: Container( color: Colors.deepPurple, ), ); } } class MySecondTab extends StatelessWidget { @override Widget build(BuildContext context) { return MyTopBar( text: "Tab 2", uniqueHeroTag: 'tab2', child: Container( color: Colors.greenAccent, child: Center( child: RaisedButton( onPressed: () { Navigator.push( context, isIOS ? CupertinoPageRoute( builder: (context) => BlackPage(), ) : MaterialPageRoute( builder: (context) => BlackPage(), )); }, child: Text('Go to test page', style: TextStyle(fontSize: 20)), ), ), ), ); } } class BlackPage extends StatelessWidget { @override Widget build(BuildContext context) { return MyTopBar( text: "Tab 2", uniqueHeroTag: 'balckPage', child: Container( color: Colors.black, ), ); } } Here is the result Top navigation with iOS transition effects Top navigation with Android transition effects Formatting botton and top navigation bar Colouring the top and bottom navigation bar is not entirely intuitive. We need to use 3 facilities to changes colours: CupertinoTabBar MyTopBar CupertinoTheme First you can control the colour of the bottom bar background/icons using the CupertinoTabBar import 'dart:io'; @override Widget build(BuildContext context) { return CupertinoTabScaffold( tabBar: CupertinoTabBar( activeColor: Colors.black, inactiveColor: Colors.lightGreen, backgroundColor: yellowAccent, items: [ BottomNavigationBarItem(...... .... Secondly you can colour the top navigation bar using the MyTopBar you have just created class MyTopBar extends StatelessWidget { final String text; final TextStyle style; final String uniqueHeroTag; final Widget child; MyTopBar({ this.text, this.style, this.uniqueHeroTag, this.child, }); @override Widget build(BuildContext context) { if (!isIOS) { return Scaffold( appBar: AppBar( backgroundColor: Colors.blue, title: Text( text, style: style, ), ), body: child, ); } else { return CupertinoPageScaffold( navigationBar: CupertinoNavigationBar( backgroundColor: Colors.blue, heroTag: uniqueHeroTag, border: null, transitionBetweenRoutes: false, middle: Text( text, style: style, ), ), child: child, ); } } } The only things which remain outstanding is the colour of the back icon in the top bar. The only wait to control that colour I found is to add a CupertinoTheme to the CupertinoApp (please let me know in the comments if you have found a better way of dealing with it). 
You can use the CupertinoTextThemeData inside the CupertinoThemeData and set the primary colour: void main() { runApp( new CupertinoApp( home: new HomeScreen(), localizationsDelegates: const <LocalizationsDelegate<dynamic>>[ DefaultMaterialLocalizations.delegate, DefaultWidgetsLocalizations.delegate, ], theme: CupertinoThemeData( textTheme: CupertinoTextThemeData( primaryColor: Colors.redAccent ), ), ), ); } Here is the result. The app looks like a rainbow … I believe I have used all the available colours :) Extra functionality In iOS you also have a cool piece of functionality: if you are on a certain tab and you tap that tab's item in the bottom navigation bar again, you are returned to the first screen of that tab. You can achieve this behaviour by using the onTap property of the CupertinoTabBar class (note that this works only on iOS and not on Android): onTap: (index) { // back home only if not switching tab if (currentIndex == index) { switch (index) { case 0: firstTabNavKey.currentState.popUntil((r) => r.isFirst); break; case 1: secondTabNavKey.currentState.popUntil((r) => r.isFirst); break; case 2: thirdTabNavKey.currentState.popUntil((r) => r.isFirst); break; } } currentIndex = index; }, Have a go, put all this code in your app, and add your destination screens. If you run your app you should get something like this. Conclusion You have seen that it is quite easy to achieve Tab Bar navigation in Flutter. This will definitely make your iOS users super happy; however, please also consider that Android users are not familiar with this navigation paradigm, so keep that in mind when designing your app. Please show your support by clapping for this article if you liked it!
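One follow-up note on the onTap snippet above: currentIndex is not declared anywhere in the listings, so below is a minimal sketch of how it could be wired into the existing _HomeScreenState; the field name simply mirrors the snippet and is otherwise an assumption:

class _HomeScreenState extends State<HomeScreen> {
  // Tracks the currently selected tab so that a second tap on the same
  // tab item can pop that tab's navigator back to its first screen
  int currentIndex = 0;

  // build() stays as shown earlier, with the CupertinoTabBar's onTap
  // callback reading and updating currentIndex as in the snippet above
}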
https://medium.com/flutter-community/add-a-tab-bar-and-navigation-bar-with-ios-style-in-your-next-flutter-app-bf97b1e27e3a
[]
2020-06-02 22:01:00.835000+00:00
['Android App Development', 'Mobile App Development', 'iOS App Development', 'Flutter']
Film review: How Coded Bias reveals the racism within algorithms
When Joy Buolamwini, a PhD student at MIT, was conducting facial recognition experiments using artificial intelligence, she ran into one key setback: the technology couldn’t accurately process her face. Investigating further, she found out that these programs struggle to register women more than men, and also have a very difficult time identifying black or brown faces. When Buolamwini placed a white mask over her face and again used the facial recognition software, the AI immediately identified what was in front of the camera as a face. She realized the AI she was using had a race problem: Because machine learning is only as robust as the dataset fed into its system, allowing it to recognize objects or people based on the photos already in its “brain,” if the information it’s given is of only white faces, it won’t recognize black faces. And because AI tech is predominantly created by white male scientists and engineers, very few diverse photos are used to develop the foundation of AI datasets. Buolamwini’s story is the main current running through the new documentary Coded Bias, debuting in Canada at the online Hot Docs Film Festival. Directed by Shalini Kantayya, who previously quarterbacked a doc on clean energy, Coded Bias uncovers the dirty secret behind AI, but not just what Buolamwini discovered about facial recognition’s bias. A slew of other algorithm-based technologies push out diverse populations: trained automated risk-profiling systems disproportionately identify Latinx people as illegal immigrants, and credit scoring algorithms disproportionately select black people as risks and prevent them from buying homes, getting loans, or finding jobs. Boulamwini told Frontline in 2019 (a quote which isn’t part of the doc): “When these systems fail, they fail most the people who are already marginalized, the people who are already vulnerable. And so when we think about algorithmic bias, we really have to be thinking about algorithmic harm. That’s not to say we don’t also have the risk of mass surveillance, which impacts everybody. We also have to think about who’s going to be encountering the criminal justice system more often because of racial policing practices and injustices.” That’s why she and other like-minded AI analysts formed the Algorithmic Justice League in order to publicize the racial and gender bias embedded within AI systems. This is the kind of civic action that can be encouraging to those who think certain technologies will always hide in the shadows, their inner workings shrouded in mystery, only to spit out results that the public accepts without question. The film also goes across the pond in London to profile Silkie Carlo, the director of Big Brother Watch, an organization that monitors the use of facial recognition AI by British law enforcement. Carlo explains how civilian civil liberties are violated with this technology, and points out the growing number of citizens being misidentified. For example, Big Brother Watch found that the use of photo biometrics produced 2% identification accuracy for the Metropolitan police force, while South Wales police is only 8% accurate. Coded Bias does a fantastic job in warning us about the sly racism found in technologies that will only become more popular in the coming decade. After all, we have AI tech embedded in Siri/Alexa, camera phones, chatbots, Google Images, etc, and if we want to level the playing field and ensure racism doesn’t creep further into this sector, we can’t just stand still. 
What I would have liked to see more of in this film, though, is the perspective of those white male scientists creating AI tech for major firms such as Amazon and Google. Are they going to bring more diversity to their datasets? How do they respond to Buolamwini's discovery? If there is going to be change in this field, the major companies have to own up to their own biases, but we never get to hear them explain their position on camera. This doc is inspiring for those of us who have long been interested in AI and its future. But it's also frightening to recognize how biased this innovation can be, and how much we rely on the determination of researchers such as Buolamwini and her Algorithmic Justice League, who are focused on tipping the scales in favour of racialized voices who have long been discriminated against offline and are now being discriminated against online, within AI systems, too. You can still catch Coded Bias on the Hot Docs website by purchasing tickets to stream the film here.
https://davidsilverberg.medium.com/film-review-how-coded-bias-reveals-the-racism-within-algorithms-da8989052470
['David Silverberg']
2020-06-06 12:52:48.473000+00:00
['Machine Learning', 'Anti Racism', 'Artificial Intelligence', 'Documentary', 'Racism']
Which Beatle Had The Most Musical Influence Within The Beatles?
Using Convolutional Neural Networks And Scikit-Optimize To Predict Who Had The Largest Influence Within The Beatles The Beatles’ London rooftop concert, 1969 Personal Background Since the late summer of 2019, I have become a huge fan of The Beatles. It all started when I discovered the conspiracy theory that Paul is Dead and dove deep into the rabbit hole of The Beatles’ music and personal stories. Now, I don’t believe the conspiracy, but I am glad I looked into it as it was a great way to kill time on a train ride to New York City! As a python enthusiast and a Master’s student in Applied Data Science, I am always trying to think of new projects to work on to build both my data science and presentation skills. One day I was listening to the song “Something” off of the Abbey Road album and it popped into my head… “Which Beatle had the heaviest influence on the Beatles discography?” I decided to approach this project by using each Beatle’s post-Beatles discography as the training and validation data for the model to isolate each artist’s musical voice. The entire Beatles discography would be the data that I would use to predict who had the most influence. I knew that this project would be a major challenge for a few reasons. One, collecting the audio data for each song in the discography of each artist, a post-Beatles breakup would not come easily through an API. This would be a challenge because the data would take additional processing and a good chunk of computer memory. Two, I had never worked with audio data before, so this type of modeling would be different than what I have done in the past. Three, analyzing the accuracy of the results when there was no known answer to who had the most musical influence on each and every Beatles song. Ultimately, this project may lead to some interesting insights about the band members. Data Collection Audio data for music can be hard to collect as most artist’s music is licensed to companies such as Spotify, Apple Music, Pandora, etc. To get each song needed for this project, YouTube was going to be the best source. youtube-dl was created to download videos from YouTube and has a python package built into it for extracting videos in python scripts. Using this package, I created two functions to help download all the songs in the album playlists and save them in the proper directory. To get the list of each artist’s albums released after the Beatles’ last day in the recording studio together, I used Wikipedia. When getting the playlist links for each album, I made sure to check the songs on the playlist against the tracklist for the album on Wikipedia. Since downloading all the songs for an artist would take about an hour or more, I decided to create individual download scripts for each artist so I could run one script and build the download script for another at the same time. All of the download scripts can be found at the GitHub link at the bottom of the article and titled “download_[artist]_albs.py”. Here’s an example: Mel Spectrogram image of the song “A Day In The Life” off Sgt. Pepper’s Lonely Hearts Club Data Processing Each song was either saved down as a .mp4, .webm, or .mkv file, but this wasn’t going to matter as I could convert the files into a .wav file using a function that leverages the AudioSegment module of the pydub package. I also wrote a helper script that walks through each artist’s directory and converts the files, while using multiprocessing’s Pool class to run the tasks in parallel. 
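The conversion helpers just mentioned are embedded gists in the original post and do not come through here; the snippet below is only a hedged sketch of that kind of pydub plus multiprocessing helper, with function names and directory layout as assumptions rather than the author's code:

import os
from multiprocessing import Pool
from pydub import AudioSegment

def convert_to_wav(file_path):
    """Convert a downloaded audio file (.mp4/.webm/.mkv) to .wav."""
    wav_path = os.path.splitext(file_path)[0] + ".wav"
    AudioSegment.from_file(file_path).export(wav_path, format="wav")
    return wav_path

def convert_artist_directory(artist_dir, processes=4):
    """Walk an artist's directory and convert every song in parallel."""
    files = [os.path.join(root, name)
             for root, _, names in os.walk(artist_dir)
             for name in names
             if name.endswith((".mp4", ".webm", ".mkv"))]
    with Pool(processes) as pool:
        pool.map(convert_to_wav, files)

# Example call (hypothetical directory layout):
# convert_artist_directory("songs/mccartney")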
To transform the data into an image for classification in our CNN, we can use the librosa package, which works very well with audio files and has some built-in functions for running short-time Fourier transform and mel-scaled spectrogram. We will use all of the aforementioned functions for converting our audio files into images that we can use in our image classification neural network. To create the images from the .wav files, I’ll create two plotting functions to load and plot the sound frequency of the audio: Then, leverage multiprocessing’s Pool class again to speed up the processing of the images, and save to the respective training and test folders: Now that we have our images via stft and mel-scaled spectrogram processes, we can see which process helps us better train the model. Short-Time Fourier Transform of “Working Class Hero” by John Lennon Model Training To begin training a CNN to classify which song’s image belongs to which artist, we first need to load our training and validation data. Keras provides an ImageDataGenerator class that will allow us to use a method called flow_from_dataframe, to load images in batches based on the file and artist names from a pandas DataFrame object. Since I planned on multiple iterations of model testing, I decided to create a function that creates either the training and validation or test sets of images to use as input to my model: Now it is time to create the convolutional neural network using Keras. To be able to tweak the model to have more convolutional layers, different convolutional activation functions, dropout rates, optimizers, etc. quickly, I created a function. This function builds and compiles a two-dimensional convolutional neural network, and even allows you to pass parameters to the optimizer. The benefit of creating this function, outside of the build speed, is the ability to use it in an optimizer to find the best hyperparameters of a CNN. So there are now functions to retrieve the training and testing images and build our classification model. The next step now involves passing the training data to our model and find the best hyperparameters of the model. To attempt to find the hyperparameters as fast as possible for the neural network, we will use scikit-optimize: Scikit-Optimize, or skopt , is a simple and efficient library to minimize (very) expensive and noisy black-box functions. It implements several methods for sequential model-based optimization. skopt aims to be accessible and easy to use in many contexts. To find the parameters such as batch size, epochs, learning rate, image size, etc., we will use scikit-optimize’s gp_minimize function, which attempts to minimize a function that is passed to it by applying Bayesian optimization using Gaussian processes. Since gp_minimize needs a function to minimize, a function named fitness will be created to create the CNN model and training and validation data and then train the neural network. The function will also test if this version has the highest validation accuracy compared to the previously trained versions, and save the model. You will notice that this function returns the negated validation accuracy since gp_minimize is attempting to minimize the fitness function. Now it's time to put it all together and train and tune the CNN. gp_minimize will run 40 times to attempt to find the best fit of hyperparameters. 
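To make the training-and-tuning flow above concrete, here is a minimal sketch of what the parameterized model builder and the fitness objective could look like; this is my own illustration under stated assumptions rather than the author’s exact code. It assumes train_gen and valid_gen already exist from the flow_from_dataframe helper, four output classes, and placeholder search ranges, layer sizes, and file names.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from skopt import gp_minimize
from skopt.space import Real, Integer
from skopt.utils import use_named_args

NUM_CLASSES = 4  # lennon, mccartney, harrison, starr


def build_cnn(conv_layers, learning_rate, dropout, image_size):
    """Build and compile a simple 2-D CNN with a variable number of conv blocks."""
    model = keras.Sequential([keras.Input(shape=(image_size, image_size, 3))])
    filters = 16
    for _ in range(conv_layers):
        model.add(layers.Conv2D(filters, (3, 3), activation="relu", padding="same"))
        model.add(layers.MaxPooling2D((2, 2)))
        filters = min(filters * 2, 256)
    model.add(layers.Flatten())
    model.add(layers.Dropout(dropout))
    model.add(layers.Dense(NUM_CLASSES, activation="softmax"))
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model


# Search space roughly matching the hyperparameters discussed in the article.
dimensions = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Integer(2, 6, name="conv_layers"),
    Real(0.0, 0.5, name="dropout"),
    Integer(10, 80, name="epochs"),
]

best_accuracy = 0.0


@use_named_args(dimensions=dimensions)
def fitness(learning_rate, conv_layers, dropout, epochs):
    global best_accuracy
    # train_gen / valid_gen would come from the flow_from_dataframe helper,
    # built with the same target_size used here (256 x 256 in this sketch).
    model = build_cnn(conv_layers, learning_rate, dropout, image_size=256)
    history = model.fit(train_gen, validation_data=valid_gen,
                        epochs=int(epochs), verbose=0)
    val_acc = max(history.history["val_accuracy"])
    if val_acc > best_accuracy:        # keep only the best model on disk
        best_accuracy = val_acc
        model.save("skopt_best.h5")
    keras.backend.clear_session()      # free memory between trials
    return -val_acc                    # gp_minimize minimizes, so negate accuracy


result = gp_minimize(func=fitness, dimensions=dimensions,
                     n_calls=40, acq_func="EI", random_state=1)
print("Best validation accuracy:", -result.fun)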
After running multiple iterations of this code with both the stft and mel spectrogram images, the best validation accuracy found was 67.33% with stft and 69.33% with mel spectrogram. The hyperparameters used with the best mel spectrogram model were as follows: Learning Rate: 1.0 Epochs: 79 Batch Size: 16 Image dimensions: (256, 256) Convolutional Layers: 5 Next, it is time to predict which artist had the most influence on a particular Beatles song. Interpreting the Results First, I load the best model and make predictions on the Beatles data:

from model_staging import fetch_images_dataframe
import pandas as pd
import numpy as np
import keras

train_df = pd.read_csv("train_df.csv")
test_df = pd.read_csv("test_df.csv")
train_path = "C://Users//Alec//MyPython//Beatles/train_melspec"
test_path = "C://Users//Alec//MyPython//Beatles/test_melspec"

# load the Beatles data to predict on
test_gen = fetch_images_dataframe(test_df, x_col="song", y_col="artist", directory=test_path,
                                  batch_size=16, target_size=(256, 256), class_mode="categorical",
                                  shuffle=False, seed=1, save_format="png")

# load the model
best_model_adam = keras.models.load_model("models/melspec/skopt_best_adamV3.h5")

# predict the probabilities for each song
probabilities = best_model_adam.predict_generator(test_gen)

# get the prediction based on largest probability
preds = np.argmax(probabilities, axis=1)

# load in the training data to get the artist indices
train_gen, valid_gen = fetch_images_dataframe(train_df, x_col="song", y_col="artist", directory=train_path,
                                              batch_size=16, target_size=(256, 256), class_mode="categorical",
                                              shuffle=True, seed=1, validation_split=0.2, save_format="png")
class_map = train_gen.class_indices

# create a dataframe of songs and predictions
pred_df = pd.DataFrame(data={"songs": test_gen.filenames, "predictions": preds})

# now convert the prediction column to the artist name
mapping = {v: k for k, v in class_map.items()}
pred_df["predictions"] = pred_df["predictions"].map(mapping)

# merge the pred_df with the test_df in order to bring the album name in for each song
pred_df = pred_df.merge(test_df[["album", "song"]], left_on="songs", right_on="song")
pred_df.drop("song", axis=1, inplace=True)

# join the prediction probabilities with the prediction dataframe
pred_df = pred_df.join(pd.DataFrame(probabilities))
pred_df.rename(mapping, axis=1, inplace=True)
pred_df[['Lennon', 'harrison', 'mccartney', 'starr']] = pred_df[['Lennon', 'harrison', 'mccartney', 'starr']].round(4)

Let’s take a look at the total predictions per artist: Lennon: 28 Harrison: 21 McCartney: 120 Starr: 47 The model seems to be highly favoring Paul McCartney as the main influence in the discography. It’s also very surprising to see John Lennon so low and Ringo Starr being predicted for almost 50 Beatles songs. This goes wildly against the mainstream knowledge that Lennon was as big of an influence as McCartney, and that Ringo Starr was a rhythm drummer that could have been replaced with any old drummer. Let’s also take a look at the breakdown per album. Revolver and Magical Mystery Tour seem to be the most evenly distributed albums, with McCartney dominating the White Album, Abbey Road, Sgt. Pepper’s Lonely Hearts Club Band, and Yellow Submarine. Let’s see if there are any interesting trends of influence throughout the years. Ringo is predicted to have the most influence in 1963 over the course of the Please Please Me and With The Beatles albums. George Harrison leads the year 1966 when the Beatles only released the Revolver album.
Outside of those years, McCartney dominates the remaining years, especially the last four years of the group’s existence. It’s widely known that the rest of the band, especially John Lennon, were getting fed up with Paul’s stranglehold over the songwriting towards the end of the band’s tenure, which ultimately led to The Beatles breaking up. It is interesting to see the model reflects his influence in those years. Conclusion and Lessons Learned Overall, it is interesting to see Paul McCartney dominate the influence, and Ringo Starr is an easy second influence over Lennon and Harrison. I would have to think that this is due to the CNN overfitting to the training data as McCartney and Starr made up 39% and 30% of the training and validation data. This is due to their prolonged careers since they are still both alive as of October 2020. It would be interesting to see how one could normalize the class weights in the model training to ensure the CNN does not overfit the skewed training data. I enjoyed working with audio data for the first time and having to figure out new methods of acquiring and processing the data for model training. Scikit-optimize was also a great help in model training and will definitely be used in the future in favor of scikit-learn’s grid search method. Please share any methods I could have used in data processing or model building to improve the predictions on The Beatles songs! All code can be found on my GitHub. Please reach out to me on LinkedIn if you would like to connect!
https://alecschneider33.medium.com/which-beatle-had-the-most-musical-influence-within-the-beatles-b26340e96acb
['Alec Francis']
2020-10-29 21:18:06.811000+00:00
['The Beatles', 'Python', 'Audio Processing', 'Scikit Optimize', 'Convolutional Neural Net']
Top 40 iOS Swift Questions Solutions
In This Article I have Covered 40 Questions | Swift Programming 1. What is Type Inference? In short, it's an ability of Swift. You don't always need to write the types of the variables and constants you make in your code. For example:

// Swift knows it's an Int type
var age = 40 // Int
// You don't always need to spell it out like below
var age: Int = 40

2. What are Generics? Generic code enables you to write flexible, reusable functions and types that can work with any type, subject to requirements that you define. Understanding it with an example: suppose you want to swap two values of type Int; let's write a non-generic function:

func swapTwoInts(_ a: inout Int, _ b: inout Int) {
    let temporaryA = a
    a = b
    b = temporaryA
}

var num1 = 4
var num2 = 5
swapTwoInts(&num1, &num2)

Now, suppose you want to swap two Double values or two String values. You would need to write another function for that, because the above function accepts only the Int type. What if we had a function which accepted any type of values and swapped them? This is what generics do. Now let's do the same thing with a generic function:

func swapTwoValues<T>(_ a: inout T, _ b: inout T) {
    let temporaryA = a
    a = b
    b = temporaryA
}

var num1 = 3
var num2 = 4
swapTwoValues(&num1, &num2)

var str1 = "sdf"
var str2 = "dafdf"
swapTwoValues(&str1, &str2)

Now you can swap any type of values; you don't need to write a different function for each different type of values you want to swap. T is a placeholder, called a type parameter. The Array type we use in Swift is also a generic type, Array<Element>, as is Dictionary<Key, Value>. 3. What are Protocols? A protocol is a blueprint of methods, properties, and other requirements that suit a particular task, and it can be adopted by a class, structure or enumeration. A protocol does not include any implementation!!!! A type which adopts the protocol should have all the methods which are present in the given protocol, and this action is called conforming to the protocol. Its syntax looks like:

protocol Vehicle {
    func accelerate()
    func stop()
}

class Unicycle: Vehicle {
    var peddling = false
    func accelerate() {
        peddling = true
    }
    func stop() {
        peddling = false
    }
}

4. What are Tuples? Sometimes data comes in pairs or triplets. An example of this is a pair of (x, y) coordinates on a 2D grid. Similarly, a set of coordinates on a 3D grid is comprised of an x-value, a y-value and a z-value. In Swift, you can represent such related data in a very simple way through the use of a tuple. let coordinates: (Int, Int) = (2, 3) 5. What about Mutability in Swift? Constants (let) stay constant in Swift, while variables (var) can vary. 6. What are Subscripts? With subscripts you can quickly access the member elements of collections. A subscript consists of: The name of the collection, such as scores Two square brackets [ and ] and A key or index inside the brackets By default, you can use subscripts with arrays, dictionaries, collections, lists and sequences. You can also implement your own with the subscript function.

subscript(parameterList) -> ReturnType {
    get {
        // return someValue of ReturnType
    }
    set(newValue) {
        // set someValue of ReturnType to newValue
    }
}

7. What is an Optional? Optionals are Swift’s solution to the problem of representing both a value and the absence of a value. An optional is allowed to hold either a value or nil. 8. In what ways could you unwrap an optional?
We can unwrap an optional in the following ways: By Optional Binding By Force Unwrapping By Guard Statement By Nil Coalescing Optional Binding (if let) It's the simplest way to unwrap an optional.

var authorName: String? = "Mohd Yasir"

if let authorName = authorName {
    print("Author name is \(authorName)")
} else {
    print("No Author Name")
}

By Force Unwrapping To force unwrap, we use "!".

var authorName: String? = "Mohd Yasir"
print("Author name: \(authorName!)")

Guard Statement Sometimes you want to check a condition and only continue executing a function if the condition is true, such as when you use optionals. Imagine a function that fetches some data from the network. That fetch might fail if the network is down. The usual way to encapsulate this behavior is using an optional, which has a value if the fetch succeeds, and nil otherwise. Swift has a useful and powerful feature to help in situations like this: the guard statement.

func testingGuard(_ name: String?) {
    guard let unwrappedName = name else {
        print("You didn't enter any name")
        return
    }
    print("Hello, \(unwrappedName)")
}

Nil Coalescing

let name: String? = nil
let unwrappedName = name ?? "Unknown"

9. What kind of memory allocations take place in Swift? In short: Stack and Heap. When you create a reference type such as a class, the system stores the actual instance in a region of memory known as the heap. Instances of a value type such as a struct reside in a region of memory called the stack. 10. What is the difference between stack and heap memory? The system uses the stack to store anything on the immediate thread of execution; it is tightly managed and optimized by the CPU. When a function creates a variable, the stack stores that variable and then destroys it when the function exits. Since the stack is so strictly organized, it’s very efficient, and thus quite fast. The system uses the heap to store instances of reference types. The heap is generally a large pool of memory from which the system can request and dynamically allocate blocks of memory. Lifetime is flexible and dynamic. The heap doesn’t automatically destroy its data like the stack does; additional work is required to do that. This makes creating and removing data on the heap a slower process, compared to on the stack. 11. What is an In-Out Parameter? Function parameters are constants by default, which means they can’t be modified. To illustrate this point, consider the following code:

func incrementAndPrint(_ value: Int) {
    value += 1
    print(value)
}

This results in an error: Left side of mutating operator isn't mutable: 'value' is a 'let' constant You can fix this by marking the parameter as inout, a behavior known as copy-in copy-out or call by value result. You do it like so:

func incrementAndPrint(_ value: inout Int) {
    value += 1
    print(value)
}

inout before the parameter type indicates that this parameter should be copied in, that local copy used within the function, and copied back out when the function returns. Ampersand (&) You need to make a slight tweak to the function call to complete this example. Add an ampersand (&) before the argument, which makes it clear at the call site that you are using copy-in copy-out:

var value = 5
incrementAndPrint(&value)
print(value)

12. What is the difference between synchronous and asynchronous? Asynchronous means you can execute multiple things at a time, and you don’t have to finish executing the current thing in order to move on to the next one. Synchronous basically means that you can only execute one thing at a time. 13.
How could you pass data from one ViewController to another? You can pass data between view controllers in Swift in 6 ways: By using an instance property (A → B) By using segues (for Storyboards) By using instance properties and functions (A ← B) By using the delegation pattern By using a closure or completion handler By using NotificationCenter and the Observer pattern 14. What is a Completion Handler in Swift? A completion handler is a closure (“a self-contained block of functionality that can be passed around and used in your code”). It gets passed to a function as an argument and then called when that function is done. The point of a completion handler is to tell whatever is calling that function that it’s done and optionally to give it some data or an error. Sometimes they’re called callbacks since they call back to whatever called the function they’re in. Example:

import UIKit

let firstVC = UIViewController()
let nextVC = UIViewController()

firstVC.present(nextVC, animated: true, completion: { () in
    print("Welcome")
})

15. Which compiler does Swift use? The Swift compiler uses LLVM. 16. What is Lazy in Swift? Simply put, “a lazy stored property is a property whose initial value is not calculated until the first time it is used.” 17. Explain Core Data? ~Apple Documentation It’s a framework 💯 Apple says: “Use Core Data to save your application’s permanent data for offline use, to cache temporary data, and to add undo functionality to your app on a single device.” 😮 Core Data gives you these features: Persistence, Undo and Redo of Individual or Batched Changes, Background Data Tasks, View Synchronization, Versioning and Migration, etc. Creating a Core Data Model The first step in working with Core Data is to create a data model file. Here you define the structure of your application’s objects, including their object types, properties, and relationships. You can create the Core Data model while creating the project by checking the box “Use Core Data”. Core Data Stack After you create a data model file, set up the classes that collaboratively support your app’s model layer. These classes are referred to collectively as the Core Data stack. There are a few Core Data components: An instance of NSManagedObjectModel represents your app’s model file describing your app’s types, properties, and relationships. An instance of NSManagedObjectContext tracks changes to instances of your app’s types. An instance of NSPersistentStoreCoordinator saves and fetches instances of your app’s types from stores. An instance of NSPersistentContainer sets up the model, context, and store coordinator all at once. Different Data Types in Core Data Many apps need to persist and present different kinds of information. Core Data provides different attributes, including those common for all databases, such as the Date or Decimal type, and non-standard attributes handled with the Transformable type. 18. What is a Sentinel Value? A valid value that represents a special condition such as the absence of a value is known as a sentinel value. That’s what your empty string would be. 🙂 19. Automatic Reference Counting (ARC) Swift uses ARC to track and manage your app’s memory usage. ARC automatically frees up the memory used by class instances when those instances are no longer needed, so you do not need to think about memory management.
But in a few cases ARC requires more information about the relationships between parts of your code in order to manage memory for you. Reference counting applies only to instances of classes. Structures and enumerations are value types, not reference types, and are not stored and passed by reference. I Will PUBLISH an entire article on this topic with more detail! 20. What is a nested optional? Consider the following nested optional — it corresponds to a number inside a box inside a box inside a box. 21. What are Property observers? A willSet observer is called when a property is about to be changed while a didSet observer is called after a property has been changed. Their syntax is similar to getters and setters:

struct S {
    var stored: String {
        willSet {
            print("willSet was called")
            print("stored is now equal to \(self.stored)")
            print("stored will be set to \(newValue)")
        }
        didSet {
            print("didSet was called")
            print("stored is now equal to \(self.stored)")
            print("stored was previously set to \(oldValue)")
        }
    }
}

var s = S(stored: "first")
s.stored = "second"

// Output:
// willSet was called
// stored is now equal to first
// stored will be set to second
// didSet was called
// stored is now equal to second
// stored was previously set to first

22. When would you say that an app is in the active state? An app is said to be in the active state when it is accepting events and running in the foreground. 23. What is the difference between viewDidLoad and viewDidAppear? viewDidLoad is called when the view is loaded into memory. viewDidAppear is called when the view is visible and presented on the device. 24. What do you mean by Concurrency? Concurrency is a condition in a program where two or more tasks are defined independently, and each can execute independent of the other, even if the other is also executing at the same time. 25. Which are the ways of achieving concurrency in iOS? The three ways to achieve concurrency in iOS are: Threads Dispatch queues Operation queues 26. What is a Thread? According to Apple: “Threads are especially useful when you need to perform a lengthy task, but don’t want it to block the execution of the rest of the application. In particular, you can use threads to avoid blocking the main thread of the application, which handles user interface and event-related actions. Threads can also be used to divide a large job into several smaller jobs, which can lead to performance increases on multi-core computers.” 27. What is a Dispatch Queue, in basics? According to Apple: An object that manages the execution of tasks serially or concurrently on your app’s main thread or on a background thread. 28. Difference between Foreground and Background? The foreground contains the applications the user is working on, and the background contains the applications that are behind the scenes. 29. Classes Classes are reference types, as opposed to value types. You create a class like:

class Person {
    var firstName: String
    var lastName: String

    init(firstName: String, lastName: String) {
        self.firstName = firstName
        self.lastName = lastName
    }

    var fullName: String {
        return "\(firstName) \(lastName)"
    }
}

let john = Person(firstName: "Johnny", lastName: "Appleseed")

In Swift, an instance of a structure is an immutable value whereas an instance of a class is a mutable object. Classes are reference types, so a variable of a class type doesn’t store an actual instance — it stores a reference to a location in memory that stores the instance. Here Comes Stack vs Heap!
When you create a reference type such as a class, the system stores the actual instance in a region of memory known as the heap, while instances of a value type such as a struct reside in a region of memory called the stack. The system uses the stack to store anything on the immediate thread of execution; it is tightly managed and optimized by the CPU. When a function creates a variable, the stack stores that variable and then destroys it when the function exits. Since the stack is so strictly organized, it’s very efficient, and thus quite fast. The system uses the heap to store instances of reference types. The heap is generally a large pool of memory from which the system can request and dynamically allocate blocks of memory. Lifetime is flexible and dynamic. The heap doesn’t automatically destroy its data like the stack does; additional work is required to do that. This makes creating and removing data on the heap a slower process, compared to on the stack. Working with References

var homeOwner = john
john.firstName = "John"
john.firstName      // "John"
homeOwner.firstName // "John"

Sharing among class instances results in a new way of thinking when passing things around. For instance, if the john object changes, then anything holding a reference to john will automatically see the update. If you were using a structure, you would have to update each copy individually, or it would still have the old value of “Johnny”. Identity Operators Because classes are reference types, it’s possible for multiple constants and variables to refer to the same single instance of a class behind the scenes. In Swift, the === operator lets you check if the identity of one object is equal to the identity of another:

john === homeOwner // true

let newInstance = Person(firstName: "Johnny", lastName: "Appleseed")
newInstance === john // false

30. What is MVC? MVC stands for Model View Controller. Models represent application data; views draw things on the screen; controllers manage data flow between model and view. Model and view never communicate with each other directly and rely on a controller to coordinate the communication. 31. What is @State? If you assign @State to a property, SwiftUI will monitor this property and, if it is mutated or changed, will invalidate the current layout and reload. No need to invoke a refresh call (or a reloadData(), as you might have previously seen in CollectionViews and TableViews). 32. What are Modifiers? These are a way of rendering custom interactions and decoration. font(), background(), and clipShape() are some examples. 33. What is Nesting Syntax? A simple example of that is nesting a list inside a navigation view. 34. What is Grouping in SwiftUI? Suppose you have written the following code:

VStack {
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
    Text("Line")
}

That works just fine, but if you try adding an eleventh piece of text, you’ll get an error like this one: ambiguous reference to member 'buildBlock()' This is because SwiftUI’s view building system has various code designed to let us add 1, 2, 3, 4, 5, 6, 7, 8, 9, or 10 views, but nothing for 11 and beyond, so that doesn’t work.
But we can do this:

var body: some View {
    VStack {
        Group {
            Text("Line")
            Text("Line")
            Text("Line")
            Text("Line")
            Text("Line")
            Text("Line")
        }
        Group {
            Text("Line")
            Text("Line")
            Text("Line")
            Text("Line")
            Text("Line")
        }
    }
}

That creates exactly the same result, except now we can go beyond the 10 view limit because the VStack contains only two views – two groups. 35. What is Combine? Combine is Swift’s own version of Reactive Streams, and it enables objects to be monitored (observed) and data to be passed through streams from core application logic back up to the UI layer. 36. Which hash function does Swift’s Dictionary use? Swift uses the SipHash hash function to handle many of the hash value calculations. 37. What is init() in Swift? Initialization is the process of preparing an instance of an enumeration, structure or class for use. 38. What are the control transfer statements that are used in iOS Swift? return break continue fallthrough 39. What is a delegate in Swift? Delegation is a design pattern which is used to pass data or communication between structs or classes. A delegate allows sending a message from one object to another object when a specific event happens, and is used for handling table view and collection view events. 40. What is Optional chaining? Optional chaining is a useful process which we can use in combination with optionals to call methods, properties, and subscripts on optional values that may or may not be nil. In this process, we may try to retrieve a value from a chain of optional values.
https://medium.com/datadriveninvestor/top-40-ios-swift-questions-solutions-d038c7f20a48
['Mohd Yasir']
2020-12-27 16:04:56.324000+00:00
['Development', 'Swift', 'Programming', 'iOS', 'Interview Questions']
Unearthed Crypto Gem Poised for Resurgence
Top of The Charts One of RedFOX Labs’ standout attributes is its mission to make real-world products. As a venture builder, it’s in the project’s DNA to replicate revenue-generating businesses. And the project’s gaming studio, RFOX Games, just recently hit the start button. The studio’s first foray into video games — Keys to Other Games — has become a blockbuster hit in its own right. The funny thing about that is there’s not even a playable game yet. For now, it’s all about buying, selling, and trading NFTs in anticipation of what’s ahead. But this month aims to change all that. Per the KOGs Roadmap, December is bringing the game’s beta version, a staking program, plus the dynasty’s 2nd Edition NFTs. After launching in August, the game’s collectible tokens quickly shot to the #1 position on the WAX Blockchain’s most prolific secondary marketplace, AtomicHub. AtomicHub sales figures as of November 21, 2020 At the time of writing, KOGs sales are nearly 2x those of its nearest competitor. And it’s easy to see why collectors are going ape over the 1st Edition. For starters, there’s a game right around the corner — a digital remake of controversial schoolyard classic, POGs. While most other NFTs are only for staring at, KOGs — geographic laws allowing — let players challenge foes in winner-take-all online matches. There’s gambling. And tournaments. Best of all, as the name screams, KOGs are tokens that unlock in-game items like skins, plus grant access to special levels and competitions. Not to mention, these NFTs are gorgeous. Feast your eyes on this ultra-rare stunner: KOGs — Keys to Other Games — NFTs trade on AtomicHub Once the game launches and a new audience realizes there’s a 1st Edition to collect, we may very well see a serious spike in demand for 1st Ed. KOGs. NFTs perfectly encapsulate the main benefits of running video games on top of blockchain networks. Blockchain gaming items are immutable, easily authenticated, and owner-transferrable across supported titles. Coupled with the ability to stonewall cheating fraudsters, blockchain gaming is the future mainstream gaming doesn’t yet know it needs. No, decentralized gaming isn’t as polished and shiny as centralized gaming on a console or PC. But self-ownership of in-game items is, to wear out the pun, a game-changer. But RedFOX certainly isn’t a one-trick pony. Blockchain gaming is but one piece of an oversized puzzle …
https://medium.com/datadriveninvestor/unearthed-crypto-gem-poised-for-resurgence-6e27924048b3
[]
2020-12-27 04:42:06.807000+00:00
['Venture Builder', 'Cryptocurrency', 'Blockchain', 'Southeast Asia', 'Startup']
Reviewing A/B Testing Course by Google on Udacity
Reviewing A/B Testing Course by Google on Udacity Read to find out how A/B tests are performed at Google. A/B tests are online experiments which are used to test potential improvements to a website or mobile application. This experiment requires two groups: a control group and an experiment group. Users in the control group are shown the existing website, whereas users in the experiment group are shown the changed version of the website. The results are then compared and the data is analyzed to see if the change is worth launching. The A/B testing course by Google in association with Udacity explains the various metrics that need to be considered for analysis, how to leverage the power of statistics to evaluate the results, and finally whether the change should be launched or not. The course does not delve deeply into the statistical concepts; instead, it explains the business application of these tests. A/B tests are used extensively by various companies. For example, Amazon used them to test its user recommendation feature, and Google uses them so extensively that it once used them to test 41 different shades of blue in the UI! Although the term is recent, the method has been in practice for a very long time. It has been used by farmers to see which method yields the best crop, and by doctors in clinical trials to check the effectiveness of a drug. Credit: Unsplash Policy and Ethics: Before running A/B tests one must consider a few things. Risk: What risk is the participant undertaking while participating in the test? Minimal risk is defined as the probability and magnitude of harm that a participant would encounter in normal daily life. If the risk exceeds this, then the participant must be informed about it. Benefit: It is important to be able to state what the benefit would be from completing the study. Alternatives: What other alternative services a user might have, and what the switching costs might be, in terms of time, money, information, etc. Data sensitivity: It refers to the data being gathered, whether it would reveal the identity of the person, the security measures taken to safeguard the data, and the implications it might have on a person if the data becomes public. Choosing and Characterizing Metrics: The metrics can be divided into two categories. Invariant metrics are the ones that should stay the same across both groups. For instance, the number of people in both groups, or the distribution based on demographics. Evaluation metrics are the ones that you will use to measure the effectiveness of your experiment. For example, if you are testing for a change in the number of users who click on the “start button”, click-through probability (number of unique visitors who clicked / total number of unique visitors) could be your evaluation metric. A sanity check is performed to verify that the invariant metrics you have chosen are correct. A/A tests can be performed to check this. Sensitivity and robustness must also be considered. The metric should neither be too sensitive nor too robust. The mean can be too sensitive to outliers, whereas a median may be too robust to capture the change. The 90th or 99th percentiles are considered good metrics to notice the change. To find a good balance between sensitivity and robustness, one can perform A/A tests, use data from previous experiments, or perform a retrospective analysis of logs. Designing an experiment: Unit of diversion is used to define an individual person in the experiment.
User id: If a person has logged in to the account, we can use the user id as the unit of diversion to track the activities. Cookie: It is a small piece of data sent from a website and stored on the user’s computer by the user’s web browser while the user is browsing. Cookies are browser specific and can also be cleared by the user. Event based diversion: Any action such as reloading a page can be considered as an event. These are mainly used for non-user-visible changes. For example, a latency change. Population: You must also consider the population on which you will run the experiment, whether it would be the entire population, people from a specific region, or people from a specific sector. Size: The size of the population is also an important factor. It is influenced by parameters like the significance level (alpha), the sensitivity (1 - beta), and whether the unit of analysis is equal to the unit of diversion. Duration and Exposure: Duration refers to the time period for which you want to run the experiment. Also, it is very important to determine when to run the experiment. For example, on weekends or weekdays, holiday season or non-holiday season. Generally, it is good to balance between the two so that you understand the trend and seasonality effects better. Exposure is basically the fraction of the population to which you want to expose the new feature. Analyzing results: Sanity check: After you have the results of the experiment with you, the first thing you have to do is check whether your invariant metrics are the same across both the groups, like whether the population distribution was done correctly. Single metric: To check if a single invariant metric is within the acceptable range, you will have to calculate the pooled standard error, multiply it by the Z-score (1.96 for a 95% CI), and find the margin of error. Then find the lower and upper bounds and check if the difference in values for a particular metric is within the range. You should also perform tests on evaluation metrics to check if the result is both statistically and practically significant. For a result to be statistically significant, the range of the difference in values must not contain 0, and for a result to be practically significant, the range should not contain the practical significance boundary. Also, to double check the result, a sign test may be performed. If the two tests do not agree with each other, we might have to look deeper into the data, as it might be due to Simpson’s paradox (individual subgroups are showing stable results but their aggregation is causing the problem). Multiple metrics: If you are considering multiple metrics at a time, it is possible that you might be seeing one of the metrics as significant because of a false positive. In order to deal with this, you may use bootstrapping or the Bonferroni correction. After the results have been analyzed, you have to answer a few questions: Is the change significant? Do I understand the effect of this change? Is the change worth launching after reviewing other business factors? Based on this, you may either launch the change, perform some more tests, or not launch the change.
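To make the single-metric check above concrete, here is a small worked example in Python; the counts are made-up illustration numbers rather than figures from the course, and the 0.02 practical-significance boundary is likewise an assumed value a business might choose.

import math

# Hypothetical counts for the control and experiment groups.
clicks_control, visitors_control = 974, 10072
clicks_experiment, visitors_experiment = 1242, 9886

p_control = clicks_control / visitors_control
p_experiment = clicks_experiment / visitors_experiment
difference = p_experiment - p_control

# Pooled probability and pooled standard error across both groups.
p_pooled = (clicks_control + clicks_experiment) / (visitors_control + visitors_experiment)
se_pooled = math.sqrt(p_pooled * (1 - p_pooled) * (1 / visitors_control + 1 / visitors_experiment))

# 95% confidence interval around the observed difference (Z = 1.96).
margin_of_error = 1.96 * se_pooled
lower, upper = difference - margin_of_error, difference + margin_of_error

# Statistical significance: the interval should not contain 0.
statistically_significant = not (lower <= 0 <= upper)

# Practical significance: the interval should clear the boundary d_min
# chosen up front (2 percentage points here, for a positive expected change).
d_min = 0.02
practically_significant = lower > d_min

print(f"difference = {difference:.4f}, CI = [{lower:.4f}, {upper:.4f}]")
print("statistically significant:", statistically_significant)
print("practically significant:", practically_significant)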
You may need to perform A/A tests in pre- and post-periods for sanity checking and to see the effects of changes on users. Conclusion: The A/B testing course by Google on Udacity is a must for anyone who wants to understand the process of A/B testing. The project at the end can further help you in understanding the concepts. This is just an overview of the course.
https://towardsdatascience.com/reviewing-a-b-testing-course-by-google-on-udacity-2652b2235330
['Suyash Maheshwari']
2020-05-12 12:16:22.389000+00:00
['Technology', 'Google', 'Data Science', 'Business', 'Testing']
How Do We Fight Mental Health Stigma?
Education on mental health can be through films. Mental health problems are very common. In 2014, about 1 in 5 American adults experienced a mental health issue, 1 in 10 young people experienced a period of major depression, and 1 in 25 Americans lived with a serious mental illness, such as schizophrenia, bipolar disorder, or major depression. Depression is one of the main global health challenges. The World Health Organization forecast that in 2020 depression will be the main cause of disability worldwide. It is incredibly common, but we rarely talk about it because of the misunderstanding and stigma surrounding mental health. This stigma can lead to rejection, abuse, and exclusion of people from health care or support. This week, Demi Lovato announced that her upcoming Tell Me You Love Me tour will include mental health counseling before the shows. Lovato has been dedicated to reducing the stigma surrounding mental illness in part by being open and candid about how therapy and rehab have helped her. Mental health matters to everyone. We can all take the pledge to learn more about mental illness, to see a person for who they are and act on mental health issues by spreading the word and raising awareness. How can we raise awareness to reduce the stigma surrounding mental health issues and allow people around the world to feel more confident in seeking help for mental illness? It is not as difficult as you imagine. There are different ways to impact mental health awareness. Here are some ideas: TALK ABOUT MENTAL HEALTH. Ask people you know how they are doing. Be ready truly listen and not judge. Sometimes, people just need someone to listen to them. SHARE YOUR STORY. Tell friends and family about what you are going through. Your story might be the push that encourages others to ask for help. WRITE A BLOG ABOUT MENTAL HEALTH. Include the most up-to-date statistics and easy to understand facts on mental health and include information on where people can go for assistance. TAKE CARE OF YOUR OWN MENTAL HEALTH. Volunteer in your community. You will improve your mental health as you work to help other people. LEARN MORE ABOUT MENTAL HEALTH. Educate yourself about mental health, mental illness and suicide. Learn about the signs and symptoms and where to obtain help in your area. Education on mental health can be through films. A group of medical students is leading the efforts to raise awareness and reduce the stigma surrounding mental illness. In February 2018, the Mu Sigma Phi Fraternity, the first medical fraternity in the Philippines and Asia, is organizing the 3rd Quisumbing-Escandor Film Festival (QEFF) for Health. This year’s QEFF will focus on a subject that has long remained in the shadows of our consciousness — mental health. QEFF 3 will feature documentaries and short films that highlight the humanity of mental illness through its social, cultural, and personal dimensions. QEFF 3 will use these films to highlight and articulate the issues on a larger scale, to enlighten people, initiate collective action, and eventually, reduce the stigma. This year, QEFF3 is proud to have 73 inspiring and thought-provoking entries from all over the Philippines. It will also feature an exhibit of entries from Imahe, a photo contest showcasing different perspectives on mental health. 
To further catapult mental health awareness into the communities, the film festival will be followed by the QEFF Film Caravan wherein the films will be screened in schools, organizations, and barangays across the country and potentially to other countries. More information on the Film Festival here. Everyone is invited; admission is FREE.
https://medium.com/thrive-global/how-do-we-fight-mental-health-stigma-194d83ff952c
['Dr. Melvin Sanicas']
2018-01-31 21:44:41.654000+00:00
['Mental Illness', 'Mental Health', 'Weekly Prompts', 'Depression', 'Film Festivals']
Growth, Shmowth, I Want More Free Time and More Sleep!
One thing about COVID, boy can you cram all those ZOOM MEETINGS back to back to back. With no time in between to do the assignments I take on IN the meetings. Somehow I have it in my mind that 2021 being a new year right after a bit of a holiday break-hope! Hope!-would be a great time to make some course corrections. To really focus on the big priorities of my life. So first of all, what are those big priorities in my life? Cause If I am not clear about them, the cream won’t rise to the top. I once wrote an article: If I’m spread so thin, how come I’m still fat? All kidding aside, I’m just a girl who can’t seem to say no, who ends up taking more and more on — does that make me a mor-on? I like that. I might spell it more-on…but it helps me remember that over-committing is not a smart thing for me to be doing. If my new expression works for you, you’re welcome to borrow it! Moronic or not, I can say it’s a recipe for burnout, anxiety, and depression. It makes me want to throw in the towel and not do anything. Yet, that is not the answer. What is? What I am learning is I need to be crystal clear on my priorities and to know why they are my priorities. MY personal, from-my-heart priorities. Not other people’s priorities for me. Or that which I tell myself I can’t stop doing, because either someone will be mad, or it won’t get done. I wonder if I have an ego trip going that gives me kudos for being the Jill of many trades or the juggler extraordinaire keeping multiple balls in the air. Yo! Have you heard of gravity? Sooner or later, those balls crash to the ground. And me with them? That’s what it feels like sometimes. What’s my priority, and why? The two biggest areas of activity in my life are “all things writing” and “all things church.” I word them this way to remind myself that within each, there are many sub-components. The question is, what is the cream in each that needs to rise to the top? In my writing life, the cream is my novel. While I love all my writing projects, if I could only do one, this would be it. Set in a heartland church with quirky characters, it combines humor, spirituality, and social satire. I’m envisioning it as a series called Too Sexy for my Church. I’m currently editing it with support from a mentor and workshop. Progress is slow but steady. In one 6 month stint, I edited 5 chapters, which seems like a lot. But with 35 chapters, at that rate, it’ll be 3 years. That’s too slow. The editing needs to be done so I can get on with querying potential agents. So this is urgent. In addition, I blog on Medium, write and perform poetry, compile post series into potential ebooks, and do my best to maintain a half-way decent presence on social media related to my writing. My to-do list is much longer than my done-do list. I easily get overwhelmed, unable to figure out how to use an available 30-minute slot except for my Medium writing. This piece is being written in 15 and 30-minute windows of time…Fortunately, I can and do do that! I don’t seem to know how best to use my time and get the other things done. I seem to keep them inching along, but it’s harder to reach completion and move on. My church life is at least this complex. My heart work there is helping craft worship and praying with folks as a Prayer Chaplain. This feels like a sustainable as well as significant contribution. I also serve on our Board of Directors. This year as the Board President. What a great experience! The current Board is a dream team; we work together well. 
I’ve grown as a leader, learning so much! People appreciate how I hold the space and our work — routine or complex — as holy. When you add all three areas together, it’s a lot. Is it “too much? If that was my primary avocation, maybe not. When you add the writing in, maybe so. There’s a lot of cross-pollination because much of my writing is spiritual, and much of my church work is written. But there are only so many hours in a day. I need more of them to be spent sleeping. I’ve been treating church like it’s my day job even though I am not getting paid. And writing as my hobby, even though I’m making some money, looking to make more. What if I switched the two? Made writing the day job, and church the volunteer hobby. What would that look like? How would it be different? What would it take to do? This brings me to the big decision, which is the prompt after all. 2021 needs to be more focused, better rested, less stressful, and more productive. It seems that doing fewer things with more care and attention is the way to go — quality over quantity. For that to happen, something has to give. Or at least shift. Today I tried on the switch idea, just for practice. That meant putting boundaries on my writing time and treating it as sacrosanct. That meant telling folks I would be busy till 5 pm, but available for quick questions between 2 and 3 pm. They worked with that. For my part, I had to focus on my writing and not trip about church issues during that time. That’s the growing edge for me. In the past, I would handle the church business first because it nagged at me, distracting me while writing. I needed the peace of mind that addressing it first and clearing the deck brings. Going forward, can I focus on writing while church issues loom? And can I lighten my overall load? In a previous article, I shared how my watchword is prune. (LINK) What can I prune from my list of meaningful commitments? Lightening my load allows me to focus on priorities and feel a sense of accomplishment. And hopefully, enjoy more restful sleep. Insomnia is the deal breaker that impacts everything I do. So let’s make adequate sleep the number one priority for 2021! Any suggestions?
https://medium.com/know-thyself-heal-thyself/growth-shmowth-i-want-more-free-time-and-more-sleep-403380787ac5
['Marilyn Flower']
2020-12-23 11:36:20.310000+00:00
['Self', 'Mental Health', 'Life Lessons', 'Time Management', 'Self Improvement']
Exposing the Model Minority Myth: Its Implications on the Self and Others
Exposing the Model Minority Myth: Its Implications on the Self and Others Madison Estrella Follow Jul 31 · 6 min read Asian Americans are often praised for overcoming adversity and for working hard to be successful in achieving the “American Dream”. They are seen as intelligent, law-abiding, diligent, and persevering. But how is it that every single individual that identifies as Asian American can have these traits? These stereotypes encapsulate the Model Minority Myth, the belief that Asian Americans are more successful than other minority groups. This generalization attempts to paint a favorable picture of immigrant success, yet creates a singular narrative that ignores the rich diversity of the numerous Asian American cultures. As an individual who identifies as Asian American, this phenomenon is something that I have personally encountered my entire life without truly knowing the extent of its detrimental nature. Having experienced these stereotypes, it has been difficult to grapple between the pre-existing notions that society holds versus the values and identities that I hold, especially when they are non-congruent. Examples of the Model Minority Myth can be seen in every aspect of life, and these stereotypes are especially evident in the classroom. In many instances, being Asian is equated with automatic intelligence and success. From my personal experience, the Black student that got 100% on his quiz was called “Asian,” and even the Hispanic girl that got an A on her assignment was deemed “Asian,” too. However, when I, a Filipina, got an A-, I was told: “How are you even Asian?” Every time Asian Americans succeed or get a good grade, all of our hard work gets washed away by one simple phrase: “It’s because you’re Asian”. Such a pithy response to Asian success completely diminishes our work and effort that led us to our achievements. It feels like a slap in the face to us Asians, as well as our immigrant ancestors who went through numerous hardships to get us where we are today. Being expected to always excel, in order to be worthy of our race, puts lots of pressure on Asian Americans to try to uphold the expectations of success. Some attempt to uphold society’s image of success by working themselves tirelessly and excessively, while others are left to question their identity and worth when they fall short of stereotypical standards. The unrealistic expectations of societal excellence imposed on Asian Americans is oftentimes burdensome and takes a toll on their emotional well-being. With the Model Minority Myth associating the Asian ethnicity with success, people somehow feel they have the right to revoke our identity, simply because we didn’t conform to their absurd expectations of perfection. Nonetheless, we all worked hard for the grades we got. It’s just that for Asians, society invalidates our efforts by painting success as an innate quality of our ethnicity. Because of the Model Minority Myth, our imperfections as human beings become inexcusable simply because we are Asian– but newsflash: Asians are humans too. We feel societal pressures and burdens like the boulder on Sisyphus’ back– we try to climb up the mountains of society, seeking recognition and acceptance, only to fall short of the unattainable Model Minority status that mocks us. We force ourselves to fit into the American-prescribed characteristics of Asian-Americans– smart, hardworking, successful– and their prestigious occupations, even if it may not describe us or perfectly fit our interests. 
And most of all, we feel frustrated with ourselves for not being “smart enough” to be Asian. Despite living in our own bodies and experiencing our own hardships, we may not feel like ourselves because of society’s stereotypes. “Some [Asians] are super achievers, most are average citizens, and a few are criminals. They are only human. No more and no less.” — Phillip Chie Societal pressures from the Model Minority Myth, cultural pressures from the home, and the overall struggle to find balance between our Asian and American identities frequently leads to mental health concerns. However, the strong stigma surrounding mental health in Asian American communities prevents them from seeking the help they need. In fact, the cultural stigma around mental illness exists in all communities of color, preventing people from vocalizing their needs and from seeking services. Asian Americans, in particular, are 3 times less likely to seek out mental health services than any other American. This can largely be attributed to the fact that in immigrant and Asian households, mental health largely goes unacknowledged and the Model Minority Myth promotes to society a false image of our invincibility and strength. What many people don’t realize is that the Model Minority Myth is a double edged sword. It not only erases the hard work of Asian-Americans, but it also suppresses other people of color– particularly the Black community. The term “Model Minority” suggests that Asian Americans are exemplary people of color. This in turn implies a hierarchy of race that turns people of color against one another despite us all being victims of white supremacy. By stereotyping Asians as the “successful minority,” the inequalities and racism experienced by other minority groups are invalidated. In essence, the Model Minority Myth suggests to other people of color, “If Asians can do it, why can’t you?”; however, this mentality ignores the numerous systemic barriers that barr the success of the Black community and other people of color. To dismantle this myth, we must refrain from complying with the racist narrative that is ingrained into society via the Model Minority Myth. By acknowledging its harmful implications, we can recognize how these generalizations control our lives, actions, and mental health, as well as how it turns us against other communities of color. In acknowledging the roots of this stereotype, we can then challenge its oppressive influence by actively deciding to live by our own terms and being there for our neighbors. As Asian Americans, we need to be careful about how we internalize racial stereotypes such as the Model Minority Myth. We, like other people of color, need to remind ourselves that we are more than what society perceives us to be. We have the right to be proud and satisfied with the hard work we have put in and the resiliency we have shown in the face of hardship. “I don’t think people understand the model-minority stereotype is negative. You are boxed in. You have to untangle that to find your own path.” — Eddie Huang Most of all, we don’t need society’s approval to be happy, and it is important to make sure you reach out for help when you begin to feel overwhelmed. Chat with friends or family about your mental state and don’t be afraid to reach out to a professional. When it comes to work and school, make sure you recognize your limits and acknowledge what your efforts aim to accomplish– Is it for your personal gain? Or to please others and live up to society’s expectations? 
This is your life, so don’t let other peoples’ expectations dictate how you live it. So, in an unfair society that sees race before the soul, let’s be there for one another. Let’s support and acknowledge the successes and efforts of people both inside and outside our community, and offer our sympathy and compassion to those around us.
https://medium.com/joincurio/exposing-the-model-minority-myth-its-implications-on-the-self-and-others-e5ca28d57a94
['Madison Estrella']
2020-07-31 16:31:00.896000+00:00
['Model Minority', 'Mental Health', 'Asian', 'Asian American', 'Model Minority Myth']
Why did the impact of Muslims on the world decline?
So, here’s something that was on my mind recently. Islam is arguably the most holistic religion in the world. It provides insights and guidance for Muslims across a broad range of topics relating to all areas of life. Whether it is our income, our food or how we treat people. And the Qu’ran, is central to all of that. This is something we all know, and are pretty proud of, right? Another thing most Muslims know and are proud of is the fact Islam was the catalyst for some of the biggest innovations in the modern world in our Golden Age, which spanned from the 9th Century to 16th Century, changing the landscape of the world as we know it. Some of these innovations included algebra, optics, architecture, geometry, Islamic arts, calligraphy, university, knowledge translation, astronomy, women’s rights, cheques and money, even that staple we all rely on for work, coffee! ☕ (Think about that next time you’re ordering your caramel macchiato with whipped cream!) So yeah, all this is awesome and helps us feel proud of our heritage and history, etc etc. Brilliant. 👍🏼 But this is exactly what was on my mind. What exactly happened to us? Why is that no longer the case? When did the innovation and excellence stop — and more importantly, how do we bring it back? In the last 500 years the Islamic world has unfortunately seen a stagnation (to put it mildly) in terms of socio-economic and scientific progress. Once upon a time, the richest man in the history of the world was a Muslim, but not today. Once upon a time, the greatest healer in the world was Muslim. Not today. Once upon a time, the greatest general in the world was Muslim. Not today. The greatest explorer… You get the idea. There’s 1.6 billion of us (give or take a few), so we’re not exactly short of numbers… Thinking more deeply about it, I don’t think it’s a coincidence, that the time they thrived was when the attitudes to learning were very, very different to the way they are today. Then, they considered all forms of seeking knowledge a form of worshipping God — and it led them to major discoveries as they sought to excel and go as far as possible due to their love of Allah. They saw learning anything beneficial as worship — period. They were inquisitive, driven and were motivated by following the Qur’anic instruction to ponder, reflect and use their intellect in a way that would directly benefit mankind. Today, we have separated science from the spiritual. We treat secular knowledge differently from sacred knowledge. Faith and reason are seen as mutually exclusive. Our love of learning is not what it once was — and as a result, despite our large numbers we are nowhere near those glory days. We honestly believe this is in direct conflict with our purpose and Allah’s direction set out for us in the Qur’an which is to become someone who seeks knowledge from the cradle to the grave. Hence KNOW exists to revive that yearning for knowledge that has always meant to be our purpose — so that we can become the best people we can be, and to serve Allah and humanity at large as we did in the Golden Age — not just ourselves or desires. With this context in mind, we are pleased to introduce our attempt to address this directly — The Second Golden Age. So, what is the Second Golden Age? The #SecondGoldenAge project by KNOW is a series of videos, articles, courses and documentaries not only reconnecting us with what once made us great, but also charting the way forward to put in place the road map for a Second Golden Age, insha’allah. 
We will do this initially through the lens of our Science category, looking at the huge (but little-known) impact the Qur'an has had on modern-day science — but also at how Islam was directly responsible for changing the world across so many disciplines, including medicine, engineering and even trade. Through this series of videos, articles and courses, we intend to revive the spirit of the first Golden Age by not only producing and showcasing content that reminds us of that great legacy, but also showing how to infuse it into today to bring about a version 2.0, insha'allah! 😊
https://medium.com/kn-ow/why-did-the-impact-of-muslims-on-the-world-decline-7e343f821cd4
['Faisal Amjad']
2020-06-23 13:08:06.373000+00:00
['Golden Age Of Islam', 'Islam', 'Innovation', 'Greatness', 'Science']
Java Synchronization- part 1
Before going to synchronization, I'll explain multiple threads using some simple code. image 1 The first class is going to be "Countdown", while the class "ThreadColor", with the color trick, will look like this. public class ThreadColor { public static final String ANSI_RED = "\u001B[31m"; public static final String ANSI_GREEN = "\u001B[32m"; public static final String ANSI_YELLOW = "\u001b[33m"; public static final String ANSI_BLUE = "\u001B[34m"; public static final String ANSI_PURPLE = "\u001B[35m"; public static final String ANSI_CYAN = "\u001B[36m"; public static final String ANSI_WHITE = "\u001b[37m"; } image 2 Here I have created the second class, which extends Thread. image 3 Now head to the main method: since we have created the two classes, we create and start two threads there. From the switch statement (image 1) we get one thread named "Thread 1", which will print its text in cyan, and a second thread, which will print its text in purple. Now let's see what's happening. image 4 Here you can see thread 1 in CYAN while thread 2 is in PURPLE. But we cannot predict what the result will be, i.e. the order of the two colors. You can see that when we run the code a few times, it gives us completely different logs. Now we are going to add the instance variable "private int i;", which changes the local variable "i" into an instance variable. Let's look at the new result. image 5 Notice that the result is different this time. Instead of each thread counting down from 10, the numbers seem to be sort of duplicated. Why? As you can see, number 10 is duplicated. Not only 10; a few other numbers have been duplicated too. The only thing we did was change the local variable to an instance variable, as follows: class Countdown { private int i; Heap: this is the memory of the application that is shared by all the threads, and each thread also has a thread stack. That is memory which is unique to each thread. Simply speaking, a thread can't access another thread's stack, but all threads can access the heap. A local variable is stored in the thread stack, so each thread has its own copy of it. Instance variables, on the other hand, are stored in the heap. So when multiple threads are working with the same object, they share that object's state. In that case, if one thread changes the value, another thread will see the new value that was changed by the first thread. So, when "i" was a local variable, the threads each had their own version of "i". Once we changed "i" into an instance variable, the two threads had to access this shared resource, which is stored in the heap. That's why each thread seems to skip some numbers. The for statement decrements i and checks the condition i > 0; the idea behind the "for statement" is that it consists of several steps, such as decrementing, checking and so on, and the thread can be suspended in between those steps. It can be suspended after decrementing "i", before checking the condition, or immediately after executing all the code and then the printing. Decrementing "i", checking the condition and printing out the value: at any of these three steps the current thread can be suspended. As you can see, in the first attempt both threads considered the value of "i" to be 10, so thread 1 prints 9 but thread 2 prints 8. While thread 1 was executing the for statement, thread 2 must have been suspended partway through, without surpassing thread 1.
Therefore it takes the value of "i" as 9, executes the for block, and prints 8. I think this explanation gives you an idea of what is going on. When threads access the same resource, we can run into this situation, which is called "thread interference"; it is also referred to as a "race condition". Always keep in mind that this is a serious issue when it comes to writing to or updating the same resource. We can count down without skipping numbers, i.e. avoid the interference, by not passing the same Countdown object to both threads and instead giving each thread its own object. public class Main { public static void main(String[] args) { Countdown countdown1 = new Countdown(); Countdown countdown2 = new Countdown(); CountdownThread t1 = new CountdownThread(countdown1); t1.setName("Thread 1"); CountdownThread t2 = new CountdownThread(countdown2); t2.setName("Thread 2"); t1.start(); t2.start(); } } Take a look at the above code, where there are two new objects, one per thread, so the threads no longer share any state on the heap. Check the results: you don't see any interference! Each thread counts down from 10 to 1 successfully. image 6 The real question is, will this be applicable in the real world? Will this work for a bank account, where someone deposits money into your account while you are withdrawing some money from the ATM? In a case like that it appears we have to use the same object in order to maintain the integrity of the data, because that is the only way to know the exact bank balance after executing several threads (transactions), right? There can be several threads waiting to change the bank balance simultaneously. So we need to allow multiple threads to change it while preventing the race condition.
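The screenshots referenced above (images 1 through 5) are not reproduced in this text, so here is a minimal sketch, assuming the Countdown, CountdownThread and Main classes in those images looked roughly the way the description says: an instance variable i, a switch on the thread name to pick a colour from ThreadColor, and a for loop counting down from 10. The class and method names below are my guesses, not necessarily the exact ones used in the original screenshots.

class Countdown {
    private int i; // instance variable: lives on the heap, so it is shared when the object is shared

    public void doCountdown() {
        String color;
        switch (Thread.currentThread().getName()) {
            case "Thread 1":
                color = ThreadColor.ANSI_CYAN;
                break;
            case "Thread 2":
                color = ThreadColor.ANSI_PURPLE;
                break;
            default:
                color = ThreadColor.ANSI_GREEN;
        }
        // decrement, check, print: the thread can be suspended between any of these steps
        for (i = 10; i > 0; i--) {
            System.out.println(color + Thread.currentThread().getName() + ": i = " + i);
        }
    }
}

class CountdownThread extends Thread {
    private final Countdown threadCountdown;

    CountdownThread(Countdown countdown) {
        this.threadCountdown = countdown;
    }

    @Override
    public void run() {
        threadCountdown.doCountdown();
    }
}

public class Main {
    public static void main(String[] args) {
        Countdown countdown = new Countdown(); // ONE shared object: both threads share the instance variable i
        CountdownThread t1 = new CountdownThread(countdown);
        t1.setName("Thread 1");
        CountdownThread t2 = new CountdownThread(countdown);
        t2.setName("Thread 2");
        t1.start();
        t2.start();
    }
}

With this shared-object version you should see the duplicated and skipped numbers described above; with the two-object version of Main shown earlier, each thread prints a clean 10-to-1 countdown. One way to keep a single shared object and still avoid the race condition, which I assume is where part 2 is heading, is to declare the method as public synchronized void doCountdown(), so that only one thread at a time can execute it on a given object.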
https://medium.com/swlh/java-synchronization-part-1-abcabac56cf7
['Sachintha Hewawasam']
2020-06-29 19:28:50.824000+00:00
['First Post', 'Multithreading', 'Java', 'Programming', 'Synchronization']
Our World Needs a Reset Button or Maybe a Rewrite #7
Our World Needs a Reset Button or Maybe a Rewrite #7
A poem that places a face on Coronavirus
My newest short story's main character is a mutated jellyfish, sporting fangs
He slithers on land
Sits beside me when I write
Follows me on long hikes — I am of the supporting cast
This siphonophore plays the villain
Using his tentacles to
Deposit mold spores
Where I walk and on my food
Turning sushi, prime rib and Maryland blue crab into
Unhealthy delicacies I crave but must resist
Microscopic mold buds
Infest humans, imitating a cold
Cause weakness and death
If I allow them to be a part of the story
However, I am the author who
Cannot allow the protagonist to
Control the final outcome
He lives between the lines
My hands freeze unable to
Alter the climax
The resolution falling
To an action that requires revision
With a slash of my pen
Verbs decline, adjective multiply
His influence diminishes
His barbaric yelp silenced
Granting the reader hope
Redemption and grace
To be realized in the sequel of life's long arduous journey
A preventable tragedy
— Sarah, female
This is one poem of many that were composed after interviews with people affected by the pandemic.
https://medium.com/live-your-life-on-purpose/our-world-needs-a-reset-button-or-maybe-a-rewrite-866be96c5048
['Brenda Mahler']
2020-12-27 21:08:34.286000+00:00
['Covid-19', 'Faces Of Covid', 'Reflections', 'Poetry', 'Coronavirus']
The Reason You Should Stop Chasing Your Dreams
The outside world is a projection of you. It is like a mirror that shows you what you are emanating with your thoughts, feelings and actions. But if you mirror yourself, then who are you? Who is emanating those waves that are reflected in the mirror? Have you ever watched The Matrix? I hope so. If not, I'll explain an important part of the movie that gives you the answer. The protagonist, Neo, is in a white room with the Architect. He explains to Neo that he sees himself that way because he was used to it. Remember also that Neo could immediately receive all the useful information from a computer and learn anything he needed. Life is a game, someone says. But most important is that we are the main protagonists – we can always choose to change our identity, and to learn anything we want. If we stop emanating certain forms into the mirror, then we will see a different result. So, if we can emanate and change, this means that we are bigger than our identity, our appearance, our knowledge and belief system. Someone says we are pure consciousness, a part of divinity, love. I don't want to give a description and so limit it. To me, we are infinite. The infinite has no limits and no limitation. It is in everyone and everyone is in it. An Italian Romantic poet, Giacomo Leopardi, wrote a poem that has touched me since the first moment I read it. "L'infinito" is its name: But sitting and gazing, endless spaces beyond it, and inhuman silences, and the deepest quiet I fake myself in my thoughts; where almost my heart scares. And as the wind I hear rustling through these trees, I, that infinite silence, to this voice keep comparing: and I feel the eternal, the dead seasons, the present, and living one, and the sound of her. So in this immensity drown my own thoughts: and sinking in this sea is sweet to me. By realising who you truly are, you will stop asking for what you want to have. You have always wanted to be.
https://medium.com/illumination/the-reason-you-should-stop-chasing-your-dreams-e6d608c57961
['Yeva Volkova']
2020-12-29 01:42:11.646000+00:00
['Personal Growth', 'Inspiration', 'Motivation', 'Dreams', 'Life']
Netflix vs Apple: What Happened? And How Google Play Could Capitalize
Netflix vs Apple: What Happened? And How Google Play Could Capitalize Netflix has been Apple's favorite for years and has made good money off of this relationship. However, since the beginning of this year, their relationship has taken a turn and now resembles a cold war. Netflix lost its place in the App Store's list of featured apps and its rank on the Top Grossing list as well. Apple didn't fare well either, losing millions of dollars in commissions. Who won the battle? Let's sort it out below. Background Apple has been spotlighting Netflix since the list of featured apps first appeared in iOS 11. As everyone knows, Apple editors are picky about the candidates they select to feature. But with the ability to receive a 30% commission from a service as popular as Netflix, why not make it even more popular? Case in point: In 2018, the Netflix app appeared on 99 days out of 365 in collections from different countries, for a total of 1961 times. By the end of 2018, Netflix became the most mentioned application in the App Store. (Source) According to the stats, being featured this way helped the on-demand service earn $1.5B in 2018, and Apple duly received a third of these earnings as commission. Everything changed in January of this year. Netflix decided to save on Apple's commission and began to steer new users to subscriptions paid on the web instead of in-app purchases. According to our calculations, this change in Netflix's strategy has caused Apple to lose at least $3.7–5.5M for Q1 2019. For all of 2019, these losses could reach $44–67M. A loss as significant as this could not go unnoticed. Netflix has quickly fallen into disfavor with the App Store editors. Netflix hasn't been mentioned in their featuring list since February 24th. No longer on the featuring list, Netflix has seen its revenues decrease. The number of mentions of Netflix in the iPhone featuring (depicted in the chart in light green) drops sharply in the autumn. Revenues (depicted in green) have fallen gradually since the beginning of 2019, as in-app purchases were redirected to subscription fees on the web. Then, on March 25th, Tim Cook announced the launch of Apple's own streaming service: Apple TV Plus, a direct competitor of Netflix. The new service is already supported by HBO, Showtime, and others, but notably not by Netflix. A month later, Netflix removed support for the popular AirPlay feature. They cited Apple's support for new devices (Samsung, LG, and Sony will start working with AirPlay in 2019) as the cause, explaining that they would not be able to provide the proper level of support. Despite the ban by the App Store editors, Netflix still ranks #1 in the Entertainment category. However, without regular mentions in Featuring, the application has lost its position in Top Grossing. Netflix revenues (depicted here in yellow) fell shortly after the official announcement of the new streaming service from Apple. (Source) Did Netflix benefit from this battle with Apple? In short, yes. Let's take a closer look. One of the indicators of a company's success is the value of its shares. In January 2019, the first month after canceling payments through the App Store, Netflix shares on the New York NASDAQ rose 22%, from $276 to $337. Does this mean investors saw greater value in Netflix? There are 2 important points here: 1) These values are less than the historical maximum in July 2018, when the price reached $418.
Therefore, this is not about continuing growth, but about the fluctuation of the stock price within a local interval. 2) In December, the fall of the stock price (which had continued from a peak in July) stopped. It's quite possible that this news indicated improvement for Netflix's investors. But, more likely, news about the company's Q4 financial results and Q1 2019 plans played a greater role. In light of these financial indicators, the news about payments outside the App Store was not perceived negatively. Most likely, this news contributed to the overall positive perception of the company's results and plans. Now on to Netflix's reports on Q1 2019 results: According to Netflix reporting, the company earned about $4.5B in Q1 2019, 22% more than in Q1 2018. This is less than the growth for Q4 2018, which was 27%. The growth rate fell by 5 percentage points for the quarter. Could this be at least partly caused by the conflict with Apple? Let's puzzle it out. At the beginning of 2018, quarterly growth was at 40% and then slowed down to 27% (that is, by about a third). It is more logical to attribute this slowdown to the saturation of the market and the exhaustion of easy growth opportunities. Fortunately for us, Netflix provides an opportunity to look deeper at the statistics on the number of paid subscribers in its reporting, including a breakdown of the US versus the rest of the world. And here is where the fun begins. In the local market in 2018, the number of Netflix paid subscribers increased by 10.7%, from 53 to 58.5 million; and in the 1st quarter of 2019 there was another 3% increase, to 60.2M, which in annual terms is 12%. Compared with 2018, the growth of the 1st quarter of 2019 was higher! And what about the average bill? Did Netflix lower prices in order to remain competitive without the support of Apple? But everything is under control: even by the most modest estimates, the average bill rose by at least 10 cents to $11.50 (Netflix has not yet disclosed the exact data, but the numbers are expected to be higher). Judging by the pricing tiers, this average bill means that the average paid subscriber in the USA is on the plan that allows watching video on 4 screens at the same time, which is suitable for the average family. With a population of approximately 330M and about 128M households, we can estimate that half of American households use Netflix. Of course, the other half still remains as an opportunity for growth, but, most likely, the easiest opportunities for growth have already been exhausted. Looking at not only the continuation of growth but also its acceleration in Q1 2019, we can see that Netflix's conflict with the App Store didn't have much impact on performance. In the international market in 2018, paid subscriptions increased by 40%, from ~58M to 81M, and in Q1 2019 to 88.6M, which is +10% for the quarter and +40% in annual terms! The average bill is also at the same level. Thus, in general, the impact of the fight with Apple wasn't much at all, and Netflix saved on commission fees. It is worth noting that Netflix itself is optimistic about the future. The revenue growth forecast for Q2 2019 is estimated at 26%, 4 percentage points more than growth in Q1 2019. If the forecast comes true, we can assume that for the service as a whole, growth will continue at just as fast a pace as before. The New Hope While Apple and Netflix tried to sort this out, the latter became a frequent guest in the Google Play collections: the number of references to the service has increased 10 times since January 2019.
However, this does not affect Netflix's revenue. The number of mentions of Netflix (shown here in purple) has increased dramatically since January 2019. But Apple, in the absence of Netflix, has found new favorites to feature. Since the beginning of 2019, VSCO and Fortnite have become the most mentioned apps. In fact, when looking at the number of days they have been mentioned, VSCO and Fortnite have been featured almost twice as often as #3 on the list: the Sleep Cycle application. (Source) Apple is also preparing to launch Apple TV Plus, and the application page has already been optimized in the App Store. Source: iTunes
https://medium.com/hackernoon/netflix-vs-apple-what-happened-and-how-google-play-could-capitalize-1ea94defbce6
['Anatoly Sharifulin']
2019-06-29 23:11:43.572000+00:00
['Hackernoon Top Story', 'Netflix', 'Apple', 'App Store', 'App Store Optimization']
How to create stronger layouts with the 8pt Grid System
Originally published at marcandrew.me on September 3rd, 2020. Layout Grids are everything in UI Design. Well. Ok. Don't discount Colour. Don't forget about Typography. But get your Layout Grids right and you're halfway there to creating much better-looking, better-functioning UIs. Consistent, scalable spacing helps you eliminate guesswork whilst designing and developing. It requires fewer design decisions. And it enables a much faster turnaround on projects. The 8pt Grid is one of the most commonly used Layout Grid Systems there is, and for very good reason. Let me tell you more about it in this article…
https://uxdesign.cc/how-to-create-stronger-layouts-with-the-8pt-grid-system-cb0a1665714d
['Marc Andrew']
2020-09-05 13:21:41.137000+00:00
['UX Design', 'UI', 'Design', 'Graphic Design', 'Web Development']
Write code without if-else
Write code without if-else statements, or the art of avoiding the if statement Not long ago I was looking for a job, and at one interview I was asked to create a function which returns true or false depending on whether the input is a number or not, but without using if, and I did it. In the beginning, I thought it would be easy peasy lemon squeezy, but then I realised it wasn't. This task made me think differently. Seriously, it's an exciting way to think outside the box and find new ways to solve old rusty problems that have always been done in the same way. I'm not saying that old is bad or that the if statement is bad, but if you see an if in an if in an if, it can be terrifying, unreadable, and confusing. if nesting makes code complex, and this can cause issues in the future. The complexity of code is measured by cyclomatic complexity, which is calculated by counting the number of independent paths through the source code, and every time we use an if statement we create a new path. I prepared little examples that can be used in a job interview or reused anywhere. So let's look at some ways to keep away from if. Example 1: Check if the value is a number First, let's try to make the function isNum which I already talked about above (from my interview). It's pretty easy; I think you will come up with something like this. This solution is really easy and straightforward. It can be shortened into one line and still be readable. Nothing more to say about it. Example 2: Find a celebrity by name Now, let's try something more complicated. Let's make a function which takes a name and returns the name of the celebrity with the provided first name. In this example, you see how large and unpleasant a simple function becomes if it is overloaded with if statements (the same goes for switch/case). The betterSelector example has many pros, like readability and ease of extending and supporting, but it has one serious flaw that I have to point out. If you provide a key that doesn't exist in your selectorObj, you will get undefined in the result, and this can cause many unpredictable issues in your project. Always handle values that were not provided. Example 3: Filter odd numbers The next example is naive: we will filter out all odd numbers and return an array of even numbers as the result. This example is very easy and obvious: the filter callback just checks whether anything is left over after dividing by 2; if there is, the number is odd and must not be included in the result. Example 4: Check if the object has a nested value Sometimes we have deeply nested objects which can contain data that we need. In these cases, we use something similar to these examples. In getDataOld we just check whether the path exists in the current object, then we proceed and repeat until we get our target value, or return false if one of the conditions doesn't hold. It's really not a reusable and scalable solution, but sometimes you can find something like this even in production code! To fix it I created another function, getDataNew. It accepts 2 arguments: the first is a paths array — just a string array which represents the keys of the object leading to the nested target field. The second is the object itself. In this function, I used a for loop to go through the fields of the object, saved the previous path in a variable with the same name, and overwrote data with the current path and eventually with the value of the target field. Summary. In the examples above, we see how easy it is to avoid if nesting and create expandable alternatives. If you don't want to create all these functions yourself, you can always use some pattern-matching library from npm.
Remember, there is no such thing as an irreplaceable if. But not every if must be replaced. Did I convince you not to use if everywhere? What pros and cons do you see? Please share with us. Anyway, I hope this article was useful to you. Thanks for reading.
https://medium.com/front-end-weekly/write-code-without-if-else-statement-or-art-of-avoiding-if-statement-4e44f0248c25
['Jhon Black']
2020-12-28 16:41:55.171000+00:00
['Technology', 'JavaScript', 'Software Development', 'Patterns', 'Programming']
Could Have Germany Built Nuclear Weaponry if Einstein Worked for Them?
Alternate History Could Have Germany Built Nuclear Weaponry if Einstein Worked for Them? And would this have led to an Axis victory in WWII? Albert Einstein and Adolf Hitler (Source: Wikimedia Commons) Albert Einstein has proven to be the brains of the 20th century, not only for the scientific and theoretical frameworks he came up with but also due to some of the decisions he made in the early part of the 20th century. One vital decision he made was leaving Germany, as he knew that Hitler would force him to create weapons of mass destruction for him. Hitler knew that Einstein had the knowledge as well as the capabilities to build weapons that would put the Nazis at a technological advantage in case war was to break out. To give you a bit more context before we dwell on the alternate side of history: Albert Einstein actually migrated to America in 1933, six years before World War II broke out. However, he was clever enough to tell that Germany would start another world war, due to German society simply becoming too indoctrinated by the Nazi regime on a specific ideology and, even worse, on made-up German history and tradition. The more direct reason for Einstein leaving Germany, though, was that Hitler put him on the blacklist (assassination list), as Einstein did not want to cooperate or work with such evil. If you want to read some more background on this please read this article I have written: A different kind of knowledge Now that we have a bit of background about the problem at hand, we can discuss not only Einstein's ideology on war but also Hitler's resourcefulness within his leadership. Yes, there is no other way to put it, Hitler was the reincarnation of evil; however, he was a pretty good leader, managing not only to build the biggest fascist political party but also to indoctrinate a whole country which, we can argue, was fairly well educated, something that should have made it more resistant to indoctrination. However, the art of persuasion does not look at intelligence, or does it? Adolf Hitler giving a speech at the terrace of the Royal Castle of the Lustgarten of Berlin, during his election campaign, circa 1920 (Source: Time Magazine) Taking into consideration that Einstein was one of the brightest geniuses to ever live, he did not get tricked by the Nazi regime; however, this may not necessarily be because of his high intelligence, but more because he had a different perspective on life. He had the mentality of knowing that there is more to life: rather than spending a whole life persuading a whole country that you are the best, why not spend a whole life actually becoming the best at something? Was Einstein able to build an atomic bomb? Now we need to get something straight: as I have mentioned in the article linked above, Einstein never created the atomic bomb; he only came up with nuclear fission, which is the technology used to create atomic bombs. That is the main argument we can have when it comes to the potential of the US Army creating the atomic bomb in 1945. Einstein writing the GR equations for the gravitational field in absence of matter (c. 1930) (Source: Wikimedia Commons) However, I would assume that, given time, Einstein would have been able to create an atomic bomb. Nevertheless, we must remember that many of the great engineers at the time were still in Germany, as many of them actually migrated thanks to Einstein, so if he had stayed and cooperated with the Nazis for whatever reason, they would certainly have done so too.
From a generic point of view, there is no doubt that Germany was overall the leader of the world, at least during the Second World War. This meant that they probably had the potential to create even better atomic bombs than the "Little Boy" and the "Fat Man" used by the US Army on Hiroshima and Nagasaki, Japan, in 1945. Therefore we can assume that Einstein's nuclear fission technology was the critical part which was missing from the hands of German scientists. Maybe this technology could have been implemented into the V2 rocket system to create a ballistic atomic missile. Locker-Lampson and Albert Einstein 1933 (Source: Wikimedia Commons) From a different perspective, we also need to take into consideration that Einstein was actually a pacifist who would most likely rather have died than produce anything for the Nazis, especially weapons of mass destruction. His creation of nuclear fission came about as an alternate source of power as well as for other scientific purposes, but never with the intent to weaponize this technology. At the same time we can bring into the equation the argument that without Einstein, we would probably never have seen the existence of nuclear weapons, or nuclear technology at all. Just as mentioned in my previous work, I believe that Hitler would have broken Einstein down to the point where he would have had no choice but to cooperate with the Nazis, which, as we have discussed, would have led to the creation of the atomic bomb before anyone else. This only brings another question into this interesting equation. Would Germany have won WWII using atomic bombs? This is probably the bigger question in this piece of alternate history and the more interesting one for sure. We first need to look at nuclear weaponry from two different perspectives. First, we have the conventional perspective of a weapon of mass destruction that has the potential to take tens of thousands of lives in the blink of an eye, able to tear down whole armies in seconds. The second perspective is the psychological effect this weapon brings: not only seeing the power of said weapon but imagining the sheer fear it would put into armies, knowing that no matter how many soldiers the Red Army had, they could be stopped in seconds. The other factor that we need to consider in this scenario is when this type of weaponry would have been built by Germany. If Hitler was to capture Einstein, it would most likely have happened in 1933–1934, therefore we could argue that by 1938 the Germans could have been in possession of at least a prototype of the atomic bomb. Knowing Hitler, even with how certain he was of victory during the first half of the Second World War, he would always come up with contingency plans; whilst often inefficient, this might have been one that could have turned the tide for the German Army. Fat Man Atomic bomb, 1945 (Source: Wikimedia Commons) If we consider that the atomic bomb would have been used at the beginning of the war, then the war would have been quite different, as rather than using fascism to put fear into the enemy, they would just showcase the power of nuclear weapons to do so. Of course, there would most definitely be a limit to the number of atomic bombs or nuclear weapons that could have been built, due to the lack of resources; however, by 1942 they should have had plenty of resources, including enough plutonium/uranium. Let's assume that Hitler would be clever and wait until 1944 to actually use this overpowered technology, as that is the year Germany was pushed back.
The main target, I would imagine, would be either Stalingrad or London: not only strategic targets but also a chance to showcase a different kind of power. However, thinking about it now, the western world was not extremely amazed by the Hiroshima and Nagasaki incident, so from a psychological point of view, I don't think that the world would have been super frightened. However, this is where we have a big difference: nuclear weapons in the hands of the Allies versus nuclear weapons in the hands of the Axis, especially Nazi Germany. That could be the factor that would have made people really fear this sort of weaponry. Focusing back on the question at hand: if Hitler had enough atomic bombs, yes, he would have been able to win the war. However, only two or three of these weapons — with my historical expertise of World War II — may have given the Germans more time, but I do not think that three nuclear bombs would have brought them victory. Let's assume that they did take out most of Europe, including Great Britain and the Soviet Union; they would still have to face their biggest enemy, America. Here we are talking about a different kind of power, a nation that had the manpower as well as the necessary resources to put an end to World War II.
https://medium.com/history-of-yesterday/could-have-germany-built-nuclear-weaponry-if-einstein-worked-for-them-31796f92de27
['Andrei Tapalaga']
2020-11-05 10:59:01.452000+00:00
['Alternate History', 'History', 'Science', 'World War II', 'Politics']
American Hungers
“Since the influenza broke out here there has been quite an epidemic and lots of people are dying,” the letter reads: There is not so much difference between the death rate in France and right here in this camp. Every few minutes an orderly comes in here with a death report and a couple more go down the street with a soldier who has cashed in and probably his feet will be sticking out. If his case is interesting he is put away with those for post mortem and the next day, we get a number of slides of specimens all over his body. The author of the letter is my great-grandfather, who was writing his grandfather in October 1918, by some estimates the deadliest month in US history. (My dad, the family archivist, still has that letter — a yellowed page of flowing and faded script.) At the time, my great-grandfather was a soldier in Camp Taylor in Louisville, where he worked at a lab collecting sputum samples from flu-infected soldiers that doctors would analyze. Those grim rounds, as the letter indicates, put him on the front lines of a pandemic that was deadlier than the trench warfare still raging in northern France: the “Spanish flu,” so-called because Spain remained neutral in World War I and, unlike Great Britain or the US, didn’t censor reports of the outbreak. Coronavirus is devouring the US right now, but the Spanish flu was even more lethal. Close to 500 million people, or about one-third of the world’s population at the time, were infected with the flu, and 20–40 million people may have died from it. (By comparison, COVID-19 deaths worldwide have recently topped one million, with a far larger global population.) The contagion even caused life expectancy in the US in 1918 to drop 12 years, from 51 to 39. All told, deaths from that flu outnumbered all the military and civilian casualties of World War I combined. In many ways, the medical breakthroughs of the last hundred years have insulated us from the scythes of pain and illness, so it’s jarring to realize that a 21st-century plague is ripping through the US practically uncontained. So much has changed in our national life this year that I felt like I was in a dream-state and also somewhat unsurprised to stand in a CVS store this summer and hear this ad come over the speaker: “Because of the COVID-19 pandemic, it has been estimated that 54 million Americans, including 1 in 4 children, will go hungry this year.” If “hunger” in the US today seems anachronistic, that’s because it essentially has been: That spot I heard was an awareness ad by Feeding America, and their own language on this topic is not about hunger so much as “food insecurity.” Hunger is a complex subject, in part because it’s largely subjective. To measure it, you’d have to take blood and analyze nutrient levels. Plus, life-threatening conditions of malnutrition no longer occur in the US, at least not on par with what people in the Sahel countries of West Africa might still suffer from. Instead, our dietary diseases tend to be overconsumption mixed with underconsumption. Consequently, food insecurity is a more applicable metric in the US, since it calibrates a person’s ability to access food on a reliable basis. Food insecurity often leads to yo-yo eating, in which someone may skip meals until a paycheck arrives and then binge on cheap, calorie-dense foods in lieu of more nutritious fare. Which leads to the paradox of a country with high rates of obesity where, during 2009, in the pit of the Great Recession, more than 50 million US Americans were food insecure. 
That level of food insecurity, pre-pandemic, should come as no surprise to anyone paying attention to the drop in our living standards over the past decade — an odd trend, considering that those standards have risen almost everywhere else across the globe. Only the US, Brazil, and Hungary are worse off today than when the Social Progress Index was launched nearly ten years ago. Slipping from 19th place (out of 163 countries) in 2011 to 28th in 2020, the US currently ranks in the company of Cyprus, Singapore, and Malta. Our health services are comparable to what people receive in Jordan and Albania. We’re number 100 in the world in discrimination against minorities. The coronavirus has exacerbated these ills: Around 10 million people in the US reported losing their jobs in the second half of March. By early June, unemployment claims spiked to 40 million. Layoffs soared. Food prices shot up. Two months into the pandemic, one in six adults was food insecure. Perhaps the saddest stories to surface, though, might be the 14 million children who don’t have enough to eat. In May, photographer Brenda Ann Kenneally set out across the US to chronicle people on the edge of hunger. The photos she took are of families at prayer, neighbors taking meals to their friends, volunteers delivering food to the homeless. But they’re also images of want: Kids cradling bags of donated bread. Kids sitting on crates reaching into bags of Cheetos. Kids eating spaghetti on the floor of a homeless shelter. Among households with children, one in three has reported having insufficient food. According to Lauren Bauer at the Brookings Institution, that is “the highest level in nearly two decades the government has tracked hunger in America.” Studying the faces of these kids, it’s hard not to feel that our policies have failed them. Back in March, Congress passed legislation that expanded food stamps benefits and broadened nutrition assistance for kids and made cash payments to most households. Since then, though, talks for a new stimulus relief stalled in August, and the chance of a bipartisan deal before the election looks “dim.” Which is just too bad for people waiting in cars lined up bumper-to-bumper en route to food banks. Or workers earning under $14 an hour. Or anyone who has suffered an eviction or a vehicle repossession or the loss of healthcare since March. From Justinian’s Plague to the Black Death to the Spanish flu, poxes and pestilences have been recurrent features of human existence. But what seems appalling about COVID-19 in the US today is that our response to it has been cavalier and scattershot. A nation that can freeze debt and print money without restraint and produce a relief bill if our lawmakers simply hustle has instead bungled this recovery. Our president admits he downplayed the coronavirus. A Cornell study found that Trump is the single largest driver of misinformation about the pandemic. Trump, along with many others, also rushed to reopen the economy. The result? With just 4% of the world’s population, we have 25% of the world’s coronavirus cases. One consequence is food insecurity — one of the starkest examples of our lack of investment in our own people. Now it seems that the liabilities we’ve ignored for generations have come due. For years we saw month-on-month gains in the stock market and assumed everyone must be doing better. We believed there was no need to underwrite affordable housing or healthcare or a higher minimum wage. And now time has found us out. 
These days, in the middle of another dark October, I’m inclined to think we have had such a high standard of living that we’ve lost any frame of reference for the great catastrophes of our history, such as the Spanish flu or the Great Depression. And yet the nation feels as if it’s quivering on the edge of mass unrest. So many ingredients for violence are simmering: the pandemic, an election that already feels compromised, the anger of a nation polarized along party lines. These are hallmarks of social decay. Physical and social hungers, persisting too long, lead to chaos, nationalism, democracy dying in daylight. Witness the circuslike squabbling of the first presidential debate. Trump’s attack style approximates (or shapes) our public discourse. The way he yelled over Biden and Wallace alike gave him a weirdly voracious aspect, as though the maw of his contempt could swallow everything before him — all the crowds, all the air, all the country itself. But the pandemic cannot be yelled away. The pandemic serves no agenda. The pandemic simply eats us. We are all its prey, even Trump, and the metrics and forecasts we’ve set for how long it’ll run its course or how many people it’ll kill all turn out to be Pollyannish. Seeing the death toll climb month-on-month, marching upward just like the stock market, it’s also hard not to wonder: Are we even trying to live up to our aspirations for democracy? Consider just a few of the darker anomalies about the US: We’re the only country to withdraw from the Paris Agreement. We have, by far, the highest prison population rate in the world. And — along with Cuba, Comoros, and Palau — we have signed but never ratified the Covenant on Economic, Social and Cultural Rights, which the UN adopted a half-century ago in the dream of guaranteeing people across the world the rights to housing, clothing, water, and food. Every administration for the past 50-some years, Democrat and Republican alike, has refused to honor this treaty. Instead, in roughly the same timeframe, we made the decision to set out in the opposite direction: suctioning $2.5 trillion each year from the bottom 90% of workers into the hands of the ultrarich. What else can we conclude, except that our legislators — and the vested interests that often dictate their decisions — regard workaday people as sores on the body politic? In an earlier post this year, I argued that the coronavirus has laid bare how the US only values people who can already afford to ignore disaster. Now it seems the situation is even worse than that. A plague, one of the nightmares of history, has returned, and we are looking upon ourselves for what we are: A country ravenous for profit, feeding on its own people. Special thanks to Alisha Wheatley and Nick Otten for insights and concepting and Matthew Zoeller for art direction.
https://wolfordcharles.medium.com/american-hungers-84b84a23dba5
['Charles Wolford']
2020-10-28 14:45:44.746000+00:00
['Food Insecurity', 'Elections', 'United States', 'Coronavirus', 'Hunger']
Death By Overdose or By Ventilator? People Seeking Recovery From Addiction Face Second Epidemic
Death By Overdose or By Ventilator? People Seeking Recovery From Addiction Face Second Epidemic Massive Budget Cuts to Addiction Health Services Risks 27 Million American Lives Imagine every ventilator in the United States being switched off, all at once. That’s what leaders are doing to recovery funding for addiction services in America — and they count on us dying before anyone notices. Before coronavirus, America already faced record numbers of deaths from a different epidemic. Over 350 people died every day from an illness just as deadly as COVID-19. This disease, which is more widespread than any flu, hides in plain sight in one third of all American households. It affects 27 million people, every day. It’s substance use disorder, and it’s killing Americans at record rates. In spite of these overwhelming numbers, policy makers cannibalize recovery funding to plug holes in our fractured health system. The stigma of addiction ensures that public servants feel justified leaving “addicts” to die, while earmarking funds that are meant for our community for coronavirus instead. I have spent months advocating and contacting several behavioral health and public health leaders and every one of them told me their hands are tied: any money for healthcare is dedicated to COVID-19 now. Nearly 71,000 Americans died of drug overdoses in 2019, and experts all agree that number will continue to rise in 2020. New data shows that the brief improvement in survival rates has been reversed, and now we are hitting a new record. This resurgence of substance related deaths coincides with the COVID-19 crisis. That’s no coincidence: the coronavirus pandemic is driving those death rates up, and the White House and many experts believe it’s only going to get worse. I have spent months advocating and contacting several behavioral health and public health leaders and every one of them told me their hands are tied: any money for healthcare is dedicated to COVID-19 now. Yet, where was this sense of urgency two years ago, or five years ago, or ten? Recovery advocates and harm reduction specialists made recommendations back then that would save billions of dollars and millions of lives now, whether a person is facing COVID or cocaine addiction. At the moment of truth, the recovery community is failed once again, by a system that is designed not to save us, but to sacrifice us because our lives always come second. On top of the double danger of untreated addiction and coronavirus, behavioral health services are the first on the chopping block as leaders take advantage of the COVID-19 crisis to steal from programs that save lives. Oregon, which is already last in the nation for access to recovery services, plans to slash $69 million from its 2021 budget. Colorado cut $26 million from its budget. Minnesota is reducing services in proportion to reduced payouts from Big Pharma. Many states are using COVID-19 as an excuse to strip severely underfunded services from people who were already on a razor’s edge. The result? More overdoses, fewer recoveries, and an annual death toll that is higher than the entire Vietnam War. They’ve decided that their priority should be robbing our most vulnerable populations, destroying hard fought gains that are beginning to show real progress on the front lines of the addiction crisis. Recovery advocates have pushed for years to get substance use disorder recognized by insurance companies, lawmakers, and healthcare providers. 
Yet, as soon as COVID-19 hit the United States, these experts and decision makers have thrown up their hands and decided they would rather let people die than do their jobs. They’ve decided that their priority should be robbing our most vulnerable populations, destroying hard fought gains that are beginning to show real progress on the front lines of the addiction crisis. Opioid related funding, which the federal government only released within the last few years, contributed to the small dip in death rates, but now we are right back where we started — and it’s getting worse every day. Addiction is a mental health disorder that is characterized by recurring substance use, feelings of hopelessness, social isolation, and obsessive thinking. It’s in the same family as mental health issues like bipolar disorder and obsessive compulsive disorder. Remission is possible, but requires intense, community based treatment and support. Many of us who have been healthy for years are now facing new challenges. COVID-19 precautions make it impossible to meet in person; the stress of quarantine intensifies mental health issues; and isolation compounds a potentially lethal feeling of hopelessness. Addiction related deaths include overdoses, suicides, accidents, and what are known as “deaths of despair.” Call it what you want, but these deaths have lowered the life expectancy for all Americans, regardless of race, class, and income. And lawmakers seem to have no problem bleeding millions of families, communities, and neighborhoods of the resources that are keeping them on the right side of the dirt. We are back to trying to help one another, and the death toll will reflect the severe limitations we face when we’re deprived of funding for housing, treatment, support services, and access to care. No matter where you look, recovery is being bled dry — without a plan, clear outcomes, or even protocols everyone can agree on. According to STAT, “Medicaid supports 21% of the country’s spending on substance use disorder programs. Though the federal government earmarked $425 million for behavioral health in its emergency coronavirus relief package, experts say the help won’t come close to filling the hole.” That means that the government is putting all its chips on COVID. The rest of us? We are back to trying to help one another, and the death toll will reflect the severe limitations we face when we’re deprived of funding for housing, treatment, support services, and access to care. People in and seeking recovery were already fighting for our lives in the national addiction epidemic, before coronavirus came along. Make no mistake: behavioral health issues, such as addiction, have a long term impact on the American economy and way of life, maybe even longer than COVID-19. People in and seeking recovery were already fighting for our lives in the national addiction epidemic, before coronavirus came along. Now, we face a war on two fronts. We face death on a ventilator or death by overdose, because leaders can’t keep their sticky fingers out of funding earmarked for recovery. If we don’t change that immediately and ensure that every possible measure is taken to keep people healthy, safe, and well during these double pandemics — we’ll all end up on life support. Ryan Hampton is a person in recovery from addiction and author of “American Fix: Inside the Opioid Addiction Crisis — and How to End It,” published by St. Martin’s Press. 
He’s a nationally recognized activist and organizing director at the Recovery Advocacy Project.
https://medium.com/an-injustice/death-by-overdose-or-by-ventilator-people-seeking-recovery-from-addiction-face-second-epidemic-4dea9c1d41c9
['Ryan Hampton']
2020-07-26 03:20:43.903000+00:00
['Mental Health', 'Addiction', 'Healthcare', 'Politics', 'Covid 19']
The Virago Newsletter for November 20th, 2020
Here’s your chance to catch up on the articles for this week: Confessions of a Former Spearmint Rhino Lap Dancer “Gentlemen’s clubs like this have a business model built on exploitation — there, I said it. Clubs are over-filled with girls, sometimes more girls than customers, and each girl is paying a fee to be there. We are self-employed and losing money in the form of ‘house fees’ before our shift even starts.” 4 Ways We Use Kindness as Currency and Why It’s Dangerous “In our society, kindness is a commodity. Acts of goodwill are often weighed down by conditions, including reciprocity. Receiving kindness bears responsibility beyond just paying it forward: nowadays, kindness often comes with expectation.” My Path from Party Girl to Homebody “Having spent much of my young adult life plagued by the ravages of love and sex addiction, I am acutely conscious of the blessings reaped from currently sharing a loving life partnership.” Thanks, But No Thanks “The opportunity to meet people and talk to other humans was, I have to admit, a big draw to the job with the insulting pay. The job was only 18 hours per week. It’s close enough I could walk or ride my bike over. There were several attractions beyond the pitiful pay.” This is What Happens When You Look for Inspiration in Uncommon Places “Change is a disruption. We resist change because we fear it. That resistance is the same as inaction, and if we don’t push past it, we stay stuck. We miss out on discovering something better.” Bipolar Disorder Makes My World Smaller “I didn’t feel like I was being a good mother, which I wanted to be more than anything. Trying to self-medicate with alcohol and drugs with my new boyfriend didn’t help either.” It’s Okay if You Don’t Change Your Name “I feel your name is the most important part of your identity. If I had changed my name, every time I signed, I would feel I am under him or belonged to him in the literal sense.” 8 Myths of Domestic Abuse “Abusers make conscious decisions on who, how, and when they decide to abuse. Drinking or other habits can exacerbate violence, but it’s not the cause of the abuse, and is only an excuse for their irreprehensible behavior.”
https://medium.com/the-virago/the-virago-newsletter-for-november-20th-2020-9d8836d1b6e1
['Michelle Jaqua']
2020-11-20 15:11:56.078000+00:00
['Newsletter', 'The Virago', 'Feminism', 'Women', 'Writing']
How Will I Know if I Need a Camera Upgrade?
Autofocus and Exposure Problems Another good reason to consider upgrading or not is autofocus. I remember when I first started shooting, I went from an introductory camera with 7 autofocus points to one with 45. And now, I shoot mirrorless, so I can essentially put my autofocus point anywhere. Anyway, if you're finding that a lot of your key shots are missing focus and this is costing you otherwise very good photos, it might also be an indication to upgrade. It might also be a wise upgrade into the world of mirrorless due to things like superb subject tracking and even the ability to follow people's eyes. But keep in mind, problems with autofocus may also be related to your lens, not your body. If you're missing the correct exposure often, say, because you shoot on a DSLR, it might also be time for an upgrade, again, to mirrorless, as the EVF's preview is very good at showing you your exposure. This means that you'll hardly ever miss exposure again. But at the same time, if your autofocus is hitting often enough to get the shot and you're pretty good at getting your exposure right, there may not be much reason to upgrade in this department. Better Dynamic Range and Lowlight Performance If your camera sensor, in terms of dynamic range and/or lowlight performance, is not performing up to par with what you need, this could also be a superb reason to upgrade. A good upgrade path to increase dynamic range and lowlight performance is to go from APS-C to full-frame, where the larger sensor means cleaner images due to the larger area. I find that going from APS-C to full-frame increased my low light performance by an entire stop, which is huge at an ISO higher than say 1600 or 3200. Dynamic range is also a very good reason to upgrade. Even new full-frame cameras have better dynamic range than older full-frame cameras, so this is not just based on sensor size, although a bigger sensor does typically give better DR. If you shoot landscape and you find that your current camera has poor highlight or shadow recovery, then yes, an upgrade to a newer camera is justified. However, for someone like me, who shoots portraits in more or less even lighting, dynamic range has hardly been a concern. If your work doesn't require you to process photos that have harsh shadows and harsh highlights, this reason to upgrade is not really applicable. Access to Professional Features Sometimes you upgrade your camera just for more professional features. Maybe the joystick is a convenient tool to have to change focus points. Perhaps a top-down LCD screen for viewing your settings on the go is something you like. Maybe you're dying for better battery life. These types of features are usually available on more professionally-oriented cameras, but I typically wouldn't recommend upgrading based solely on small things like this. However, if you are upgrading for a professional feature such as dual card slots, I would support upgrading for this alone. I have never had a card fail on me yet, and I sincerely hope it never happens. But especially for client work, having a second card slot is serious peace of mind. While I find the other reasons I mentioned for potentially upgrading important, getting a new camera body for dual card slots is justifiable in many instances on its own. Wear and Tear Finally, if your camera has been around a few years and it is having issues with its shutter, or maybe the rubber areas are falling off, then it might be time to upgrade here.
But Keep in Mind… Keep in mind that a camera upgrade also doesn't mean buying the latest and greatest camera out there. It could also mean upgrading to a camera that's a generation old, for example. I'll also add that if your camera is getting the work done, and getting it done at the quality you want, then I would say to reconsider an upgrade. But if you find that your camera is limiting you from doing better work, or at least from doing better work more often, then an upgrade makes sense. Remember, your camera is a tool, and it won't magically make your photos spectacular. The point of your gear is to place as few restrictions as possible on your creative flow and expression. You know you have good gear when it puts up very little barrier between you and creating art. If you feel like your camera body is a barrier, then it may be time for an upgrade.
https://medium.com/photo-paradox/how-will-i-know-if-i-need-a-camera-upgrade-1b21ccfe9052
['Paulo Makalinao']
2020-10-12 19:13:57.796000+00:00
['Creativity', 'Photography', 'Art', 'Cameras', 'Tech']
Should Manchester United Be Looking For A New Manager?
Gameweek 13 was a tough one for some of the Premier League’s most under pressure managers, and that includes Ole Gunnar Solskjaer. Manchester United found themselves 2–0 down after 70 minutes at Sheffield United where they were, to be completely honest, awful. However, three quick goals in the space of nine minutes saw them turn the game around, and for a moment it seemed they might actually get out of there with the three points. That wasn’t to be though as Sheffield United found an equaliser which saw them share the points at a final score of 3–3. It was the least they deserved and it also prevented a frantic ten minutes from masking another disappointing Man United performance. Two more dropped points leaves Solskjaer’s United back in 9th place, already 9 points down from the top four. Recently there has been a little more optimism from some regarding Man United, mainly down to a couple of Premier League wins and a League Cup victory at Stamford Bridge. If you actually look at their Premier League form though, they have just those two wins in their last eight games with three draws and three defeats making up the other six. For some more context, they are just as close to the bottom of the table as they are to the Champions League places. Also the signs of this struggle were there as we came down the stretch last season. In the final nine gameweeks of the 2018/19 season, they won only 8 points with a goal difference of -9. That’s less points than relegated Fulham and Cardiff managed in the same period. Combining that with their record so far from the Premier League in 2019/20, they have 6 wins, 7 draws and 9 defeats, leaving them with a total of 25 points from 22 games. At this point we’re working with a fairly large sample size of more than half a season’s worth of games averaging not much more than a point per game. Despite all of this, Ole Gunnar Solskjaer isn’t getting as much criticism as you might expect. Consider how much heat Unai Emery is getting for delivering similar results at Arsenal while Mauricio Pochettino has already been fired by Tottenham for a slightly better record, and that’s despite five years of success which includes a Champions League final just a few months ago. So what are the circumstances shielding Ole from that kind of pressure? Firstly, there is the record of the past few managers at Manchester United since Sir Alex Ferguson’s departure. David Moyes, Louis van Gaal and Jose Mourinho all failed to achieve the success expected of them, which in turn has lead to the fans’ anger being redirected higher up in the club structure. Obviously a lot of that criticism is warranted as United’s very ineffective recruitment strategy has hamstrung all of those coaches to an extent. With that being said, this is not a situation where blame must lie only with one or the other. Man United can be in need of different personnel in charge of their transfer policy while also needing a new manager. On top of that, Solskjaer was involved in the transfer process during the summer, he was one of the main figures pushing for Harry Maguire to be brought in despite the huge transfer fee. Even if he wasn’t though, their form in the past 7 months or so cannot be justified by the state of their squad, that is unless you really believe it is at the standard of a team near the bottom of the table. 
Another issue, which kind of stems from the lack of success of previous managers, is the idea that firing the previous coaches didn’t bring about great results so there’s no point doing that again now. Instead, the theory goes, that regardless of the results Ole should be given a couple of years to build his team because that is what’s required in order to assess his work properly. Well before even going any further, we’re approaching one year since Solskjaer took over and I think it’s hard to say there’s been any meaningful improvement. This means the argument that other coaches, Guardiola and Klopp are used regularly as the examples, took time to build their teams so Ole deserves that too, basically falls flat. Even before Pep and Klopp had the right players to get their teams functioning as they would like, there were signs of improvement and a clear direction of where the team was going. I’m not sure you get the same impression from Solskjaer’s Manchester United team. Of course the other flaw in this argument is that Pep and Klopp were established top tier coaches. That is again not the case for Solskjaer. Aside from the fact that he is a Manchester United legend who knows the club, what are his credentials for being given a few transfers windows to build his own team with no questions asked? Surely if you’re going to be patient with a coach and decide you’re going to give him time to build a project capable of bringing long term success, you have to be sure you have the right guy? Personally, I just don’t see Solskjaer as that person for Man United. Solskjaer being a club legend brings us to the final reason that he doesn’t face the kind of scrutiny other managers in his situation do. His place in the club’s history buys him more time and more sympathy, not only from the supporters, but also from the club. To see this you only need to look at the difference between the United fans’ treatment of Solskjaer and Emery’s relationship with the Arsenal support, despite similar results. Now this one isn’t really something I’m disputing, of course someone with a strong relationship with the club gets more favourable treatment. It is definitely worth mentioning though, because it plays a huge role in the leniency he receives from sections of the Man United supporters, and from parts of the media. So what do I think Manchester United should do next? Well at the least they should be seriously considering whether Solskjaer should be in charge next season. Even if they don’t want to pull the trigger during the current campaign, a continued absence of notable improvement should force them to start looking for a new coach to take over for the 2020/21 season. Should their Premier League form continue as it is now, then I think they may even have to take action before then. There’s only so long you can make excuses for Ole with their last 22 league games representing the kind of form you’d expect from a team around 15th. This isn’t to necessarily say he should be fired if they fail to win next week, but at what point do you start holding your coach accountable for these results? Normally I think clubs should try and wait further into the season, ideally the end of the season, before changing their manager, particularly when they’re in a situation like Man United. Their objective of a top 4 finish is in all likelihood beyond them, and they’re not exactly going to get relegated, which means there is no emergency pushing them to make a change. 
However, the coaches available on the market make this a more interesting situation. Mauricio Pochettino is the main name mentioned, as he was linked with the job last season and the general consensus is that he would be a good fit at Old Trafford. While there are arguments his appointment would end the same way as his predecessors' because of the problems higher up at Manchester United, we have already seen what Poch can do with a club that doesn't have the best recruitment, albeit for different reasons. So, with Pochettino now available, United are left with a clear target to pursue should they want to move on from Solskjaer. That may well increase the pressure on their current manager, as he'll need to deliver some better results in their upcoming schedule. Next up is Aston Villa at Old Trafford, although that is followed by games against Spurs, Man City and Everton. It's likely to be a make-or-break period for Solskjaer.
https://jackmcc98.medium.com/should-manchester-united-be-looking-for-a-new-manager-7309bc5000d1
['Jack Mccutcheon']
2019-11-28 15:09:15.824000+00:00
['Premier League', 'Football', 'Manchester United', 'Sports', 'Soccer']
Just Do The Work
At 30, I was full of beans. I wanted to know how to be successful, and I knew there had to be someone out there who had the answer. So I pursued that person relentlessly. I pushed hard to find what I thought I didn't have, and in that, I forgot about the only thing that mattered — The Work. Fifteen years down the road, as I browse the articles here on Medium, it becomes quite apparent that many people are just like me: seeking answers to problems. Now, that's all fine, but the elephant in the room here is that while we're looking for answers to problems, we're focused on the problem. We are immersed in our lack of success, of skill, ability, whatever – you name it. So what's the problem? The problem is we have a linear concept of life and work, and in that, we believe that there is a future towards which we must toil. We believe that in this future, the place where we are not yet, there is a better version of us waiting for us to arrive, but only if we make the right decisions and do the right things. We are so caught up in solving problems, in the possible future outcomes of our action or non-action, that we forget about the work. People, the work is the only thing upon which we need to focus. Everything else is a distraction, or at best, B grade activities that do not of themselves bring about change, cash in your pocket, or success, whatever way you may define it. Consistency Is Key Now, I'm not telling you something you didn't already know here. You've likely read of this maxim of creativity before now, but it bears repeating. Turning up day after day, engaging in the work you decided to do is of primary importance, for the only reason that it should be important: the inherent enjoyment you extract from doing it. When you and I enjoy the work we do, it sustains us. In the absence of applause and positive feedback, the work itself is the only thing that feeds the creative soul and therefore should be, first and foremost, the reason you get out of bed every day. Go to work. Make your stuff. Create yourself daily through the work you do, because you see, when you do, you stand yourself apart from millions who don't. The truth of the matter is that most people dread work. Jesus Christ! What a shitty way to live! That's not living, that's just about surviving. Annie Dillard, in her book The Writing Life, wrote: "How we spend our days, of course, is how we spend our lives" Page 32 of The Writing Life by Annie Dillard She was writing of the merit in schedules for helping us keep chaos at bay. And that's what happens when we are not acting on purpose daily; we're torn asunder by the chaos of our indecisiveness. No One Else Has The Answers I'll be honest with you… I used to question myself a lot. I still do but not as much as before. These days I occasionally wonder if what I'm doing is going to be fruitful, but then I remember that this is all I have, and whatever I'm doing is right. The shift was gradual. One typical kind of day, driving down the M50 motorway, it dawned on me that time is an illusion. The future was a fallacy, and all that exists is now. At that moment, the ever-present anxiety that came with the pursuit of success seemed to dissipate. I still needed time to detox and wean myself off the old concept, but something changed that day. Today, the feeling that I am doing something I shouldn't, or that I'm not doing something I should, is, I am happy to say, gone. I realise that nobody else has the answers to my questions. 
No one has lived me before; there is no manual for Larry Maguire. So I believe we have no choice. We must immerse ourselves in our chosen work for no other reason, at least primarily, than that we are curious and engaged. Step into the fire and see what happens. Forget about how you're going to make it and just do the work.
https://medium.com/the-reflectionist/just-do-the-work-556a2567e2ef
['Larry G. Maguire']
2019-07-29 20:03:37.142000+00:00
['Life Lessons', 'Work', 'Art', 'Daily Thoughts', 'Creativity']
99.5% of all People in the World are Financially Illiterate
Becoming wealthy has nothing to do with luck or making a fortune. Most of the millionaires just learned to follow these principles daily. Instead of following them, 99.5% of society chooses to complain and rant. We are living in the best times ever to make money. The internet has opened up so many opportunities. They are there, and everybody can grab them. But it doesn't matter whether you have 500 EUR monthly income or 50,000 EUR. The principles you have to apply are the same. These are the principles that I learned at commercial school and that helped me through times when I made almost no money. Without getting any financial support from my parents or society, I was able to get a master's degree in economics, some other degrees on top of that (martial arts, communication, and psychology) and build a bit of wealth for myself. What are these five significant mistakes the poor make? How could they overcome these problems and become as rich as f***? You have to consume The pandemic has changed a lot. The economy was shut down. And two months later, it was re-opened. Immediately, politicians mentioned "the brave citizen's duty to consume", to save the economy. I never understand why people are so proud of possessing many things. Many of the "eat the rich" complainers I know have tons of clothes, two or three cars, and the little house in the prairie on debt. Why do people value consumables so much that they spend all the money they make in a month on things they mostly don't need? How to overcome this problem? Learn how to budget One habit I learned in commercial school was putting together a budget. It doesn't matter what income level you are at. When I was a student, I lived by the same principle as today. The budget didn't change. I just put it as a percentage of my total income and fixed it. How does that look? 50% in needs I put 50% of my income into paying rent, buying clothes, food, and beverage. Everything to keep my body running. Basic needs. When I was making less than 1,000 EUR, it meant that less than 500 EUR was available for those needs. When I was a student, of course, I wanted the 200 sqm apartment, the nice Audi A6, the trip to the Maldives. Like my other colleagues, who chose to jump directly into work-life while I was spending my nights studying. But with only 500 EUR available, I didn't want to afford that. Either I had to accept my situation or stay broke forever. The other 50% are available for other activities. A regular job is not for me — I deserve better. Some might say, well, this guy had 1,000 EUR a month while he was at the university. He was well off. 1,000 EUR back then is about 1,600 EUR in today's purchasing power. Why does he try to lecture the poor who merely make a living? Well, my main goal in my 20s was to get this damn master's degree. It was my vision of "opening the door to a rich life." Unfortunately, nobody gave me money — for various reasons. But instead of giving up and just joining the army of complainers, I needed to raise money to get this degree. Without an income, a student loan was not available in Austria. And this was great luck for me as it helped me to create the right money patterns. So what was the alternative? I picked up any stupid job I could find. I was put down and humiliated very often by people whom I served with my labor. But, I had the goal, and this was necessary to go through. In my mind, it was just a short phase to overcome. And this exactly is the second problem I see in many people. They do not want to work. 
And I can imagine those people saying, "He doesn't know what he speaks about. It is so difficult to find a job these days." Back in the 90s, email was not accepted as a means of communication. When I wanted to find a job, I had to write a proper letter on the computer, print it out, put it in an envelope and pay 1 EUR to get the letter delivered to the recipient. Applying for a job was hard work and did cost a lot of money. For every 100 letters I wrote, I got one paid job. Sometimes I had two or three jobs at the same time. Meanwhile, I was studying through the night hours. But it trained me well for the coming years. Financial success evolves from being of service to others. Saving money is useless — inflation eats it all. One of the principles I learned in commercial school was: Always make a profit of 20–30%. For private people, it means: Put away 20–30% of your monthly income. Period. This is a simple and powerful rule. But the excuses are manifold. "I do not make so much money. You do not understand my situation." "I have so many payments to make. I cannot save money." Or my favorite one: "Saving is for losers. Inflation eats it all." Or lately, "The interest rate is zero. Why should I save money?" When I think 20–30 years back, I worked my butt off and studied every night. And yet, I put 20% of my income away. Let's stay with the 1,000 EUR per month example. It means putting aside 200 EUR every single month. That leaves 800 EUR for everything else. And this is the number one principle every successful person or company follows. Know your income, adjust your expenses, and factor in 20–30% profits. The 200 EUR goes onto a savings account. Nothing sophisticated. It is just there to build an emergency fund. I was delighted to have that available in March 2020. Very often in my life, a job loss or similar tragedies didn't make me panic, as I knew I had built my safety net of 3–6 months' expenses in a savings account. That was enough time to find a new job or rebuild the business. Save to invest The fourth principle. Once the emergency fund is filled up, money is available for investing. Well, many people fall back into the "spend everything" pattern. Smart ones re-allocate the available resources to something that has the potential to build another income stream. For all those who work themselves out of an employment position, I particularly find investing in the stock market an attractive next move. Why? Today, in 2020, it is a simple process with all the online brokers around. The cost of allocating capital is relatively low. In case anything goes south, the emergency fund is eaten up, and all financial income streams go bust — money is quickly available. These three reasons were the main ones that made me move into the stock market in the 90s. Well, it was not as easy to invest back then as it is today. ETFs were not available, and online brokers almost didn't exist. But, in Europe, investment funds for ordinary people like me were available. The first ones started with monthly installment plans. Little did I know, so I chose two plans and allocated 20–30% of my income into these plans. Monthly. Most of the time, I wouldn't say I liked it. While my friends got their new cars and told tales of dream vacations, I was working every day and studying in the night hours, weekends, and every single holiday. Due to the savings plans, no money was left at the end of the month for luxury. Now 20 years later, it was worth it. 
But this is a different story. Read, read, read At the end of School we had a huge celebration. Everybody was happy to get their final exams. Many of my friends said: “School is over. I am done with learning.” I never had this urge to quit learning. Life is so exciting. There are so many exciting topics in the world. I always have to stop myself from becoming a Jack of all Trades. I am interested in everything. The thing is, never stop learning new things. Never stop increasing your expertise. Always go for the next seminar, webinar, or college degree. Warren Buffett has made this one principle of his successes. In interviews, he said that he spent all day reading annual and quarterly reports — nearly 500 pages per day. Bill Gates is known as an avid reader. Rumour has it that he reads a minimum of one book per week. There are so many successful people in the world who say that the basis for their success is learning.
https://medium.com/the-innovation/99-5-of-all-people-in-the-world-are-financially-illiterate-8bd39139b882
['Christian Soschner']
2020-11-08 14:31:21.299000+00:00
['Literacy', 'Money', 'Financial Services', 'Success', 'Entrepreneurship']
Appalachia — without the classism and caricatures
By Sana Saeed The Idea “OK, let’s start thinking of a series idea — something about the environment, the energy industry. Use DAPL as your springboard.” This was the mandate my team of four AJ+ producers was given in November, some days after the U.S. presidential election. We not only had jobs to do, but now, more than ever before in our young careers, we had to become harbingers of accountability and sanity in a time in this country — and in this world — when more and more people arguably had little of either. The Standing Rock protest against the Dakota Access pipeline (DAPL) was in its seventh month, and we had covered it extensively. The protests were a reminder not only of the incredible resilience of the indigenous communities of this continent, but also how the energy industry just isn’t interested in the way its policies impact the lives of everyday people. We wanted to tell those stories — the ones about the Americans at the center of the energy industry. Our discussions led to the topic of the coal industry and the Americans who live in Appalachia. The conversations followed the same predictable route. “Appalachia is the poorest region in the country.” “It went to Trump — even some places that were for Obama in ’08 and ’12! How?!” “The accents will be interesting!” Our rhetoric was your typical, bubble-induced, classist talk that, while lacking ill intention, reduced the experience of Appalachians to only the reductive images and tropes we, ourselves, had been exposed to. When we thought of Appalachia, we saw poverty, Trump and cultural caricatures. Kentucky Mine Supply Company in Harlan. We should have known better. We all come from different socioeconomic backgrounds and experiences; my own family lives in a rural town in Canada dominated by the energy industry. But that’s the thing about ideas and perspectives: You can hold some that are contradictory and not realize that they are, in fact, contradictory. Our weeklong trip to eastern Kentucky — the site of the coal industry’s former glory, and now its graveyard — would leave us with a better awareness of the misrepresentations of Appalachia, as well as the coal industry’s deep impact across the region. The Arrival Getting to Appalachia isn’t easy when you’re coming from warm, sunny, dietary-restrictions-friendly San Francisco. We knew the weather would be cold (the lowest temperatures we experienced were about 9 degrees Fahrenheit), and the two of us whose tummies weren’t of U.S. coal and steel grade knew that food was going to be an issue. Before booking our flights, we had already scoped out potential eating spots — and found mostly fast food options. Our senior producer Maggie and editor and cameraman Michael are ready to board. Three of the four team members took the trip: Maggie, Michael and myself, Sana. We flew from San Francisco to Knoxville, Tennessee, and from there, drove two and a half hours to Harlan County, the infamous site of a bloody, fierce, pro-union miner strike in the 1970s. Harlan County isn’t unknown in American pop culture. It was featured in Barbara Kopple’s Academy Award–winning 1976 documentary, “Harlan County, USA,” about the protest movement. It was also the setting of the popular FX show “Justified” — which doesn’t exactly leave you with the most positive and glowing image of the area. To me, though, it looked like the small, rural, tar sands Albertan town where my parents live. It was cold, gray and surrounded by natural beauty; there were few people on the streets, but lots of trucks and semis. 
We went in hoping to tell the story of the coal industry and its impact on the region without falling back on familiar tropes of Appalachian poverty and despair. The People We met and spoke with some incredible, intelligent and warm Appalachians who were all, despite their ages and history, actively involved in reviving and preserving their culture, environment and community. There was Carl, a retired coal miner and Marine Corps vet who could tell you stories for days. (I think he actually might have.) After years of working for labor rights in the United Mine Workers of America, Carl is dedicated to promoting sustainable and renewable energy alternatives. Carl grew up in Benham, one of several towns that was built specifically by coal companies for workers. We sit with Carl in the town of Benham, where he was born and raised. We also dined at Daryl and Brad’s Heritage Kitchen in Whitesburg. Daryl and Brad, who are a couple, left their corporate jobs in Lexington to open a restaurant in rural Kentucky. They are trying to preserve classic Appalachian cuisine while also promoting local, healthy eating. Maggie talks to Darryl and Brad in their restaurant, Heritage Kitchen, in Whitesburg. In Harlan, we sat in awe and inspiration of Kim and Roy, another couple who own a small business. Despite having the opportunity to leave Harlan (and they did for a time), both have made a conscious effort to center themselves where their families had lived for generations. Having experienced extreme poverty, their goal is to help cultivate jobs and a diverse economy through their own small welding business. Kim and Roy talk to us about their struggles and their growth. There was Destiny, a fiery filmmaker who wanted nothing more than to dispel myths about Appalachia. And Eric, an enthusiastic man with the Appalachian Citizens’ Law Center whose focus is pushing through policies to facilitate a transition away from a coal-based economy. We sat with Jeremy, an electrician in the coal mines, who doesn’t wonder if he’s going to lose his job, but when. We listened to Rutland, a retired coal miner, Vietnam vet and anti-mountaintop-removal activist, tell us the history of black coal miners in Appalachia — a demographic often ignored, if not erased, from stories about the region. Rutland sits in his home, in the town of Lynch. And then there was Herb, a filmmaker running one of the single most important institutions, as we learned, in Appalachia: the Appalshop Appalachian Media Institute. The institute has been instrumental in not only promoting film and art in the area, but also in preserving the history of Appalachia. Herb — his mustache was pretty awesome. We went to a Christmas party at a local tattoo parlor in Whitesburg where young men with follicle fortitude that would make any young urban hipster green with envy, played old-time Appalachian music, drank beer, shared their art and met with friends. John Haywood, Kevin Howard and Russ Griswald performing some old-time music. We attended a Sunday morning church service, filled with locals and travelers who sang hymns and carols together. Parishioners pack the Mt. Sinai Church in Lynch on Sunday morning. In a way that I still don’t understand, the entire experience was emotional. Maybe it’s because I saw my neighbors from northern Alberta, Canada, in the mountains of eastern Kentucky. Maybe it’s because there was a stillness in life that you can’t find in urban areas. Maybe it’s because many of my assumptions, which I didn’t even realize I had, were proven false. 
The Story As I’ve mentioned, we didn’t want to do a story of poverty and despair in Appalachia — but not because that isn’t there. The poverty and despair are evident as you drive through the towns of Harlan, Lynch, Benham and Cumberland. They’re evident when you cross the border into Virginia, into small towns, which at a glance seem abandoned. And yes, Appalachia is home to some of the highest rates of substance abuse in the country. All of that is part of the story of Appalachia, of coal country — but that’s not really The Story. Appalachia is a region that gave its youth and blood to build America through an industry that ultimately chewed people up and is still slowly spitting them out. But despite that pain, Appalachians remain proud of their history and what they see as their sacrifices in service to this country. There are paradoxes all around. While the region went to Trump in the election, the numbers of those who went out to vote weren’t exactly overwhelming. And none of the Appalachians we spoke to, including the one who voted for Donald Trump, believed their next president was going to improve their economic conditions and future. We consistently heard about how they had been forgotten. And we kept encountering, in an almost humble way, this incredible pride in the region’s civil rights history and the sacrifices of black coal miners. Wherever we went, we saw representations of black coal miners that weren’t just a forced tokenized nod to diversity; it wasn’t a superficial homage to their labor, sacrifice and the injustices they faced, but a real recognition of it. We consistently heard about how, after working together as brothers in the coal mines, black and white workers would emerge from the earth into a Jim Crow world that separated them based on their skin color. And that just didn’t make sense to many of them. Carl, who grew up playing with Rutland, poignantly said, “Our schools were segregated, but after, we’d play together, go eat at each other’s homes.” There are many scars across eastern Kentucky: from Vietnam, from black lung, from the constant uncertainty of being able to provide food for your family. But in spite of those scars, there are people like the ones we spoke to who are working tirelessly to rebuild their communities, protect their environment and preserve their cultures. That’s the story of Appalachia. For more, here’s the first in our three-part series:
https://medium.com/aj-story-behind-the-story/appalachia-without-the-classism-and-caricatures-c1a71d9b271
['Aj']
2017-02-07 22:19:28.474000+00:00
['Journalism', 'Appalachia', 'Energy', 'Economics', 'Coal']
The Surprisingly Simple Way You Can Make All Year Ramadan
No, the answer is not the six fasts of Shawwal. It's surprisingly simple, but getting to the answer will take a lot longer. Alhamdu'lillāh, I hope the month of Ramadān and 'Eid-ul-Fitr went well for you all. I for one had an incredibly busy but positive month and hope you did too. I pray that all the good we did is Divinely accepted and that the light of that good remains with us for days to come. Throughout the month of Ramadān, I was asked questions on how to deal with life after Ramadān. For example… "How do we continue the good work we've done in Ramadān after 'Eid?" "How do we avoid the post-Ramadān dip?" "What is the secret to maintaining the habits we had in Ramadān?" My answer surprised them and I'm going to share it with you, in shā' Allāh, with the memorisation of the Qur'ān in mind. It all starts with Ramadan itself: You get what you give Well, that's true but there is a twist. You get what you give when it is realistic. In Ramadān, we always ask ourselves two important questions: What is the purpose of Ramadān? — The Qur'ān, our relationship with Allāh etc What is the purpose of fasting (Siyām)? — Striving to be chosen for attainment of Taqwa, and to begin mending ourselves physically and spiritually etc These are the usual questions but we don't tend to ask what our purpose is after Ramadān. This is where we lose track, asking instead, 'What am I going to do for 'Eid on day one, day two, day three…' We ignore all the other important questions. What do you want to achieve after Ramadān? Where do you want to go in life? And how are you going to get that? All of us have talents, and all of us have a type of genius; how are you going to use that and impact your world? Imagine you're in a months-long internship with me. I give you a manual to follow from day one with all the answers in it. I also tell you what you're supposed to achieve by the end of it. When that happens, you're going to have a party. When the party is over, I say you did well and I provide my feedback. After all of that, it is still down to you to use the training and experience you got. You can either continue down that path or take another path. ^ That is the month of Ramadān. Before we can answer this question, our Ramadān should have been geared towards an easy transition outside of the month. Let me put this into perspective. When compared to our experience, the month of Ramadān for our beloved Prophet (ﷺ) did not involve such significant changes in quantity and quality. Especially when it comes to prayer. Yes, there were elements in the Sunnah that were increased in the blessed month but in our context, we tend to increase everything. We increase food intake as well as food waste. We increase ritual, spiritual and physical worship. This included reciting, listening and memorising Qur'ān. This included praying all obligatory (Fara'id), Sunan and supererogatory (Nawafil) prayers. This included abstaining from sin and increasing in goodwill and charity. All of which we would never do in ordinary day-to-day life. Imagine if I told you to do all of these things daily; for most people, sadly, the response would be like, 'Come on, get real'. It is always the case that when someone takes on too much, they tend to drop it all at some point. The point of Ramadān isn't the action; it is what exists within and around the action. This is why I had an unusual month of Ramadān (2016). I will tell you what I never did that shocked everyone I gave an answer to. 
I didn't recite the whole Qur'ān, neither whilst leading the Tarawih nor outside the Tarawih. Instead I recited and studied selected verses and chapters intensively and taught those throughout the month. Yes, I know. We should be doing a Khatm; we should be doing it more if we have memorised. I'm not going to shy away from the fact that I didn't. There were reasons behind it. My entire Ramadān was geared towards an easy transition outside of the month. So my whole experience was about quality of action. I was doing roughly a page, day and night, instead of a Juz night and day. If I can't continue a page a day after Ramadān I'm in trouble! :/ So when Ramadān comes along next year (may Allāh allow us to reach it once again), prepare ahead with the rest of the year in mind. The next part of my answer was an explanation (Tafsir) of Surah al-Fātihah in the context of our world today. I will skip this part and go to the final part about practicality. Doing small things every day over a long period That's it. That's the answer. It's not doing big things every day over a short or long period. Experimentation The story I'm about to share is not mine. There was a story that I put out as an experiment early this year, Feb 2016. I first shared it via email only. After that, via this site and onto social media. There was a catch. The thing about it was that it was all an experiment. An experiment on all things internet, numbers and much more. Shockingly, it was all based off another article published in early Jan 2016. A few people got back to me pointing out that they had seen something almost identical. Whilst others, I think, figured out something was up. I congratulated them for paying attention. My intention was not plagiarism, telling lies or creating content for the sake of content. That article was one that resonated with me, and I saw potential for somebody to replicate it from the core principles it laid out. Applying it to the memorisation of the Qur'ān would have made for a great test. My hope was that someone could be inspired to implement it. That from within the thousands of people in our community, one would reach out to me and say this is what I did. I didn't get the latter but I did the former. If any of you read that article and applied it, get back to me with the results. This was the article! The practical advice that I gave to those people about taking Ramadān outside Ramadān was based off this. It turned out that both a brother and a sister did exactly this and it worked for them. Alhamdu'lillāh. How to build life-changing 'habits' through tiny changes Abu Hurayrah (may Allāh be pleased with him) reported: The Messenger of Allāh (ﷺ) said, "Take up good deeds only as much as you are able, for the best deeds are those done regularly even if they are few." [Sahih, Ibn Majah] Our beloved, Lady 'A'ishah narrated, The Messenger of Allāh (ﷺ) said, "…the most beloved action to Allah is the most regular and constant even though it were little." [Sahih al-Bukhari] These narrations begin to make more and more sense. Is it now making more sense to you also? We've heard people recall stories about having memorised in 20 days, 50 days, 2 and a half months, and 6 months. These timeframes are not important; what is important are the lessons derived from them. A key trend in them is efficiency, smart use of time, determination and consistency. But we don't often hear the stories of those who spent longer accomplishing the goal. They did it by doing less, consistently. 
Last year I was introduced to a concept called, Tiny Habits. I had also been encouraged to become certified in it by our community members. Tiny habits is about establishing (not just creating) new behaviours into your life. Rule #1: A tiny habit, according to Fogg, is a behaviour: You do at least once a day Takes you less than 30 seconds Requires little effort Tiny habits must match the criteria above because the easier the behaviour, the less it depends on motivation. Rule #2: Tiny habits are designed to come immediately after an existing habit. You use the existing habit to trigger the new tiny behavior you want. This can be built upon and applied to many things. Let’s take the four principles that help build habits. These principles have worked for two individuals I know. I will share them once again using the same source. #PrincipleONE — Start small: Repeat a tiny habit daily Do you think there’s a difference between a routine and a schedule? I think so. A routine is habitual and flexible whilst a schedule is rigid with little room for manoeuvre. What we’ll learn here is about routine. When starting to focuse on building good habits we always ask too much of ourselves. You make plans to read and complete a certain amount but end up hardly ever reading. This drags on up till you find a situation where you have too much to catch up on. This I can relate to. Let’s say you want to memorise the Qur’ān in a year! So you make a plan for a suitable routine and you get started. You’re likely doing two pages a day at least but what about the days you get real busy? What about the days you fall ill? What about the days you can’t memorise at all? You begin to fall behind on the master plan. You now change the plan again and again. This continues up till a point where you are no longer interested. This is a problem. It can take many people up to a decade going through this with no progress whatsoever. You end up failing a lot, and each failure makes it harder to succeed the next day. There is a way out. At their heart, as James Clear explains, habits are about routines. You need small wins and visible progress to help create new routines that can keep you at it every day. The idea of starting small. Focus on repeating the habit every day, but do not worry about how effective the habit is. For example: You want to memorise two pages of the Qur’ān every night, but you’ve never done any memorisation for years. If you take up memorising large chunks out of the blue and expect to do so every night, you might not last too long. It’s a big ask for most. When I was memorising, I went through three phases. In the first two phases, I gradually built on the amount I memorised and only reached three pages or more in the final phase. Starting small is so effective. Take the tiniest part of the habit you can work with — in this case, it would be to memorise just one verse. It’s still considered memorisation, but yes you won’t make huge leaps. Far too often people complain to me that they wish to memorise fast with large chunks. They dismiss the idea of starting small. Sometimes people learn the hard way but I recommend everyone begins with a litmus test. Here’s where it gets powerful: at first, you focus on just memorising one verse every night. And you stick with it for more than a week. Then, more than two. Then three, four weeks. You can stick with this habit because it’s so easy. There’s barely any effort involved with memorising one verse. 
Unless it's one of the large verses, which you're unlikely to be starting out with. So it's hard to make an excuse not to do it. And once it's become easy and automatic to memorise one verse, you start memorising two. For a while, you memorise two verses every night. Then, you increase to three. And slowly you work your way up, never taking such a big leap that it becomes a chore. By starting small, you focus on making the behaviour automatic, before you worry about making the behaviour big enough that it produces a useful outcome. As Scott H. Young says, we tend to overestimate how much we can get done — especially when we're stepping into the unknown. Scott suggests planning as if you can only commit 20% of the time and energy you'd like to, to be more realistic. Example: One page a night Start by reading just one page of the Qur'ān every night before bed. If all you can manage is one page, you would count that as a win. Start off with just recitation and translation. No memorisation. Later, when the habit is strong, put on a timer and begin to memorise for 15 minutes. Eventually, begin memorising for 30 minutes before bed and another 30 minutes most mornings. Note, you are not memorising a set amount here. You memorise what you can within the time frame. This way you are not trying to cram things in. Focus on a page at a time but memorise a verse or two. Just starting with a couple of verses or one page begins to build up. Year one you could be memorising between 1–3 verses. Year two, a page. Year three, two pages. The idea is that you work on the habit, thinking about how much you need to do to count a small win. The focus is small, daily efforts. Do you remember the scenario above? The individual that set him or herself an ambitious daily target and kept failing. Daily habits can develop into big wins. During the same time that an individual struggles with no progress, you can make small steps towards the goal and end up doing more. There are others who stick to small amounts and remain there without incremental increases. Memorising three verses a day every day without fail, with no concern of when they'll finish. But they still end up achieving the target before those trying to do it in chunks. Example: One lesson every morning Many people begin studying Arabic, but aren't good at sticking with it. It happens with memorising the Qur'ān too. Often people get overwhelmed by the Arabic grammar. Others get scared of the amount to process and memorise with all the different forms and tables. You can start by building a habit of doing just one lesson every morning. Utilise free apps like Memrise and AnkiDroid. One lesson can take around five minutes, so it's a tiny commitment, and quite easy to do when sitting, lying down or drinking. Eventually you'll start doing more than one lesson — two, three, sometimes even four or five, if you are enjoying it. Never because you want to rush through it though (you need to keep things natural). You can do as many as you like, but always do at least one to check off that habit for the day. This makes it easy to stick to, even when you don't feel like doing any more than that. Then you can take on a course to brush up on grammatical rules and structures, and finish several books on Memrise. #PrincipleTWO — Focus on one habit at a time One of the hardest things when it comes to building new habits is to not take on too many at once. 
We always have grand plans for the things we want to get better at, and so much enthusiasm when first starting out, that we want to build several habits at once. Every time you try this approach, you end up failing. Sometimes a few of the habits stick, but sometimes none of them do. It's just too much to focus on at once — a bit like multitasking, where your brain has to switch contexts, because you can't focus on many things at once. So a rule is to work on just one habit at a time, until that habit is so automatic that you can do it every day easily. Then, and only then, do you start on a new habit. With the examples above: reading every night before starting to focus on a new language, and doing one lesson easily every day before starting to focus on getting up early, etc. Sometimes building a habit can take a long time. Getting up early can be a struggle to do consistently. You can spend months focused on the same habit: trying different approaches, tracking progress, and reporting to friends/mentors/teachers who help keep you accountable. When you want to make it a consistent habit, that means not building any other habits for months. How long it takes you to build a habit will vary. We often hear the idea that it takes 21 or 40 days to build a habit, but studies have shown we all take different lengths of time to build new habits. In one study, the average time it took to build a new habit was 66 days — about two months. This said, others may differ. You may want to work on two habits at the same time (but at different times during the day), which may also work. The lessons: Treat each habit differently, depending on how hard you find it to stick to consistently. Focus on just one habit at a time so it gets your full attention and energy. #PrincipleTHREE — Remove barriers: Have everything you need at hand People find it much easier to complete these habits when the equipment needed is at hand. For instance, having your phone in hand already while drinking makes it easier to build a habit of doing a quick Arabic lesson at that time. Reading a page of the Qur'ān every night becomes a lot easier when you keep a copy by your bed. Malcolm Gladwell calls this the tipping point. It's that small change that tips you over from making excuses to taking action. The tipping point is that tiny change that makes it easy enough to take action that you'll actually follow through. It's about removing any barriers that make it easy not to follow through on your habits. This always happens. We get lazy; if the Mus'haf is on a high shelf away from you, you'd likely not touch it often. If it were next to you, in your face — you'd be likely to follow through with reading. I've been trying to build a habit of writing more often. I wrote "The Promise of Ten: How an ordinary person can memorise the Qur'an in 6 months" recently. But this was never in the plan; amidst running various organisations, I've tried to make time for writing "How We Memorised" and "25 Methods". Right now I write whenever the mood strikes me, or when I get time, which isn't enough. I notice that I tend to write more often when there are fewer distractions and clear accessibility. Right now I'm mostly on my laptop, but I'll always end up multi-tasking. The main desktop, where I could easily sit down and write a little without the distractions of my laptop, is often occupied by my brother. Another habit I want to focus on this year is exercising more. This has always been a challenge for me. 
There’s always a excuse but if I put on my sports/exercise gear, it’s pretty much certain that I’ll go outside for badminton. If not for other activity but until those clothes are on it’s a lot easier to think of excuses for not going out. #PrincipleFOUR — Stack habits: Build new routines onto existing ones Stacking onto existing habits builds up several habits into a routine. Each habit acts as a trigger for the next one. The cool part about this is you already have lots of habits you probably don’t realise. Brushing your teeth before bed, checking your phone before getting out of bed, getting out of bed in the morning at the same time every day — these are all existing habits. So long as you do something at the same time every day without thinking about it, it’s a habit you can stack others onto. If you do your new habit after completing an existing one, you can rely on the strength of your existing habit to help keep your new habit on track. For example, when you get out of bed, the first thing you might do is go to the toilet and brush your teeth. When you do that, you can start building a habit of making wudu’. The existing habit of brushing the teech acts as a trigger to completing wudu’. When you go to bed at night, open the Qur’ān sitting by your bed. Interestingly, I had the same set-up whilst I was in Cairo. Getting into bed and seeing the book acts as a trigger to do your nightly reading. Research has shown a cue to work on your new habit may be the most effective way to ensure you stick to the habit long-term. When you stack habits, you use the existing ones as cues for each new habit you want to build. Over time you can keep stacking new habits onto your existing ones to take advantage of automatic behaviours you’re already doing. Hifz/Hifdh of the Qur’ān is not a now task, but it’s a gradual task. Allah loves the small and consistent, perhaps that’s why there are blessings. Takeaway: How does this apply to Ramadān? The question earlier was how can we maintain things out of Ramadān. The takeaway is that it was not designed to make you perfect but to give you a path. Take one or two things out of the month and develop them as a tiny habit up till next Ramadān. May Allāh grant us such ability. Remember me in your prayers. Sukran. Qāri Mubashir Anwar
https://medium.com/how-to-memorise-the-quran/the-surprisingly-simple-way-you-can-make-all-year-ramadan-619621a039d
['Qāri Mubashir Anwar']
2016-07-15 10:26:46.187000+00:00
['Islam', 'Habit Building', 'Productivity']
Mother in Heaven
Mother in Heaven How Mormonism Makes Sense of the Feminine Divine © "HER." Oil on wood. 18"x14". 2017. by Eliza Crofts https://elizacrofts.com/post/168270817912/her-oil-on-wood-18x14-2017 The Church of Jesus Christ of Latter-day Saints (Mormon Church) has many doctrines and practices that make the faith unique. One of these doctrines is the belief in an embodied Heavenly Father who is himself a divine or "exalted" man. Joseph Smith, the faith's founder, said in a sermon, "If men do not comprehend the character of God, they do not comprehend themselves", and Parley P. Pratt, an early apostle in the church, wrote, "God, angels, and men are all of one species, one race, one great family". This belief in God as literal father of the spirits of human beings is the backdrop that informs all other practices in the church. The church emphasizes family relationships and conducts sacred "ordinances" or rituals that "seal" or bind families together for eternity. The sealing of a man to a woman as husband and wife is considered the highest ordinance and privilege in the church and is considered necessary to receive eternal exaltation. In an address to the general church membership in 2008, then apostle, now prophet and president of the church, Russell M. Nelson said, "No man in this Church can obtain the highest degree of celestial glory without a worthy woman who is sealed to him. This temple ordinance enables eventual exaltation for both of them." The doctrine of eternal sealing was not new when he taught it in 2008, and early Mormon thinkers and leaders had grappled with the implications of this, including one of Joseph Smith's wives, Eliza R. Snow, who penned a poem titled "Invocation, or the Eternal Father and Mother", which is now sung in the church as the hymn "O My Father" and reads in part, In the heav'ns are parents single? No, the thought makes reason stare! Truth is reason; truth eternal Tells me I've a mother there. While the idea of a literal Father God who is of the same species, race, and family as his earthly offspring is certainly unique to Latter-day Saint theology, the doctrine that a literal Mother God lives and loves her earthly children is even further from the beliefs of mainstream Christianity. While this belief in a feminine divine who rules alongside the masculine divine initially sounds progressive, although heteronormative, the church doesn't do much to flesh out the doctrine of a Mother in Heaven. Beyond the occasional mention of "Heavenly Parents" by a current church authority or declaration, or a handful of lines in a single hymn, the church's belief in a Mother God appears nowhere in Church policy, practice, or doctrinal treatise. A handful of scholars within the church have worked to research and compile church teachings about Heavenly Mother. The foremost of them, Rachel Hunt Steenblik, conducted research full-time for the church-sponsored Brigham Young University, and her research was utilized by the church as the largest contributor to an essay published in October of 2015 on the church's website titled "Mother in Heaven". The essay notes, "Latter-day Saints direct their worship to Heavenly Father, in the name of Christ, and do not pray to Heavenly Mother". This shows, as does the subtle wording of the above statement from Russell M. Nelson, a clear distinction in the way the church views men and women, which includes divine men and women. To repeat, "No man in this Church can obtain the highest degree of celestial glory without a worthy woman who is sealed to him. 
This temple ordinance enables eventual exaltation for both of them” (Emphasis added). The church’s teaching of eternal Heavenly Parents does not create a new egalitarian theology of the divine, but instead maintains a complementarian, patriarchal theology in which maleness presides over femaleness. It is here where I interject with my own experience as a member of the church. I love the faith that I belong to because it taught me to have an intense and personal relationship with deity. The church’s doctrine, to borrow from Joseph Smith, “tastes good.” Some of the unique doctrinal points are problematic, like the church’s teachings on polygamy, but other doctrines provide a framework that makes me feel connected to my fellow human beings and fill me with a desire to do and be better, which is why I am saddened and frustrated when the church chooses to ignore or minimize the singular and enriching doctrines that could create, through their very existence, a more inclusive and equal society within the church in favor of bland, unoriginal, patriarchal norms that exist in the more mainstream western religions. The church’s complementarian view that men and women are equal but serve different roles often emphasizes providing, presiding, and protecting for men, and household and child-bearing duties for women. Repeated teachings from church leaders in recent years have doubled-down on this view by insisting that motherhood for women is their “highest and holiest calling”. They teach that, whenever possible, mothers should stay at home full-time to be active in the raising of their children, but they then teach that our Mother in Heaven is not to have any contact with her children through prayer. I, like others in the church, have chosen to seek a more personal relationship with my Mother in Heaven, and I do not believe that she or our Father in Heaven wish for her to remain silent and absent from the lives of her children, so I pray to them both because I believe that they understand that neither of them is better than the other. I do these things because I do not believe that the 1950’s gender roles that the church is pushing are correct or eternal, and they are certainly not original or unique.
https://feministmasculinity.medium.com/mother-in-heaven-34ec95c0edc3
['Alan Parley Buys']
2019-07-05 02:05:16.884000+00:00
['Mormon', 'Religion', 'Spirituality', 'Feminism', 'Equality']
Explain Like I’m 5: Big O Notation
Explain Like I'm 5: Big O Notation Making a sometimes intimidating concept simple Photo by Markus Spiske on Unsplash. Hi there. This is the first in a series of articles in which I will be covering computer science concepts in simplified, easy-to-understand terms. From my own personal experience, despite the enormous amount of resources out there, it can be difficult as a beginner to learn new programming concepts. This is because many sources often explain them using overly technical jargon. So my goal here is to break things down into analogies/everyday examples as best as I can in order to serve any beginners out there who might be struggling to understand these core concepts. Let's jump into our first topic: Big O Notation. Consider an example with me: You are volunteering at a big conference and you're responsible for giving each attendee their name tag when they show up. You are given a pile of unordered name cards. How will you go about this task? The first and terribly inefficient way to complete this task is that for every person who shows up, you look through every single card in the pile until you find their name card. The second way, perhaps the one you just thought of, is organizing the cards into sub-piles based on the attendees' last names. This might take up a bit more space on your table, but then all you would need is the attendee's last name and you could retrieve their card much faster. In this example, you can see that there are many different ways to solve a problem. Two things we consider are how fast a solution is and how much room a solution takes in order to determine if it is the "best" option. In measuring how "fast" something is, we can literally time the exact number of seconds or minutes, but this may not be the best way to go. In the first solution, where you are going through every single card in the pile to find a name, you might get lucky and find it immediately, whereas other times you might have to go through almost the whole stack to find it. So measuring the actual time to determine the solution's effectiveness is not the way to go. Instead, we can explain effectiveness using Big O notation. In short, Big O answers this question: As the size of the input increases, how does an algorithmic solution perform in terms of time and space? In the example above, what if the conference now takes in an additional 1,000 people? How would your first solution compare with the second? Hint: It'd be even worse! The following are some common time complexity cases and real-world examples to help you think about them. In these examples, n represents the size of the input. Note: You can find many examples of analyzing complexities of actual code on the web, so I won't go over them here. My favorite lesson is from Alvin from Coderbyte.
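If you'd still like to see the name-tag analogy as actual code, here is a minimal Python sketch. The attendee data and function names are made up for illustration (and it assumes unique last names for simplicity); the point is only to contrast the two approaches: scanning the whole pile is O(n) per lookup, while building sub-piles keyed by last name (a dictionary) makes each lookup roughly O(1) on average, at the cost of a one-time O(n) setup and O(n) extra space.

```python
# A toy version of the name-tag problem from the analogy above.

def find_tag_linear(pile, last_name):
    """Check every card until we find the matching name: O(n) per lookup."""
    for card in pile:
        if card["last_name"] == last_name:
            return card
    return None

def build_index(pile):
    """Organize cards into 'sub-piles' keyed by last name: O(n) once, O(n) extra space."""
    index = {}
    for card in pile:
        index[card["last_name"]] = card  # assumes unique last names for simplicity
    return index

def find_tag_indexed(index, last_name):
    """Grab the card straight from its sub-pile: O(1) on average."""
    return index.get(last_name)

if __name__ == "__main__":
    pile = [
        {"last_name": "Nguyen", "first_name": "An"},
        {"last_name": "Okafor", "first_name": "Chidi"},
        {"last_name": "Smith", "first_name": "Jordan"},
    ]

    print(find_tag_linear(pile, "Smith"))    # may scan all n cards
    index = build_index(pile)                # one-time O(n) setup
    print(find_tag_indexed(index, "Smith"))  # constant-time lookup on average
```

If the conference takes in 1,000 more people, the linear scan's worst case grows right along with the pile, while the indexed lookups stay flat — which is exactly the "as the input grows" question Big O is asking.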
https://medium.com/better-programming/explain-like-im-5-big-o-notation-d16b53f03a4c
[]
2020-11-17 16:34:37.983000+00:00
['Algorithms', 'Coding', 'Software Engineering', 'Computer Science', 'Programming']
The Absurd Reason ‘Covid-19 Is a Lie’ Ended Up Trending on Twitter
The Absurd Reason ‘Covid-19 Is a Lie’ Ended Up Trending on Twitter It wasn’t because people believed it People gathered at Huntington Beach, CA, to protest coronavirus closures on Friday, April 17, 2020. Photo: Jeff Gritchen/MediaNews Group/Orange County Register/Getty Images On April 17, a protester in Huntington Beach, California, was photographed holding a white sign that said, in ultra-bold capital letters, “COVID-19 IS A LIE.” The protester was clad from head to toe in personal protective gear. The absurd dissonance between the protester’s words and actions made the photograph a hit on Twitter, where critics of the Trump-backed “liberate” protests circulated it as an example of the movement’s hypocrisy. “Can’t make this stuff up,” tweeted progressive political YouTuber Bryan Tyler Cohen, drawing more than 20,000 favorites. “Okay I’m done,” another viral tweet deadpanned. “Conservatism in 2020: Whatever is smart, we’re against it,” snarked another. People had such a field day sharing the image that the phrase “Covid-19 is a lie” began to climb the ranks of Twitter’s trending topics, cracking the top five in California. Now the phrase “Covid-19 is a lie,” sans context, was pinned to the Twitter homepage for millions of users. Naturally, many assumed that claim was trending because people were tweeting it in earnest. They added their own exasperated tweets to the pile, driving it further up the charts. It’s the latest example of a phenomenon that threatens to undermine Twitter’s efforts to contain an outbreak of coronavirus misinformation, while making the anti-science movement seem more influential than it really is. To address it, Twitter would have to invest more in a curation team that often seems overmatched by the algorithms it’s trying to rein in. Twitter, like Facebook and other social networks, has taken a hard line (by social media standards) against coronavirus misinformation, implementing more aggressive policies on the grounds that falsehoods about a pandemic can be matters of life and death. And yet here Twitter’s own software posted a blatant and dangerous falsehood for all to see. Anyone who clicked into the topic could glean that it was mostly making fun of that falsehood, but the popularity of tweets railing against it makes clear that many users did not. This is not the first time Twitter’s trending topics have played this role. In the 2020 U.S. Democratic primary, divisive hashtags such as #NeverWarren trended nationally, fueled in large part by tweets expressing outrage at the hashtag. The more popular it became, the more people saw that it was trending and expressed condemnation, inadvertently stoking the trend. The trending algorithm’s “circular logic,” as Whitson Gordon put it in OneZero last year, can have real consequences. Bernie Sanders’ campaign was dogged by criticism of his online supporters’ boorish behavior, and while some of that was real, the backlash-driven trending topics made it seem more widespread. In turn, some media outlets treated the trending topics as newsworthy in themselves, bringing the narrative to wider swaths of the public. On April 18, the phrase “fire Fauci” trended for similar reasons, and the hashtag #FireFauci has trended separately on two other occasions. Some of the initial “fire Fauci” tweets were from right-wingers calling for the top doctor’s ouster from the Trump administration. But once that ball got rolling, it gained momentum from Fauci defenders who were furious about it. 
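Twitter has never published its trending algorithm, but as a purely hypothetical sketch of that circular logic, imagine a scorer that ranks a phrase on volume alone. Everything below is invented for illustration; the only point is that a volume-based counter cannot tell an endorsement from a condemnation, so outraged pushback feeds the same tally.

```python
# Hypothetical, simplified trend scorer -- NOT Twitter's real algorithm.
# It counts how many recent tweets mention a phrase, regardless of stance.

def trend_score(recent_tweets, phrase):
    """Count mentions of the phrase in a window of recent tweets."""
    phrase = phrase.lower()
    return sum(phrase in tweet["text"].lower() for tweet in recent_tweets)

recent_tweets = [
    {"text": "COVID-19 IS A LIE", "stance": "endorses"},
    {"text": "Guy in full PPE holding a 'Covid-19 is a lie' sign... you can't make this up", "stance": "mocks"},
    {"text": "Saying Covid-19 is a lie endangers lives", "stance": "condemns"},
]

print(trend_score(recent_tweets, "covid-19 is a lie"))  # -> 3
# The two critical tweets raise the score exactly as much as the sincere one;
# the counter has no idea that most of the volume is pushback.
```

A real system weighs velocity, novelty, and much more, but the blind spot sketched here is the same one the counter-speech keeps tripping.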
In the most recent instance, three of the top four tweets containing the phrase were criticizing it, and two of them were arguing that it was tantamount to endangering lives. The irony, of course, is that they were more responsible for its trending than most of the people who actually believed it. Some researchers have labeled this phenomenon "accidental amplification." But that makes it sound like a careless mistake by users, as opposed to a vicious cycle that's essentially built into the platform. Renee DiResta, research manager at the Stanford Internet Observatory and an expert on misinformation, sought a better way of describing it. "It isn't quite rage-trending or meta-trending," she mused. "It's like some weird autoimmune response, where the response far exceeds the need to respond." When it happens, DiResta said, it opens a marketing opportunity for charlatans and opportunists who can reach a wider audience by glomming onto the trend. For instance, the anti-vaxx doctor Shiva Ayyadurai — who is also notorious for claiming to have invented email — has repeatedly used the #FireFauci hashtag to boost his own profile among the anti-science crowd. And the top tweet in the most recent "fire Fauci" trend came from Rashid Buttar, a doctor in North Carolina who pushes chelation therapy as a cure for autism and cancer. Those doctors already had their own sizable Twitter audiences, but a top tweet in a trending topic can bring a windfall of new followers. And the topics create the perfect Venn diagram for someone like Ayyadurai to loop anti-Fauci sentiment into the broader anti-vaccine movement. Twitter says it's aware that trending topics can be fueled inadvertently by counter-speech and is working on ways to better signal that to users when it happens. "We're working to add more context to trends so the full story/situation is easier to see up front," spokesperson Liz Kelley tweeted in response to a tweet from New York Times reporter Mike Isaac on Friday. Twitter clarified to OneZero that it already adds context to some trending topics. When the trend revolves around a particular link, that link will appear directly beneath the topic's name, inviting users to click through to the original story. Other times, Twitter will create a Moment — a curated series of tweets on a given theme — to go with a trending topic, and that Moment will appear under the topic name. A spokesperson acknowledged that didn't happen in the case of the "Covid-19 is a lie" trend. At root, the problem is a familiar one for Twitter and other internet platforms: The humans they employ to curate and moderate are overwhelmed by the speed and scale of their software and user base. One way to fix that is to hire more humans. Another is to constrain the software. In 2016, Facebook faced a firestorm over charges that its human editors were applying a political slant to the stories they selected for that social network's "trending" module. It responded by firing the journalists who had been doing that work and relying on software to surface trends without any added context. The results were glaringly bad and at times laughably inscrutable; eventually Facebook got rid of its trending section altogether. That path may have been tenable for a company focused on connecting friends and family. Ditching trending topics seems less likely for Twitter, whose core purpose is to keep up with current events and memes.
So it makes sense that Twitter would focus instead on improving it. As is often the case with Twitter, it just doesn’t seem to be moving very fast. The company declined to say how many people are on its curation team, but it seems safe to say it isn’t nearly enough to manually contextualize every trending topic in every country, or even all the problematic ones. A more realistic solution in the short term might be to prioritize the most sensitive types of trends, such as those pertaining to elections or public health, over those that involve sports or entertainment. And in the medium term, don’t be surprised if Twitter finds more ways to automate at least a semblance of context, such as by surfacing related topics or showing which accounts are driving the trend. There might also be hybrid solutions that allow humans to add context more quickly, such as finding and pinning one explanatory tweet to a topic rather than building a whole Moment around it. Adding context does seem like the right approach, DiResta said. If people could see at a glance that “Covid-19 is a lie” was trending as an example of a protester’s stupidity, they’d be spared a spike in blood pressure and the compulsion to tweet their own outrage at the trend. At the same time, opportunists might be deterred from trying to capitalize on the topic by associating themselves with it. And no one would be misled into thinking that “Covid-19 is a lie” is a more popular sentiment than it really is.
https://onezero.medium.com/the-absurd-reason-covid-19-is-a-lie-ended-up-trending-on-twitter-3943b978287c
['Will Oremus']
2020-04-22 05:31:00.888000+00:00
['Twitter', 'Social Media', 'Tech', 'Misinformation', 'Coronavirus']
How Schools Adjusted to Life Under COVID-19
As parents, teachers, and students learned the hard way this year, not many schools had a plan in place to guide them through an extended school closure. When COVID-19 emptied classrooms from coast to coast, it was — as one school official in Maine said —“literally like building a new educational system overnight.” That’s the main story, but not the whole story, from a RAND survey of school principals in the immediate aftermath of those COVID-19 closures. Some districts, in fact, had thought through what an extended closure would look like, and how they would keep students engaged and learning at home. Principals in those districts were much more likely to say they expect no drop-off in student performance. “Schools need to get serious about getting these plans and their technology infrastructure into place,” said Heather Schwartz, a senior policy researcher at RAND who focuses on pre-K to 12 educational systems. “During a disruption, what are you going to do? It could be a fire, it could be flooding, a hurricane, it could be influenza or some other type of pandemic. You can’t just go buy 1,000 iPads and think you’re done.” She and other researchers at RAND had raised concerns in recent years about whether schools were prepared for the disruption of a long closure. When hurricanes swept through the Houston area in 2017, for example, they tracked more than 1,000 schools that had to close for ten or more days. Few offered any kind of distance learning. Then came the total school lockouts of COVID-19 spring. Within the span of a few weeks in March and April 2020, nearly every school in America had to figure out how to make distance learning work. Some handed out thick packets of homework for students to do on their own. Others handed out laptops and mobile hotspots. To better understand how schools were adjusting to life under COVID-19, researchers turned to a unique panel of experts, the RAND American School Leader Panel. The panel provides a pool of school principals who participate in periodic surveys on school issues; a separate but similar panel covers teachers. Researchers were able to ask nearly 1,000 principals about what they were experiencing on the ground. School Preparedness Levels Before the COVID-19 Pandemic Other important preparedness indicators — such as the percentage of students who have home internet access — exist in addition to the five preparedness indicators. In the survey, for example, only 55 percent of principals said that 90 percent or more of their students had access to the internet at home during the pandemic. Data on how many of these students had access to the internet at home before the pandemic began was not collected. Nearly two-thirds of the principals said their schools had provided laptops or other computer devices prior to COVID-19, at least to students in need. Yet fewer than half said they had trained teachers to deliver online instruction or were offering fully online or blended courses. Only one in five said they had made plans before the pandemic to deliver instruction during a prolonged school closure. “That’s the number that I find most concerning,” said Melissa Diliberti, an assistant policy researcher at RAND, a Ph.D. student at the Pardee RAND Graduate School, and the lead author of the report. “It really shows that few schools had this concept of a prolonged closure in mind as something that could realistically happen. 
I hope that number is going to be a lot higher now that schools have had to live through this.” Only one in five principals said they had made plans before the pandemic to deliver instruction during a prolonged school closure. The more prepared a school was, the researchers found, the more likely it was to continue giving letter grades during the pandemic. The most prepared schools, in fact, were 20 percentage points more likely than the least prepared schools to stick with letter grades. And principals in schools that had a closure plan in place were less likely to say they expected a lower level of student achievement this fall than they saw last fall. Large schools were more likely than small schools to have taken some preparedness steps pre-pandemic, such as training teachers to deliver online instruction. Middle and high schools were better prepared than elementary schools. And schools with high numbers of students receiving free or reduced-price lunch, a marker for poverty, were just as prepared, or unprepared, as schools with lower numbers. The results point to a striking need for better planning at schools — and to the difference that could make in the lives of millions of students. “COVID is going to leave a lasting impression on everyone who’s alive right now,” Diliberti said. “We’re all going to have a sense that this might not be the last time that something like this could happen. Schools and districts are going to have to respond with plans to address long closures — not if but when they happen again in the future.” Most principals in RAND’s survey didn’t seem to argue with that. Fully 85 percent of them said one of their top priorities when schools reopen will be planning for the next big closure. — Doug Irving Philanthropy in Action Funding for this research was provided by gifts from RAND supporters and income from operations.
https://medium.com/rand-corporation/how-schools-adjusted-to-life-under-covid-19-14af78559627
['Rand Corporation']
2020-12-07 19:12:41.890000+00:00
['Education', 'Distance Learning', 'Emergency Preparedness', 'Coronavirus', 'Covid 19']
How to Eat the Best Fruit (and Not the Bad Fruit)
After having a mild stroke each time my boyfriend brought home the wrong type of fruit, I created an index card for him to take to the shops. (His solution was for me to go instead — be the sole fruit winner of the family — but this is not realistic.) And so, the extensive guide below was born. Yellow stone fruits are literally summer, if summer swooped itself into fruit form, and so you must pick them always from a fruit line-up and smoosh them into your face. 2. Pick the yellow peach that looks like a sunset with its red, orange, and pink coat skin, peel it off with your teeth. Sink them into unripened sunflower-coloured flesh. Wait for the crack — because yes, they’re better hard! Soft is messy and only permitted when nothing harder exists or when you have forgotten them at the bottom of a fridge drawer. 3. Always opt for the unripened yellow nectarine, the one that is red with flecks of yellow, it’s the Spanish flag. It should be as hard as the ball on a pool table, knock ’em out. 4. DO NOT BUY THE WHITE PEACH OR THE WHITE NECTARINE EQUIVALENT, FOR THE LOVE OF GOD. These literally taste like nothing. You could open your mouth like a goldfish and the air would taste the same as a bland white peach. Do not eat nothing. Do not waste this moment. Why is it white? Who segregated this fruit? I will never eat it. 5. The mango is perfect in that it is always yellow and if it’s not, I don’t want to hear about it. The mango’s only flaw, and it’s a minor one, is the effort it sometimes takes to undress the mango, carve it up in a way that makes sense, and find its way to the mouth. This is an easy thing to rectify with a knife or with teeth, a plate or napkin and always, always when it is ripe enough that the juice smothers every inch of you, the couch, the floor around you, the utensils, the kitchen, the person next to you, the baby, the whole house. It is sticky and sometimes tastes chemical, a sharp, crisp pang, like fresh linen after it’s been washed by an aloe vera fabric softener. This one is for the thrill seekers, the wanderlust among us, a mirage in the desert. 6. Grapes of wrath are always green. Green grapes are war crimes. They are almost always soft and mushy, like a dead arm after it has been punched, limp and lifeless, squishy. Green grapes are just the uncooked eggs of white wine and should not be wasted on a fruit platter where it has no business being. 7. Light red and firm seedless grapes after they’ve been in the fridge so long they’re crisp and crunchy. My oh my what beauty lay in thine purple hue? Red grapes can get it. The first of their name, breaker of chains and stems, Queen of the Andals and the first grapes. Crisp, hard red grapes deserve their place in the canon of being fed to rich people while sitting on a throne and being fanned by a giant feather. So light and airy, so cool on the tongue. 8. Do not eat big soft round purple grapes with seeds. Do not do that to yourself. You are a baby who can’t eat seeds. When you bite into the grape, a liquid should not come out, this is not porn. (Number 7 is, however, porn. Please keep up.) 9. HOW DO WE PLUM? HOW, YOU ASK? PLUMS. SO COMPLEX. SO MANY OPTIONS. NEGATORY. THERE IS ONLY ONE OPTION AND IT IS THE UNRIPENED BLOOD PLUM. The Samsuma plum. LISTEN TO ME. IT IS LIGHT PURPLE WITH TINY GREEN SPECKS LIKE THE FLICKERS OF GRASS sprayed with a burst of water ON A MISTY MORNING. IT MUST BE FIRM. YOU BITE INTO IT. THIS IS LIVING. SWEET BLOOD RED COLOUR REVEALED. AM I BITING INTO MYSELF? MAYBE. 
OH THE TART SOUR OF THIS PLUM HAS RUINED ME. 10. The violet hued purple plum that is yellow on the inside is two plums sitting in a trenchcoat pretending to be a blood plum, DO NOT BE FOOLED BY THEM. 11. I need to go back to the blood plum, I’m not done yet. The dark glossy red wine round fruit, sweet with only a flicker of tarty sour, so hard to chew all the flesh off from the seed. When we were kids, we used to wait for the plums to go from light green to red on the plum tree. My brother would pick them while they were still light green and sour as all hell and take so much glee in torturing himself and others with the extreme sour. It was a test of endurance, a sign of better things to come. 12. Figs are highway robbery. At like, $2 each for a fig, it has become a luxurious decadence in Australia, unlike in Greece where they sell whole bags for two euros that you can eat on autopilot while watching the sunset on an idyllic island. In Australia they’re mostly used in salads, but I’ll still sneak one after dinner, ripe or unripened, it makes no difference to me. Figs are ancient, ripe and fleshy, primal and female. My cousin’s wife’s family sells big boxes of them from their home in Glenorie. You probably know the ones. They’re famous. 13. My dad once said you had to eat watermelon before any meal so it could slide through your body, making way for your food, like a bright pink-red carpet. Watermelon should be crisp and sweet, seedless and firm, not soggy. Watermelon is heavy, the only downside to an otherwise perfect fruit specimen. Watermelon is the meaning of life. I would eat watermelon in Barcelona the same way I would eat watermelon during summer in Coffs Harbour, outside on the grass, face to the sun, sticky juice dribbling down my chin. 14. Big, round, more reddish than black and firm cherries. I love cherries, their little stems, their little seeds, the way it’s so easy to plop one in your mouth and spit it back out again, the circle of life. Piles and piles of cherries. Nestled on a pavlova. Pitted against one another. Plucked. They’re good for you in concentrated juice form too, but few people know this. 15. I prefer the small, seedless mandarins which are easy to peel and marketed towards kids (the Frozen-branded mandarins are particularly infuriating to my boyfriend). The skin is thin and comes off easily. Peeling each piece off is like a small victory, so tiny on your mouth. I always eat two at once. 16. Oranges — they’re all good. What do you want from me? It’s an orange? I prefer them in juice. Tangy. Fresh. 17. Bananas: unripened, green. I used to eat them in Kellogg’s K like a maniac, always ripe. “You hate cereal and milk,” my mum would say. “Yeah but I love them with bananas for some inexplicable reason?” Now I eat them all the time after a friend said eating one banana once a day would cure you of any malady and honestly, she’s never been wrong, they are healing and magical. 18. Ahh, strawberries. Big and firm and unblemished. I particularly love the ones that have, like, horns at the end, three horns, you know what I mean. Covered in dark chocolate that sets over them, mummifying them. What makes that combination so good? It’s like crack, you bite in and it cracks open, I mean. 19. Raspberries have to be firm and fresh, tangy and piquant, never soggy, otherwise chuck it in the processor. They’re also ridiculously expensive in Australia. Women love them. I’m okay with this. It’s the illusion of treating yourself. 20. 
Blueberries — the king of the berry — firm and tart, please. So round and bulbous. I love them. You can eat as many as you want and it’s like eating none at all. Punnets on punnets on punnets. My nephew was obsessed with blueberries as a baby. He even went blueberry picking with his parents because he’s adorable, and he had the time of his tiny life. I once shared a punnet with him and, no word of a lie, he grabbed a blueberry from me as I came to place it in my mouth and placed it in his own so fast that I almost didn’t notice. He took “steal from the hand that feeds you” to heart. Is that how the saying goes? 21. Our grandfather (may he rest in peace) used to have a mulberry tree out in a vast, expansive garden that housed all the fruit trees you could think of. But the mulberry tree was wild. When it sprouted fruit, us rambunctious kids dashed through the garden which was like a forest to us, carrying our buckets, climbing over one another to pick as many as we could, careful not to get the juices all over our clothes (we always did). The excitement of spotting a big fat one was palpable. Our feet stained a blood red colour, like the scene of a crime. I’d see mulberries in the grocery shops and get excited almost as a reflex, but it was never the same. They could never be as fat and juicy. They never tasted like a small victory. 22. I didn’t know what blackberries were until my housemate told me they were his favourite of the berries. Skeptical, I tried one. It was good. How it was different to mulberries, I could not tell you. I then started buying them all the time. Only I’d forget to eat them and they were never tasty or satisfying after a few days. It was my first lesson in eating tart fruits right away. I’ll never forget it. 23. Pomegranates. Run these tiny jewels into your mouth, plop them into your salad, add them to your yoghurt, whatever you want! There’s no way to ruin a good Pom. There are so many ways to open and unleash your ruby red grains onto the world. Smash one half with the blade of a knife, then turn it inside out. Pick each jewel one by one from its white fleshy cave. Cut it open and open it in a bowl of water, the white bits floating to the top. Cut it a certain way so it all plops out. Even when it stains your white kitchen bench for all eternity, you ain’t even mad. 24. ALL LYCHEES ARE GOOD except for those watery ones that taste like a wet shoe. The fatter the better. They may be hard to peel but the wait is always worth it. Plop one in your drink. Plop two. They’re like jellyfish for your mouth. Your teeth finally land on that oval-shaped smooth seed and you’re sad that the experience is over. 25. My uncle always chastises me for buying persimmon at the city supermarket, where they’re expensive, like $3 each, when he has whole boxes of them from his garden that he insists I take each time I visit. I’m always shocked when I see persimmons in the supermarket. They appear and I forget all about them each time. My strongest memory is peeling them in my Sita’s kitchen and being shown how to eat them. Like this? Just dig in? Are you sure? You always know when you’ve got a good one but I couldn’t tell you how to pick one out when faced with them at the shops. I think what I’m trying to say is that I don’t think Coles, Woollies, or even Harris Farm can cut it anymore. I have to go home where the good fruit lies on the side of the road as you drive through Dural and Glenorie, boxes and boxes of goodness all lined up, waiting for you.
https://medium.com/s/story/how-to-eat-the-best-fruit-and-not-the-bad-fruit-af73f4b74d07
['Sheree Joseph']
2018-02-17 19:55:01.711000+00:00
['Lists', 'Memoir', 'Fruits', 'Food', 'Writing']
Interactive Neural Network Fun in Excel
Constructing CPPNs in Excel with PyTorch and PyXLL

After reading Making deep neural networks paint to understand how they work by Paras Chopra, I was inspired to delve into some experiments of my own. The images produced were intriguing, and I wanted to play around and get a feel for how they changed in response to changing the structure of the neural network.

I encourage you to go back and read the original article, but as a brief summary it involves using Compositional Pattern Producing Networks (CPPNs) with random weights to produce abstract images. The basic idea is that we construct a function c = f(x, y) that takes the input coordinates of a pixel, x and y, and returns the colour for that pixel, c, using a CPPN.

Image via Generating Abstract Patterns with TensorFlow

As in the original article, we'll use PyTorch to create a CPPN. The network is the result of lots of work by Paras Chopra and will be our starting point (a rough sketch of the network appears below). Running the neural network on a 512x512 input array results in an output image as shown below. The neural network is initialised with random weights, and the input array is scaled such that each input is between +0.5 and -0.5.

Created using a CPPN with random weights

There are a lot of variables to play with that will affect the resulting image: the scaling of the inputs, the number of neurons at each stage, and the structure of the neural network itself. We could put this code in a Jupyter notebook and change the inputs there, but each change we make is a change to the code itself and somehow doesn't quite feel like a real interactive experience.

Microsoft Excel might seem like an odd choice for an interactive playground for this type of task initially. Bear with me though… What we want to do is enter some information, like the number of weights and how many layers, and have a new image presented to us. In essence, Excel simply runs functions when inputs change — and those functions can do anything, even show us an image in Excel! If we could have a worksheet in Excel with the parameters we want to change, and have the image update as soon as we make a change, that would make a great playground for getting a feel for how the inputs affect the output.

Using Python in Excel

Neural networks in Excel might sound like a hard task, but we're not talking about implementing them in VBA! In fact, we won't need to go anywhere near VBA as we can continue to use Python instead. PyXLL is an Excel add-in that embeds the Python runtime into Microsoft Excel. It allows us to write Excel functions entirely in Python, and so we can still use PyTorch for our neural network, but all within Excel.

Having access to all the Python tools really opens up what is possible with Excel. Rather than having complex logic encoded in VBA, software written in Python is simply exposed to Excel. Excel becomes our front-end user interface tool, with Python taking care of the complex computation. PyXLL is the highest performing and easiest way to write Excel functions entirely in Python, which is perfect for complex workloads. You can download a free 30 day trial from https://www.pyxll.com/download.html.

Building the Neural Network in Excel

Our goal is to be able to construct our neural network in Excel and have complete control over the inputs. We'll start by writing some Python functions and exposing them to Excel.
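Before the Excel wiring, here is a rough sketch of the kind of CPPN described above. The gist embedded in the original article is not part of this text, so the layer count, layer sizes, output channels, and the helper that evaluates the network over a pixel grid are all assumptions rather than the author's exact code:

```python
import torch
import torch.nn as nn
import numpy as np

def build_cppn(num_layers=3, num_neurons=16, input_size=2, output_size=1):
    """Build a simple CPPN: stacked Linear+Tanh layers with a Sigmoid output.

    The layer count, sizes and single output channel are illustrative guesses,
    not the exact network from the original article.
    """
    layers = [nn.Linear(input_size, num_neurons), nn.Tanh()]
    for _ in range(num_layers - 1):
        layers += [nn.Linear(num_neurons, num_neurons), nn.Tanh()]
    # Sigmoid squashes the output into [0, 1] so it can be used as a pixel value.
    layers += [nn.Linear(num_neurons, output_size), nn.Sigmoid()]
    return nn.Sequential(*layers)

def render(net, size=512, scale=1.0):
    """Evaluate the network at every (x, y) pixel coordinate, scaled to roughly +/-0.5."""
    coords = np.indices((size, size)).reshape(2, -1).T / size - 0.5
    with torch.no_grad():
        pixels = net(torch.tensor(coords * scale, dtype=torch.float32))
    return (pixels.numpy().reshape(size, size, -1) * 255).astype(np.uint8)

net = build_cppn()
image = render(net)  # 512x512 array of pixel values in 0..255
```

Because the weights are randomly initialised, every call to build_cppn produces a different abstract image from the same inputs, which is exactly the behaviour we want to explore interactively from Excel.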
Roughly, the functions we'll need are:

- Create the layers (nn.Linear, nn.Tanh and nn.Sigmoid)
- Create a neural network from a set of layers (nn.Sequential)
- Run the neural network on a set of inputs and show the output

At this stage you might be wondering how we are going to represent some of these items in Excel. PyXLL lets us pass Python objects between Excel functions in a workbook, so having a function that returns a Python instance of nn.Linear, or another Python function that takes a list of transformation layers, is actually very simple. When a Python object is returned to Excel, what we see in Excel is just a handle to the Python object, and when that handle is passed to another Python function PyXLL fetches the object for that handle automatically. PyXLL also manages the life-cycle of these objects, so when they are no longer needed they will be cleaned up automatically.

To expose a Python function to Excel we use the @xl_func decorator from the pyxll module. The first functions we need are the ones that create the layers, exposing PyTorch functions to Excel (a sketch of what they might look like appears a little further below). The @xl_func decorator is all that's needed to expose those functions to Excel! The nn_Linear function has type annotations which PyXLL uses to ensure the correct types are passed from Excel to the function; otherwise the numbers passed from Excel might come through as floats.

All that's needed is to add this module to the PyXLL config file, pyxll.cfg. For example, if your code was written to a Python module named "pytorch_abstract_art.py" in the folder "C:\MyCode\PyTorch-Abstract-Art\Modules", you would update your pyxll.cfg file with the following settings:

[PYTHON]
pythonpath = C:\MyCode\PyTorch-Abstract-Art\Modules

[PYXLL]
modules = pytorch_abstract_art

We can now call these functions from Excel to construct all the layers required for the neural network. All of the inputs are entered in Excel, and we can even change the number and order of the layers interactively.

To construct the neural network we just need another function that takes these layers and returns the neural network using nn.Sequential. PyXLL can accept arrays of arguments as well as just single values, so we can pass the whole set of layers as a single argument. The @xl_func decorator takes an optional function signature to tell PyXLL more about the argument and return types to expect. To tell PyXLL to convert the range of cells passed to it into a 1d list of objects we use the object[] type. If we wanted to pass a range as a 2d list of lists of objects instead, we could use the object[][] type. When flattening from a 2d range of cells to a 1d list of values, PyXLL takes cells from left to right, and so the ordering of our layers in the above image will result in the correct arrangement.

We can now add that new function, nn_Sequential, to our Excel worksheet and pass in the layers created earlier.

Creating the Output Image

The only things remaining now are to create an input set, initialise the weights, and show the resulting image. To do that we'll use Excel's Object Model to manipulate an image in Excel. Excel's Object Model is exactly what you may have used if you've ever written any VBA, but you may not have realised that it can be called from Python just as easily as from VBA! The guide Python as a VBA Replacement explains how to call Excel from Python, and also covers some of the differences between VBA's syntax and Python.
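The layer-creation functions and nn_Sequential were shown as embedded code in the original article. Based on the description above (type annotations on nn_Linear, and an object[] signature so nn_Sequential can accept a range of layer handles), a minimal sketch might look like the following; the exact parameter names and defaults are assumptions:

```python
from pyxll import xl_func
import torch.nn as nn

# These mirror the functions described above; names and defaults are
# assumptions based on the article's description, not the original gist.

@xl_func
def nn_Linear(in_features: int, out_features: int, bias: bool = True):
    """Create a PyTorch Linear layer and return it to Excel as an object handle."""
    return nn.Linear(in_features, out_features, bias)

@xl_func
def nn_Tanh():
    """Create a Tanh activation layer."""
    return nn.Tanh()

@xl_func
def nn_Sigmoid():
    """Create a Sigmoid activation layer."""
    return nn.Sigmoid()

@xl_func("object[] layers: object")
def nn_Sequential(layers):
    """Combine a range of layer objects (flattened to a 1d list by PyXLL) into a network."""
    return nn.Sequential(*layers)
```

In the worksheet, each =nn_Linear(...) or =nn_Tanh() cell returns an object handle, and a single =nn_Sequential(range) cell combines them into the network handle used in the next step.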
The final function takes:

- The neural network object
- The name of the image object in Excel
- Scale and offset parameters for the inputs
- A random seed

To add the image that will show our result, in Excel click Developer → Insert → Image (ActiveX Control). To the left of the formula bar you will see the name of the image object. This will default to Image1, but you can edit this to any name you like.

The final function gets the image from Excel, constructs the inputs, computes the outputs and updates the image (a rough sketch of such a function appears at the end of this article). With this final function complete we can now pull it all together. All of the inputs for constructing the neural network are exposed on the sheet, as are the parameters controlling how the input data is created. We can add or remove layers, edit the number of features at each stage and even switch the activation functions. What's really fun about this is that everything happens in real time. As soon as we make a change, the image updates.

A couple of functions were left out of the article's listings to make them easier to read. You can find them, along with all of the code, in the GitHub repo https://github.com/pyxll/pyxll-examples in the pytorch folder.

Final Thoughts

Playing around with neural networks in Excel in this way has been really interesting and fun. More importantly than that though, it shows how even complex Python tasks can be integrated into Excel seamlessly. Excel's strength is as a user-interface tool and, in my opinion, it should not be used to store data, algorithms or business logic that can be better managed elsewhere. Anyone who's used Excel for a while, or has worked with a team that depends on Excel, knows the pain of VBA macros that can't be tested, version controlled or even kept consistent between different users' workbooks. By moving the code outside of Excel you not only get the huge benefits of a modern language like Python, but you can also apply modern development practices to the code behind your spreadsheets.

While not many people will want to actually construct and train neural networks in Excel, there is still value in being able to access neural networks from Excel. A large part of what makes machine learning useful is the ability to use it to assist decision making. If the decision makers are already using Excel, we can make it easy for them by exposing trained and tested machine learning solutions to them in Excel. This can be a much improved workflow over them using Excel to do some basic calculations, then breaking out to a custom tool or web app (or even worse, emailing someone in the data science group a spreadsheet!), and then putting those results back into Excel — which they will want to do, no matter how good you think your custom tool is ;)

You can download the code and the example workbook from the GitHub repo https://github.com/pyxll/pyxll-examples in the pytorch folder. You will also need PyXLL, which you can download with a 30 day free trial from https://www.pyxll.com/download.html.
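For completeness, here is a rough sketch of the kind of "final function" described in the Creating the Output Image section. The real implementation lives in the pyxll-examples repo linked above; this version is only an approximation, and it swaps the ActiveX Image control for a simpler route of replacing a named Picture shape via Shapes.AddPicture, so the function name, signature and Excel calls here are assumptions rather than the author's code:

```python
from pyxll import xl_func, xl_app
from PIL import Image
import numpy as np
import torch
import tempfile, os

@xl_func("object net, string shape_name, float scale, float offset, int seed: string")
def nn_ShowImage(net, shape_name, scale=1.0, offset=0.0, seed=0):
    """Re-initialise the network's weights, run it over a pixel grid and show the result in Excel.

    Simplified sketch: instead of updating an ActiveX Image control as the
    original article does, it replaces a named Picture shape on the active sheet.
    """
    size = 512

    # Re-seed and re-initialise the weights so each seed gives a reproducible image.
    torch.manual_seed(seed)
    def init_weights(m):
        if isinstance(m, nn.Linear):
            torch.nn.init.normal_(m.weight)
            if m.bias is not None:
                torch.nn.init.normal_(m.bias)
    import torch.nn as nn
    net.apply(init_weights)

    # Pixel coordinates scaled to roughly +/-0.5, then adjusted by scale and offset.
    coords = np.indices((size, size)).reshape(2, -1).T / size - 0.5
    inputs = torch.tensor(coords * scale + offset, dtype=torch.float32)
    with torch.no_grad():
        pixels = net(inputs).numpy().reshape(size, size, -1)

    # Save to a temporary file that Excel can load from disk.
    img = Image.fromarray((pixels.squeeze() * 255).astype(np.uint8))
    path = os.path.join(tempfile.gettempdir(), "cppn_output.png")
    img.save(path)

    sheet = xl_app().ActiveSheet
    try:
        sheet.Shapes(shape_name).Delete()   # remove the previous image, if any
    except Exception:
        pass
    pic = sheet.Shapes.AddPicture(path, False, True, 100, 100, size, size)
    pic.Name = shape_name
    return "[Updated %s]" % shape_name
```

Called from a cell as =nn_ShowImage(network_handle, "CPPNImage", scale, offset, seed), it re-runs whenever any of its inputs change, which is what makes the worksheet feel interactive.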
https://towardsdatascience.com/interactive-neural-network-fun-in-excel-8bad2b3a5d9d
['Tony Roberts']
2019-12-02 14:48:17.365000+00:00
['Machine Learning', 'Python', 'Neural Networks', 'Excel', 'Office']
Different Types of Visualizations
Different Types of Visualizations

People learn and digest information differently. In addition to tables and (typical) bar charts, there are all kinds of ways to spice up your data. Here are some alternative visualizations to try:

Pie/Circular Chart. It's simple but very powerful. The data is shown in a more shapely and informal kind of way. Additionally, the succinct layout makes it quite easy to digest the information.

Flipped Bar Chart. Take your bar chart and flip it on its side. There is a lot less chart but, again, the information is much more succinct and friendly.

Without overloading your users, give them a few options. If you are going to go with only one form of visualization, survey some users and find out whether they have a preference. Data is an awesome and powerful tool. Like anything else though, don't let it get too complicated. Keep it simple.

— — — — — —

Humanizing Data Part I

Humanizing Data Part II
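The article doesn't tie these chart types to any particular tool; purely as an illustration, here is how the two alternatives might be produced with matplotlib in Python (the data and labels below are made up):

```python
import matplotlib.pyplot as plt

labels = ["North", "South", "East", "West"]   # hypothetical example data
values = [35, 25, 22, 18]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

# Pie/circular chart: the same totals shown as slices of a whole.
ax1.pie(values, labels=labels, autopct="%1.0f%%", startangle=90)
ax1.set_title("Pie chart")

# Flipped (horizontal) bar chart: barh() is the bar chart turned on its side.
ax2.barh(labels, values)
ax2.set_title("Flipped bar chart")

plt.tight_layout()
plt.show()
```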
https://medium.com/the-data-experience/humanizing-data-part-iii-3d1833508848
['Mayer Seidman']
2019-06-11 13:39:05.464000+00:00
['UX', 'Design', 'Data Visualization', 'Data']