Its death has been predicted many times. In fact, if there is one constant in the history of COBOL, it may be predictions of the programming language’s death.
COBOL was created in 1959 by industry and government programmers, but even then its future was uncertain.
“In less than a year there were rumors all over the industry that COBOL was dying,” said Grace Hopper, the rear admiral and programmer who helped design the language, in a 1981 lecture.
Google Play …. You’re not helping my social distancing!
I looked and I saw that Google Play Store has used 855 MB of mobile data in the background this past month. I mean, like, what the duck? There is no need for you to be constantly upgrading apps in the background. It’s not like I’m still getting on the internet over Wi-Fi at the library or Five Rivers Environmental Education from time to time. You really don’t need to use all my mobile internet. I have 25 gigs a month, but that’s still a big chunk of it. I’ve blocked Google Play Store from background internet going forward. Obnoxious.
Luke makes some good points about computers getting faster -- for users, they generally aren't, because coders are getting sloppier and we are demanding more out of them all of the time. I know Luke, and like him I reject the fancy window managers that are popular, but even if you don't use the resource-intensive window managers, software in general, from web browsers to video players, is becoming more and more data intensive. Solid-state drives are a bright spot in speed, but technology just gets heavier and heavier, even as the raw speed gets faster.
We’re not prepared for the end of Moore’s Law
It has fueled prosperity of the last 50 years. But the end is now in sight.
By David Rotman, Feb 24, 2020
Gordon Moore’s 1965 forecast that the number of components on an integrated circuit would double every year until it reached an astonishing 65,000 by 1975 is the greatest technological prediction of the last half-century. When it proved correct in 1975, he revised what has become known as Moore’s Law to a doubling of transistors on a chip every two years.
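The arithmetic behind that forecast is easy to check: doubling every year for a decade is a factor of 2^10, roughly a thousand-fold. Here is a minimal sketch of the extrapolation, assuming a 1965 starting point of about 64 components (my assumption for illustration, roughly where Moore's own chart began):

```python
# Rough sanity check of Moore's 1965 extrapolation: one doubling per year.
# The 1965 starting point of 64 components is an assumed, illustrative value.
start_year, start_components = 1965, 64

for year in range(start_year, 1976):
    components = start_components * 2 ** (year - start_year)
    print(year, components)

# The loop ends with 1975 -> 65536, i.e. the "astonishing 65,000" figure.
```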
Since then, his prediction has defined the trajectory of technology and, in many ways, of progress itself.
Moore’s argument was an economic one. Integrated circuits, with multiple transistors and other electronic devices interconnected with aluminum metal lines on a tiny square of silicon wafer, had been invented a few years earlier by Robert Noyce at Fairchild Semiconductor. Moore, the company’s R&D director, realized, as he wrote in 1965, that with these new integrated circuits, “the cost per component is nearly inversely proportional to the number of components.” It was a beautiful bargain—in theory, the more transistors you added, the cheaper each one got. Moore also saw that there was plenty of room for engineering advances to increase the number of transistors you could affordably and reliably put on a chip.
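To make that bargain concrete: if the cost of fabricating a chip is roughly fixed no matter how many components you put on it, each added component drives the per-component cost down. A toy sketch with invented numbers, just to show the proportionality Moore described:

```python
# Toy illustration of Moore's economic argument (all numbers invented):
# with a roughly fixed cost per chip, the cost per component is
# "nearly inversely proportional to the number of components."
CHIP_COST = 10.0  # hypothetical fixed fabrication cost per chip, in dollars

for n_components in (1_000, 2_000, 4_000, 8_000):
    cost_per_component = CHIP_COST / n_components
    print(f"{n_components:>6} components -> ${cost_per_component:.4f} each")

# Doubling the component count halves the cost of each component -- until
# yield and engineering limits push back, which is why the room Moore saw
# for engineering advances mattered so much.
```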
Before ubiquitous caller ID or even *69 (which allowed you to call back the last person who’d called you), if you didn’t get to the phone in time, that was that. You’d have to wait until they called back. And what if the person calling had something really important to tell you or ask you? Missing a phone call was awful. Hurry!
Not picking up the phone would be like someone knocking at your door and you standing behind it not answering. It was, at the very least, rude, and quite possibly sneaky or creepy or something. Besides, as the phone rang, there were always so many questions, so many things to sort out. Who was it? What did they want? Was it for … me?