That said, since you're here and reading this nonsense, I'm going to go out on a limb and assume that you're not one of the brain-dead crab-pot postulators of such pap as "there's no such thing as wide variance in human ability at a broad spectrum of tasks, all competence can be traced to genetic predisposition to competence at the task in question", but are in fact a smart cookie willing to work your ass off for meagre dopaminergic rewards in the short term, and have a passing interest and some amount of learned skills that make you, as the numbskull linked above might say, "good at computers". Perhaps someone even told you that you might eke great piles of
loot food credits from Capital's demand for more Labor to turn its ever-multiplying cranks of complexity. None of that will be enough: you'll need tenacity, a solid dose of ability to pretend that some things are true while pretending some other completely contradictory things are also true, a healthy disregard for public opinion (if you want to preserve your sanity), and a historical bent so that you may at least work to build the leg up on the rest of the market that is grokking the historical reasons behind why Android sucks so fucking bad. Moreover, you'll need practical reasons to soldier through the nonsense, like a fearless leader or the promise of untold wealth. Pure fun or "joy of it" will get you precisely nowhere, when it wears off and the slog proper begins.
Building "experiences" (barf)5 in the browser is an unmitigated disaster. To really understand how miserably fucked up building "apps" (as the public call anything with buttons on a screen at which they can paw with their grease-coated sausages6) in the browser is, you must understand where browsers came from and how they evolved into the shitshow you haul around in your pocket every day. In the beginning, there was plain text. Then, some people structured that text, wrapping paragraphs in <p> (known as "markup") to indicate that they should be styled as paragraphs, using other markup for eg lists, and so on and so forth7.
So take a step back and think about what your browser is trying to do under the hood: take some text, marked up with various tags, apply some visual rules to it with CSS, and then execute COMPLETELY ARBITRARY CODE to oh you know maybe rearrange the ordering of lists, or replace your cursor with a spinning dick blowing loads whenever it draws over the character V or T, or oh I know pre-validate that you put a credit card number into that form input so that we can save ourselves a round trip to our servers from the user's browser. That's totally a good reason to shoehorn an impossibly bad programming language into the browser, mhm.
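The credit-card example above is worth seeing in the flesh. A minimal sketch of that "save ourselves a round trip" pre-validation is the Luhn checksum, which every real card number satisfies; the function name here is my own invention, and note that passing the check proves only that the digits aren't a typo, not that the card exists:

```javascript
// Luhn checksum: the classic client-side sanity check for card numbers.
// Catching a fat-fingered digit here spares the server a request.
function luhnValid(cardNumber) {
  const digits = cardNumber.replace(/\D/g, ""); // strip spaces, dashes
  if (digits.length === 0) return false;
  let sum = 0;
  // Walk right-to-left, doubling every second digit.
  for (let i = 0; i < digits.length; i++) {
    let d = Number(digits[digits.length - 1 - i]);
    if (i % 2 === 1) {
      d *= 2;
      if (d > 9) d -= 9; // same as summing the two digits of d
    }
    sum += d;
  }
  return sum % 10 === 0;
}
```

That's the whole trick, and it's also roughly the ceiling of what client-side validation can honestly tell you.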
Fast-forward a decade. Web sites are passé, and people want at the very least "responsive" websites, and ideally "mobile first experiences" (drink). This means that the website that once needed to render nicely and quickly at 600x800 and maybe a few larger monitors now needs to look good on the 15" MacBook Pro Retina monitor (a monster of pixels, owned by everyone involved in "experience design", that no actual customer owns and yet whose pixel-count drives ~all design considerations in the industry), the Nexus Pucksell with its trapezoidal screen, the iPhone 7XXL with nearly the same number of pixels as the 15" MacBook Pro Retina but that uses an entirely different user interface built around poking at buttons drawn on a screen rather than pointing and clicking with a mouse and typing with a keyboard, and miscellaneous 4-year old Android phones that the User Experience expert in question has around from his last job but doesn't use any more except when he wants to make his devs' lives miserable. On small budget projects.
It gets worse: because everyone involved in web dev was fathered by the kinds of neutered not-thinkers that women in America must settle for and the women aren't smart enough to have the children sired by actual quality sperm that didn't come from their meal ticket out of some perverse adherence to the local traditions of Beer, Monogamy, and Sports, your website won't even feel slow to the average cell phone user until you serve over a dozen megabytes (That's rather a lot of bytes. The last time I checked, the CH homepage clocked in at a trivial sub-30 kilobytes [kilo, mega, giga...you do recall the SI names for orders of magnitude from your elementary education, don't you?]) of uncompressed JS and CSS that you probably aren't even serving to clients anyways. This means that there is zero pressure for people to build lightweight websites anymore, which pretty much guarantees that nobody building websites is even going to think about the repercussions (either performance or security!) of pulling in a library to trim whitespace off the beginning of strings. For example.
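For the record, the "library to trim whitespace off the beginning of strings" that people pull in as a dependency is, in its entirety, about this much code (my own one-liner, and modern JS engines ship it natively as `String.prototype.trimStart` anyway):

```javascript
// The entire hypothetical "library", give or take a README and a
// node_modules folder: strip leading whitespace from a string.
function trimStart(s) {
  return s.replace(/^\s+/, "");
}
```

Every such dependency is one more party who can break your build or ship you malware, in exchange for one line you could have typed.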
So that's "the frontend". A soul-killing hodgepodge of 3 "programming languages" (HTML for the text and its structure, CSS for an approximation of what it should look like, and JS for responding to user input like clicks and taps), executed all together by "The Browser" and differently by each browser. Obviously (or perhaps not obviously, I have no idea how much you know about how your apps work), these things running in the browser have to get their data from somewhere. That somewhere is your "backend services".
I digress. Backend services respond to HTTP requests with data from the database. Sometimes they write data to the database. Sometimes they poke other backend systems. Generally, though, they're "the pure function of the server: glorious, stateless, and without any user-interface cruft", to paraphrase a man who taught me much. Backend systems, eschewing as they do the complexities of managing user interface state, are far simpler systems to build and maintain than clicky-pointy-tappy UIs. Every language used in serious force for web development (some languages are not, believe what you may) features a mega framework in which other web devs have encoded their knowledge and best practices around building web applications. In Python one has Django, and in Ruby one has Rails. PHP has Yii, and I hear that PHP is an entirely adequate(ly) object-oriented programming language these days, so who knows maybe that's a thing the budding webdev might consider using, except for how don't.
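That "pure function of the server" idea can be sketched in a few lines: request in, response out, database injected, no UI state anywhere. The route, the response shape, and the `db.findUser` interface below are all made up for illustration, not any particular framework's API:

```javascript
// A stateless backend handler: a plain function from request to
// response. Everything it needs (the db) is passed in, which also
// makes it trivially testable without spinning up a server.
function handleGetUser(db, request) {
  const match = request.path.match(/^\/users\/(\d+)$/);
  if (!match) return { status: 404, body: { error: "not found" } };
  const user = db.findUser(Number(match[1]));
  if (!user) return { status: 404, body: { error: "no such user" } };
  return { status: 200, body: user };
}
```

Wire that up to an HTTP listener and a real database and you have, modulo a mountain of production concerns, a backend service.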
Finally, there is the wasteland of "native application development". Once upon a time this was just "software development" after the phrase was bastardized to mean "software running on consumer desktops" and not "missile control systems", but the poor notion's been degraded even further to now signify any drawing of buttons on a screen by any old monkey anywhere. It's not like she cares about the degradation, she's just happy to be with someone who can pay for dinner and maybe a kid. "You're a software developer too! Don't listen to what those mean guys on the internet say, I love you and that's all that matters." Not that any of us could afford to educate a kid in this America, but fuck I digress again.
"Native App Dev" is a glorious term for reading through Apple and Google's documentation for how to build list views and swiping image carousels on iOS and Android and then copying code from Stack Overflow that you can sorta-kinda beat into doing what the designer on staff has demanded, all naïve of the actual engineering constraints in play. Building UIs in this fashion is mostly configuration, if you can keep your designers on a tight leash and force them to design things that fit into the (admittedly inane) touch paradigms of the two platforms.
Backend systems are the refuge of those inadequate for the task of building end-user interfaces, and of people who recoil from the notion of building user interfaces (due to aforementioned insanity in technology choices). "Nope", one guy demonstrably capable of building iOS and Android systems told me one time, "I simply will not build mobile applications." One wonders "why" for approximately five minutes, and then realizes that a high-performing backend dev in the SV-driven market pulls down just about as much as your typical high-performing frontend dev -- and without all of the insanity of user interface development (the "technology" choices are bad enough, but have you ever met a self-styled user-experience visionary?). This odd confluence of the derpiest and some of the sharper knives (in the sense that they have a stiff internal resistance to dumb shit, and a nose for finding it) breeds such monstrosities as Rails (grasshopper, explaining the ways in which Rails development wears on my will to live is beyond the scope of this piece, but let it suffice to say that I recently saw the following line of code and recoiled in horror: `authenticates-with-sorcery!`). Most appealingly, you get to work with sane-ish linuxes, and your systems remain stable while the suckers working in the browser and on iOS and Android chase every single operating system release with if not actual slavering excitement then full-bore Stockholm syndrome.
However, despair not. There are other "kinds" of "programming"! You could configure SharePoint websites for any of a million small/medium businesses with SharePoint websites. This will also make you a Microsoft stoolie, and unfit for civilized company. Also you'll be stuck with the same problems. SalesForce idem, although I don't think you have to buy the Microsoft party line to hack on their shit. You could get a plain-Jane analyst job, and then apply your not-insignificant thinker to solving business problems with technology, and nobody would tell you which languages you had to use. Shit, I know options houses with Excel spreadsheets that take an hour to run, and that's after hand-optimization.
Or you could move into the "embedded" space, and compile programs in whatever language to run on tiny devices in the "Internet of Things". Making software to run on smaller and smaller chips is an excellent career bet for the next decade, as corporate focus moves from the "one person, one device, one chip" model to a rat-king of devices infiltrating every aspect of American life.
If you're really ambitious, you could even shoot for an actual auto-didact's "Computer Science" degree. Just keep in mind that if you do this correctly, you'll not "learn programming" nearly as much as you'll learn about theories of computation. Two litmus tests: if you write a single line of code in your first semester, you got scammed; and if you don't know the lambda calculus and its combinators by the end of the third semester, idem.
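Since I brought up the lambda calculus and its combinators: here's a taste, transliterated into JS arrow functions. This is an appetizer, not the third semester:

```javascript
// The classic S, K, I combinators of combinatory logic.
const I = x => x;                     // identity: returns its argument
const K = x => y => x;                // constant: ignores the second argument
const S = f => g => x => f(x)(g(x));  // substitution: applies f(x) to g(x)

// Famously, S K K behaves exactly like I -- identity built from
// nothing but S and K.
const alsoI = S(K)(K);
```

If the fact that `S(K)(K)` is the identity function strikes you as a curiosity worth chasing, the degree might suit you.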
The point that I apparently need at least one more glass of wine to get out is that you'll need to pick a project to "learn programming" in the context of. If you know statistics, pick up a book on what's called "Machine Learning", but is really just applied statistics, and start working through the exercises. If you've already got the hang of physics and the related mathematics, consider writing a simple physics simulator. If you want to see the results of your work, consider building a simulation of agents controlling physical objects in a simulator someone else built, like in the game engine Unity.
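To show just how small the first bite of that "simple physics simulator" can be, here's a sketch: semi-implicit Euler integration of a ball dropped under constant gravity. The function names and the SI constants are my choices, not gospel:

```javascript
const G = -9.81; // gravitational acceleration, m/s^2

// One timestep of semi-implicit Euler: update velocity first,
// then use the NEW velocity to update position.
function step(state, dt) {
  const vy = state.vy + G * dt;
  const y = state.y + vy * dt;
  return { y, vy };
}

// Drop a ball from `height` metres and integrate `steps` timesteps.
function simulateFall(height, dt, steps) {
  let state = { y: height, vy: 0 };
  for (let i = 0; i < steps; i++) state = step(state, dt);
  return state;
}
```

Everything after this (collisions, springs, rigid bodies) is more of the same loop with more state, which is precisely why it's a good learning project: you can check your numbers against the physics you already know.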
If you have utterly no fucking idea where to start, consider buying a Raspberry Pi and beating it into a robotic cat feeder. You might not ever finish the project, but you'll either learn how to learn how to work with the linux shitshow or you'll flunk out of computer-related auto-didactery 101.
The point is that one cannot "learn programming" quite so simply as asking "how do I learn programming". Rather, you must have a place you're trying to get to, whether that's making more money than you otherwise might or solving problems that are just too tedious to solve properly in Excel.
In any case, you really shouldn't "learn programming". The world doesn't need more programmers, and you don't need to wreck your head on the shoals of Von Neumann and his band of merry mongols. If you insist, though, I'll be here. Don't expect me to help, though, this ain't Stack Overflow.
- You either go batshit nuts and move to a cave, hook up with crypto-terrorists like The Bitcoin Foundation, or retreat into the soft embrace of Dunning-Krugerism telling yourself that "this is fine, man", surrounding yourself with other middling schmucks all the while merrily self-deluding yourselves into the kind of complacency that makes crushing a six pack in front of The Game so damned seductive for the typical American male. [↩]
- The joke's on you if you think you're "just going to learn Swift...". There are a few decades of internal Apple API's that'll need rewriting before you don't accidentally absorb some Objective-C on your way to Swift "mastery". The joke's on me if you write your app in Objective-C and then come crying for help. [↩]
- Pop quiz: which of this paragraph's parenthetical appellations did I make up? [↩]
- Another of those hoary old languages that refuse to die because they're just so miserably adequate for the tasks to which humans put them. [↩]
- True story: the extremely large iPhones exist to capture the bariatric market. [↩]
- Now that you mention it, there are. [↩]
- It's so bad that I actually got into the habit of preemptively quitting Xcode because its auto-completion backend would crash and never notify me. The easy and humiliating solution was simply to restart Xcode whenever it failed to complete symbols I was typing. And yes, before you ask, I'd quit Xcode regularly expecting the autocompletion framework to work again on reboot only to discover that I'd mistyped the symbol prelude. Such is the cost of shitty tooling. [↩]