@Thenomain said:
The stereotypical young person sees the stereotypical old person as unwilling to change because new things are scary. The stereotypical old person sees the stereotypical young person as ignorant, with no regard for things that are the way they are for a reason.
They're both right.
Keeping things the way they are is correct because we understand how it works.
Ignoring the way things are is correct because it's the only way to understand how new things work.
I have this amusing conversation all the time in my circles (embedded systems). Newcomers to the field, especially those who have come to it from other areas of coding (like the wasteful pigs that make web apps), are utterly shocked when they find out that the most common processors in embedded space (by far) are four-bit processors.
Yes, even in a world that has 32-bit MCUs like the STM32F030F4P6, a TSSOP20 chip that contains within it a full-blown 32-bit ARM Cortex-M0 core (for $0.33 retail singly), engineers will reach for a 4-bit chip more often. Why?
Because they're easy to understand.
It's that simple. When you're building hardware that has to run for months to years on end without stopping, without failing, you want something that's simple and predictable, not something so big and complicated that no single human being can know all of its intricacies. You can't just plug-and-pray in this realm. And 4-bit processors are so dirt-simple that most interested electrical engineers have probably made one for fun from macro-scale TTL parts.
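To make "dirt-simple" concrete, here's a toy C model of that kind of 4-bit datapath: one accumulator, one carry flag, and a handful of made-up opcodes. None of this is any real part's instruction set; it's just a sketch of how little state there is to reason about, which is exactly why you can build the real thing out of TTL:

```c
/* Toy model of a 4-bit datapath: one accumulator, one carry flag.
 * Hypothetical opcodes, not any real part's ISA. */
#include <stdio.h>
#include <stdint.h>

enum op { OP_LDI, OP_ADD, OP_AND, OP_XOR }; /* made-up opcodes */

typedef struct {
    uint8_t acc;   /* 4-bit accumulator, kept in the low nibble */
    uint8_t carry; /* single carry flag */
} cpu4;

static void step(cpu4 *c, enum op op, uint8_t operand)
{
    operand &= 0x0F; /* everything is a nibble */
    switch (op) {
    case OP_LDI: c->acc = operand; break;
    case OP_ADD: {
        uint8_t sum = c->acc + operand + c->carry;
        c->carry = (sum >> 4) & 1; /* bit 4 is the carry out */
        c->acc = sum & 0x0F;
        break;
    }
    case OP_AND: c->acc &= operand; break;
    case OP_XOR: c->acc ^= operand; break;
    }
}

int main(void)
{
    cpu4 c = {0, 0};
    step(&c, OP_LDI, 0x9);
    step(&c, OP_ADD, 0x9); /* 9 + 9 = 18 -> acc 2, carry 1 */
    printf("acc=%X carry=%u\n", (unsigned)c.acc, (unsigned)c.carry);
    return 0;
}
```

The entire architectural state fits in five bits. You can hold every possible behavior of that machine in your head at once, which is the whole point.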
8-bit microcontrollers are next in line. There's about one of those in use for every three or four 4-bit ones in a system somewhere.
32-bit MCUs are used only in the biggest, most complicated embedded packages, and even then they're used mainly for very high-level control. There's probably one simpler 32-bit MCU (Cortex-M0 or equivalent) at the core of a device for every ten 8-bit ones out there. Nobody will use, say, an STM32F7-line chip in a heart monitor, because there's absolutely no way you'd be able to guarantee that it will function in all possible circumstances. (I'm not sure it's even possible for a single person to know every piece of the STM32F7 chips.) You might be able to find someone who knows all of a Cortex-M0 core (like the ones in the STM32F0 line) if they've dedicated a rather enormous chunk of their brain to it. (The M0 core itself has about 12,000 transistors, which is amazingly simple and small for a 32-bit processor, but the peripherals will kill you.)
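For a sense of what "knowing the whole chip" looks like at the 32-bit low end, here's a minimal bare-metal sketch for an STM32F0-class part: blink a pin by writing the peripheral registers directly, no vendor HAL. (Register addresses are from the STM32F0 reference manual as I remember them, and the startup code/vector table/linker script are omitted; verify everything against the manual for your exact part before trusting it.)

```c
/* Minimal bare-metal blink sketch for an STM32F0-class MCU.
 * Direct register access; startup code and vector table omitted.
 * Addresses per the STM32F0 reference manual -- double-check them. */
#include <stdint.h>

#define RCC_AHBENR  (*(volatile uint32_t *)0x40021014u) /* peripheral clock gates */
#define GPIOA_MODER (*(volatile uint32_t *)0x48000000u) /* pin mode register */
#define GPIOA_ODR   (*(volatile uint32_t *)0x48000014u) /* output data register */

int main(void)
{
    RCC_AHBENR |= (1u << 17);              /* IOPAEN: enable GPIOA clock */
    GPIOA_MODER = (GPIOA_MODER & ~(3u << (4 * 2)))
                | (1u << (4 * 2));         /* PA4 -> general-purpose output */

    for (;;) {
        GPIOA_ODR ^= (1u << 4);            /* toggle PA4 */
        for (volatile uint32_t i = 0; i < 100000; i++)
            ;                              /* crude busy-wait delay */
    }
}
```

Three registers, one loop, and you can account for every instruction the thing will ever execute. Try doing that with a multi-megabyte HAL and an RTOS in between.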
Sometimes the old ways are, in fact, the best. Not because they're the best in functionality, but because they're simple enough that we can actually comprehend how they'll work.
I think anyone who clings to either one of these ideas is misguided, because we need both order and chaos, tradition and innovation. Whether or not people realize it, tradition is the foundation from which innovation grows.
And sometimes innovation needs to grow elsewhere. I laugh, long and hard, at the "innovation" of the so-called "Internet of Things" that's bringing large, bloated, horrendously complicated and ill-defined software stacks (like Node.js) into the embedded space. And I'm getting the popcorn ready for when the disasters start to strike.