A "bus" is short for omnibus. Omnibus is Latin meaning, "for all". Which is exactly what a bus is -- a way for ALL the devices to talk to each other. Think of it as public transportation for your computer chips or peripherals.
There is no such thing as a smart home: there are a lot of devices you can interconnect that make your home more interactive. But Google, Apple, and Amazon are playing it from different directions.
Apple is coming at it by leveraging their entertainment and home angle -- but with low feature integration, the worst voice assistant, and thus support for fewer devices.
Google is attacking it like an open-source tech problem -- while acquiring their own device makers (Nest) and fielding one of the best voice assistants.
Amazon is attacking it like a commerce company with a shopping focus -- while acquiring their own device makers (Ring).
Many don't consider Microsoft's Cortana and related solutions even worth tracking. Microsoft hasn't figured out this space any better than their mobile phone story -- so while they have a competent agent and some integration technology, few people seem to care, as the phone and device angle is being won by the other companies.
TL;DR: Google and Amazon have a home-security story. Apple is falling behind. Microsoft who?
During the '80s and '90s there was a computer-chip design war between RISC and CISC. What do those terms mean, and which is better? For a while, Intel (and AMD) were able to spend more on design and delay the inevitable, but once mobility (with performance per watt) became important, ARM and its RISC designs took over from x86's older CISC design.
Buyers can get befuddled by all the terms and jargon when they're shopping for a printer, but it really isn't hard. Only a few basic technologies are popular in printers right now: laser printers, ink-jet printers, and impact printers.
I work with both Google and Apple, and so I get to use both their hardware -- and I'm an opinionated agnostic: I just want things that work well (for me). So the other day I got a Pixelbook at an incredible discount, which Google had someone deliver to my house (same day, with 30 minutes of setup help): an incredible customer experience. As a long-time techie, I'm a bit of a power user, and while I've only had the Pixelbook for a little while (versus iOS for 10 years), my quick assessment is that the Pixelbook is better than an iPad but worse than a MacBook, with a ton of caveats. The iPad rules in app selection and single-app workflows, as well as consistency with iOS and integration with Apple's other devices and ecosystem. Since I have an iPhone and MacBook Pro, I'll still use the iPad more often. But as soon as you want to work in multi-app workflows, use the keyboard/trackpad, browse, or use it as a laptop replacement, the Pixelbook dominates. But if you really want to get work done -- then my MacBook Pro is still my go-to device.
Multiprocessing is the concept of using more than one processor (at the same time) to help you accomplish tasks. We've sort of capped out (or the rate of improvement has dramatically slowed) for single-processor performance, but the idea of getting computers to do more than one thing at once is still growing exponentially. So let's look at what this means.
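The idea can be sketched in a few lines of Python, whose standard library happens to share the name: split a job into pieces and hand them to a pool of worker processes, each of which can run on its own CPU core. The pool size and the work (squaring numbers) are just illustrative.

```python
# A minimal sketch of multiprocessing: several worker processes
# share one job. The pool size (4) and the task are illustrative.
from multiprocessing import Pool

def square(n):
    # Each worker process runs this independently, on its own core.
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # The pool divides the list among the workers in parallel.
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The `if __name__ == "__main__":` guard matters here: it keeps the worker processes from re-running the pool-creation code when they start up.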
Temporary memory is often called RAM (one kind of temporary memory). A computer's memory works a little like a person's memory. We have two types of memory, short-term and long-term; in computerese these are called temporary and permanent storage (memory).
Computers have a few types of memory: temporary (short-term) and permanent (long-term). RAM is a type of short-term memory, where a computer remembers the stuff it is working on at that moment; but if the computer loses power, it forgets everything in short-term memory. To help with that, we have permanent storage (memory). Long-term (permanent) memory is for things we want to remember for long periods of time, even if we lose power. In computers, we often do this by "saving" (writing) chunks of memory (files, pages, or programs) to some device that provides our permanent storage. Later we can "read" (get or open) those files/programs from this long-term memory. Mobile devices usually save constantly, hiding the file-saving step from the user.
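The save-then-read-back cycle above is easy to see in a few lines of Python: a value held in RAM (a variable) is written to a file on permanent storage, then read back. The note text and filename are just examples.

```python
# A minimal sketch of "saving": data in RAM (a variable) is written
# to permanent storage (a file), then read back later. The filename
# and note text are examples.
import os
import tempfile

note = "Remember me after the power goes out."   # lives in RAM

path = os.path.join(tempfile.gettempdir(), "note.txt")
with open(path, "w") as f:       # "save" (write) to long-term storage
    f.write(note)

with open(path) as f:            # "read" (open) it back later
    recovered = f.read()

print(recovered == note)  # True -- the file survives a reboot; the variable wouldn't
```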
MHz or GHz: what does it mean? Many people ask questions about megahertz (MHz) or gigahertz (GHz). Worse, some people don't ask, and assume they know what those terms mean. It isn't hard to understand (it's just clock speed), but it isn't as easy as some assume either. Many people get it wrong and assume that MHz (or GHz) is how much work a computer can get done: but it's more like a car's RPMs than its MPH -- it can mean the engine is working harder, but not necessarily that you're going faster.
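The RPM-versus-MPH point reduces to simple arithmetic: throughput is roughly clock speed times the work done per clock tick (what chip designers call instructions per cycle, or IPC). The chips and numbers below are hypothetical, just to show how a lower-GHz chip can win.

```python
# A back-of-the-envelope sketch: throughput is clock speed (GHz)
# times work per tick (instructions per cycle, IPC). The two chips
# and their numbers are hypothetical.
def throughput(ghz, ipc):
    # billions of instructions per second
    return ghz * ipc

chip_a = throughput(3.0, 1.0)  # high clock, little work per tick
chip_b = throughput(2.0, 2.0)  # lower clock, more work per tick

print(chip_a, chip_b)  # 3.0 4.0 -- the "slower" (lower-GHz) chip gets more done
```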
The iPod is a little device that is much bigger than people realize. For a complete technology nerd, I'm fairly socially aware and can actually interact with humans as well as hardware. But when something new and cool comes out, my tech-aholic tendencies come bubbling to the surface. The other day I succumbed to these urges and bought myself a new toy: an iPod. And it has been revolutionary in its simplicity.
What is digital? Why not just use analog? A lot of people hear and use the term "digital" but do not really know what it means. I also occasionally get the question, "why is digital better than analog?" In humans, digital means digits (fingers); in clocks it means the clock shows the actual digits (numbers) rather than a hand that points to the numbers. In computers, digital is another way of saying "binary". That means there are really only two states: on or off, power signal high or low, the value zero or one -- those all mean the same thing. Analog, by contrast, has many values between zero and one. So why are fewer choices better? That takes some explaining.
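A taste of the explanation, sketched in Python with made-up noise values: because a digital receiver only has to answer "high or low?", a slightly noisy signal can be snapped back to exactly 0 or 1, and the noise vanishes instead of accumulating the way it would in an analog signal.

```python
# A sketch of why two states beat many: a noisy digital signal can be
# "snapped" back to exactly 0 or 1 by thresholding. The noise values
# below are made up.
sent = [1, 0, 1, 1, 0]

# After travelling down a wire, each level picks up a little noise.
received = [0.93, 0.08, 1.04, 0.88, -0.06]

# A digital receiver only has to decide "high or low?"
recovered = [1 if v > 0.5 else 0 for v in received]

print(recovered == sent)  # True -- the noise is gone
```

An analog receiver would have to treat 0.93 as a real value, so every stage of copying or transmission would add its errors to the last.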
Forget what everyone else tells you about buying a computer -- I'm here to give you good advice. Here are some hints on how to buy a computer: when should you move up the line (higher performance) or down the line (lower cost), and why? Where are the best values? Where should you spend your money? These are all just my opinions on what makes sense, but they're pretty objective ones.
Bits of bits... how many bits should my computer be, and why should I care? If 16 is good, then 32 must be twice as good. And if 32 is good, then 64 has to be great. The truths of engineering aren't that clear. If you need 64 bits, it's great... but much of the time, you need much more, or less.
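One reason "twice the bits" is misleading is that the effect isn't linear at all -- going from n bits to 2n bits squares the range, it doesn't double it. A quick Python sketch:

```python
# What bit width actually buys you: the count of distinct values an
# n-bit unsigned integer can hold. Doubling the bits squares the
# range rather than doubling it.
def value_count(bits):
    return 2 ** bits

for bits in (16, 32, 64):
    print(f"{bits}-bit: {value_count(bits):,} distinct values")
# 16-bit: 65,536
# 32-bit: 4,294,967,296
# 64-bit: 18,446,744,073,709,551,616
```

So 32 bits isn't twice as roomy as 16 -- it's 65,536 times as roomy. Whether that matters depends entirely on whether your data needs the room.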
There's a lot of whining and complaining about the new MacBook Pros (MBP2016). Some of it valid, much of it overstated but heartfelt. But I think the problem was more about messaging than delivery. For me, and for most users, it's a great product.
A better iPad. It's better in every way -- bigger, better, faster... and more expensive. OK, better in all ways but price. My big iPhone (7 Plus) meant I was using my iPad less -- but give me a bigger screen and a keyboard that works, and I find the iPad fills a niche for me as a great travel/note-taking and entertainment device when it's not worth bringing out my laptop. And with an app, it works as a second screen for my laptop when I do real work.