Technology

From iGeek
Revision as of 13:26, 26 February 2019 by Ari (talk | contribs)
Technology.png

I used to be an industry tech writer and blogger.

Due to my current job (and knowing too much about what's going on, and various non-disclosure agreements as well as diplomatic decorum), this is one of the more constrained areas for me. So while I'd like to write more, I'm restricted to very generic topics. Or just share articles from decades ago.



AmazonLogoCube.png
I'm neither advocate nor foe of Amazon. They are a company that's doing their best to adapt to changing market demands. Some things they do, like offering me better selection at lower prices, are great. Other things they do, like censorship or partisan politics, are annoying. A company isn't made up of any one act or person, but the aggregate of all of them. This article is a list of different Amazon-related topics/articles that touch on the Amazon Gestalt. While I have individual opinions on individual acts, I'm perfectly fine leaving it to readers to make up their own minds on those micro-acts, or the macro-organization and personalities.
Apple logo.png
A list of various articles and topics of discussion around Apple. Since they're a secretive company, I tend to avoid opining on a lot of things about them, out of respect for their desire and right to control their own messaging. So I tend to only focus on the trivial for a reason.
Belkin.jpeg
Belkin International is the parent company for Belkin, Linksys and Wemo: American manufacturers of consumer electronics that specialize in connectivity devices, including routers, iPod and iPhone accessories, mobile computing accessories, surge protectors, network switches, hubs, (USB and computer network) cables, KVM switches, racks and enclosures, and other peripherals.
FacebookHypocrisy.jpg
Facebook is 3 things: bad interface, bad management, and biased policies. I want a social network that gives me control of what I see and share -- both to my friends and to advertisers. I realize they need to make a buck, and my information is their product, but the point is you can still give users the illusion of control. But Zuckerberg seems to have fallen into the egocentric pit that many young billionaires do: they think that because they timed things well, worked hard, and got lucky, they're smarter than everyone else. This makes them arrogant, less mature, and slower to grow than the average human: Dunning-Kruger, inflated by being surrounded by yes-men.
GoogleEvil.png
In 1995, two 20-something Ph.D. students from Stanford were looking for something to do their dissertations on, and decided that they should focus on a Web crawler and indexer research. Once they found funding and a revenue stream based on advertising, they became what's known in the Valley as a Unicorn: a multi-billion dollar company. And their saga from College Dormitory Culture to Corporate Cult began. Unfortunately, explosively rapid successes skip normal growth and maturing processes in corporations, and can create cults (or at least cult-like behavior). There's a line between corporate culture and conformity to the corporate line or expulsion, and that line seems to often get crossed at the Googleplex, without any of the normal checks and balances that might apply at a more moderate corporation.
Nextdoor is a social platform that bet that removing anonymity, keeping it local, and better policing/community standards would make for a different type of Social Media platform. While the creators might have been well-intentioned leftist activists, they failed to understand that wokescolds will ruin everything you give them a voice in. So it has all the maturity and depth of a Twitter, wrapped in the false civility and double standards of leftist PC thought-police, moderating everything.
MissCubicle.png
A bunch of startups didn't have money to create usable facilities, and they were often hiring students who didn't know better (about what private space was) and had worked out of coffee shops -- so they created "Open Offices".

Planners who failed at life decided that if Google/Facebook/etc. succeeded in spite of a horrendously distracting working environment, then everyone should suffer -- and Corporate America (especially Tech) started shifting to Open Office floorplans, to the annoyance of tech workers everywhere. This was sold as "more collaborative", but there's no worker with a triple-digit IQ who actually buys that, and there have been multiple studies that bear out the skepticism: workers get more quiet to keep from disturbing others (and hide away in meeting rooms or with headphones to create faux privacy). But the one-size-fits-all is attractive to the small-of-mind, paired up with the financial folks who could increase population density without fixing facilities for parking, loading/unloading or eating. And the results have been killed productivity, increased employee friction, increased illness/sick-time, less face-to-face interaction, and more people working from home or as remotely as they can get away with. This will go down as proof that companies that ignore management fads operate much better than those that follow them.
QuarkXpress-1.0.png

Quark is a company that helped revolutionize desktop publishing. But their name should be a verb for how to fuck up your business. They went from 95% market share in desktop publishing (through the 1980s and 1990s) to 25 percent within a few years after Adobe InDesign was released. And InDesign was released with fewer features, not to mention conversion costs. Why would 3 out of 4 customers pay money and time to convert? The answer is simple: Quark outsourced their development to India, had some of the worst support in the industry, and had the most annoying copy-protection (DRM / Digital Rights Management), which made it expensive/annoying to use/maintain/upgrade their programs -- they basically pissed their customers off so badly that customers would pay more to get less, just to get out from under Quark's thumb.

Slack is an internal messaging/communication tool (1:1 and many:many) that gives users the impression that their communications/channels are private, but the truth is they can be monitored by IT/managers/corporate eyes. In any corporate culture that allows "open discussions", it often enables herd-think and the problems of social media: especially in leftist/snowflake culture.
TwitterInfoWars.jpg
Twitter is an enemy of free speech and tolerance. Examples include them shadow-banning conservatives (and admitting it, on tape), some of their employees getting excited about violating their members' privacy (assuming those members are conservatives/Trump), and how they suppressed anti-Hillary tweets during the election. That's scarily Orwellian. It's still their company and they get to be as dicky as they want to be with it (within the bounds of the law). But I'm going to point out their moral turpitude just so that consumers can make an informed choice -- not as any call to action (legal, governmental or otherwise).
Wink Logo.svg
Wink is an American brand of software and hardware for smart-home devices. In the middle of the COVID pandemic, they decided to shift from a one-time purchase to a subscription-only model, with 7 days notice to customers -- meaning if customers don't pay the extortion, their products become useless. While they were going to piss off users either way, the way they handled it was guaranteed to cause maximum irritation and alienation.
YouTubeLogo.jpg
YouTube (a division of Google) holds an especially abusive place when it comes to the world of selective censorship - which only seems to apply to truths liberals hate to hear.
  • PragerU was suppressed/censored (silently), with no evidence offered that anything they have said is wrong, untrue, or racist. They do expose misleading beliefs of the far left, so apparently that's reason enough to block or punish them.
  • YouTube went on a crusade against guns: first you couldn't sell guns, then promote guns, and so on. They terminated gun-parts channels, like Brownells. They're inventing laws and changing terms that are against the spirit of our constitution.
  • As part of a Project Veritas (James O'Keefe) exposé of the NYT, they caught NYT editor Nick Dudich explaining how he was using friendships and coordination with YouTube (Earnest Pettie) to manipulate social media and intentionally influence the news. YouTube was being a tool of evil, working against a free election.

Every company has a right to decide who they support or not. But the problem is Google/YouTube PRETENDS to be an open platform (and community service). Yet, they're not doing what they advertise. If they openly admitted in their policies that they're a left-of-center advocacy site that will censor center/right positions at will, then at least that would be honest.




WellRunCompany.jpeg
As an oversimplification, balancing the opposing forces of Sales/Marketing, Finance/Operations and Engineering is key to having a well-run Tech Company. Throw in some other difficulties like good communication, good focus, and reducing politics, and things will hum along smoothly. But it's like trying to keep jugglers riding unicycles on a slack-line: while the theory is easy, the continuous shifting makes the real-life implementation hard.
Beta-logo.png
Alpha, Beta, Gamma, Delta... what is a beta version anyways? Many companies talk about Beta versions, or "going Beta". Many non-software people, and even many managers at software companies, don't know what that means. The short version is that they're just letters of the Greek alphabet: Alpha (1st) was generally for in-house testing, Beta (2nd) was for outside testers, and Golden Masters were versions burned onto prototype CDs (that were gold in color) and sent to the plants that would manufacture the CDs/DVDs in volume.
Internet-bubble.gif

People on both sides of the dot-com bubble can be wrong at the same time. The Internet IS changing the global economy, and this IS NOT just a "flash-in-the-pan" fad that's going to be gone tomorrow. But the other side is that investors CAN overreact to hype, get way ahead of returns, and that can cause a big pullback. But in the long term, this is the new normal.

FUD.png

There is a computer term that you hear some geeks and industry insiders use, but many people new to computers don't know, but should. That term is FUD. FUD means "Fear, Uncertainty and Doubt", and it was the tool of large companies to scare users away from small companies' software (or hardware). They'd sow uncertainty so customers would buy from the safest (largest) company, even if it didn't currently have the best software, or scare them into buying the biggest program over features they might someday need (but which only added complexity today).

Girlgeek.png
Why are there so few female geeks? Sexism is a part of life.

Some who don't know me might call me a sexist pig. Not because I think one gender is better than the other; I just think everything in life is about tradeoffs. Genders are not better or worse, but there are differences. I don't just mean different as in input vs. output, or physical differences; I mean that we are fundamentally different in how we behave, what motivates us, how we think, as well as how our environment affects (and changes) us. This helps explain why there are so few Female Geeks.

CSICyber.jpeg
Intrusion and prevention is nothing like the movies. Think months to deliver an attack, to get through layers of defenses. And most counter-hacking is computer forensics to figure out what they got, days or weeks after they're gone: following log trails, or decoding some payload. If attackers know you're onto them, they can block you -- and defenders can usually only figure out someone was there long after they're gone.
Interesting.png

These articles were first written as BBS posts in the 1980s, or in 1990s computer forums. Microcomputers (which you probably know as desktops) displaced "Big Iron" and dumb terminals, but these battles and questions had been raging for decades back then. Did people want all their data with them, or just to be able to access it from anywhere (and any device)? The answer was, and still is, "yes" -- both, please.

InformationAgeChanges.jpg

Information Age Changes: Information is power, and it's in the hands of the unwashed masses!

In the past, the media, the press, or whatever you choose to call them, had a lot of power over people. They controlled the information, and that alters people's perspectives and their entire lives. People fail to realize how significant this power is. Now it is being wrested from their control and put in the hands of the common man. Will anarchy result?

InfoAgeCopyright.jpg
The information age is disruptive, and one of the most disrupted areas is copyrights (intellectual property). Economic wars are being fought over who owns what, and for how long, and it's breaking whole markets. The music industry sort of collapsed and is scrambling because a copy is as good as the original. The same with books, articles, comics, and anything creative. An information economy is a pirate economy... the sellers of content (whether that's audio, video, or the written word) no longer have a monopoly on distribution, because it is so easy to get it from other people. And there's a conflict between buying something new and giving the author their piece, or buying something used and getting the best deal -- but cutting the author out.
InfoAge2.png

Alan Kay thought up the idea of the DynaBook in 1968 (which later became laptops and tablets) by listening to those around him, who were predicting the same things. History and progress are happening in slow motion. It only seems fast because we're moving slower.

LogsJournals.png

Journaling is the art of creating logs of things that you're doing, such that you can reverse or recreate things, using the logs. I joke it's, "Dear Diary, now I'm going to make a change..."

Basically it is like being followed around by the FBI or a secretary, having them write down every little thing you do, every day. A minute-by-minute diary of events. While we might think of this as an annoying invasion of privacy, it could occasionally come in handy: during a lawsuit where you want to know exactly what you were doing when something bad happened, or if you lost something and needed to retrace your steps.
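The idea can be sketched in a few lines of Python (an illustrative toy, not any particular product's journal format): every change is appended to a log before it's applied, so the log alone can recreate the state.

```python
# Minimal write-ahead journal sketch. Keys/values are invented examples.

def apply_change(state, entry):
    key, value = entry
    state[key] = value

def replay(log):
    """Rebuild the state from scratch by replaying every journal entry."""
    state = {}
    for entry in log:
        apply_change(state, entry)
    return state

journal = []
live = {}

def set_value(key, value):
    journal.append((key, value))       # write the log entry first...
    apply_change(live, (key, value))   # ...then apply the change

set_value("color", "red")
set_value("size", 10)
set_value("color", "blue")  # later entries overwrite earlier ones

# "Dear Diary" is enough to reconstruct everything:
assert replay(journal) == live
```

Real filesystems and databases do the same thing with on-disk logs, which is why they can recover after a crash: they just replay the journal.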

Luddites.jpg
Ahh, Legacy: being haunted by your own past. Sometimes great, always painful, a product's legacy can be both its greatest strength (name recognition, customers, trained users, and so on) and its greatest weakness (cruft, resistance to change, a millstone to drag around, etc.). While it is great to have a customer base paying for your R&D, it can also suck when you're trying to adapt to a changing market or changing technology: it can suffocate companies and products under its own weight.
Internet.jpg

Government/ARPA research gave us the Internet the same way they invented the car or airplane. By 1976 we had hundreds of computers networked; by 1993 the Internet carried only 1% of the information traffic (and we had plenty of traffic). But by 2007 that had flipped and most traffic was TCP/IP based, because it was free, standard and good enough. However, without TCP/IP, one of the other protocols would have become a standard, and we'd still have had everything we have today (in some areas, more). The government gave us nothing that we didn't already have (or wouldn't have). Politicians (as usual) took credit for other people's work.

Spam can.png
My Aunt asked a question the other day about all those unsolicited emails she gets. Emails telling her she can "Make money while she sleeps...", asking her if she would like to earn an extra "$10,000 a month, part time...", or telling her where she can "buy herbal viagra on-line...". She asked, "What are these emails, and how do I stop them?"

I told her they were "spam".

She asked, "What's spam, and where does the term come from?" In my usual geek style -- ask a simple question and get a four-page answer that may eventually stumble into the point.
FireHose.jpeg

Speed is relative. What kind of speed are you talking about? There are a few aspects to speed on a computer. People start looking at benchmarks, or hearing numbers, and they don't always understand what they mean -- or what changes will mean to them. For example, if a machine is twice as fast as another, why doesn't it take half as long to do something? Well, mainly because computers are complex systems, and twice as fast at one thing is not twice as fast overall. Let's start with the basics: the difference between throughput and latency.

Wetware.png

The other day, I got the pleasure of speaking with a bunch of teens at a career center, on a "career day", about what it is that I do, and why I do it (at the time, I was Director of New Media for a media conglomerate.... fancy speak for I helped Newspapers get online and make money doing so). Career day was a lot of fun. One of the things I was asked about is what is holding technology back the most. The answer was easy, "Wetware (People)".

Wetware is slang for the human brain/computation. You have hardware, software, and human cognition (Wetware). Well, wetware is needed for the other two, for now... but it is also holding us back. People are the biggest barrier to technology, innovation and adoption. Computers are incredibly simple devices - as one kid asked, "aren't they all just 0's and 1's?" And they are. What makes them more complex is how many 0's and 1's there are, and how we put them together. Computers are doubling in capabilities every few years; the reason progress is much slower than that is that people's ability (or desire) to grow isn't up to the challenge.


Programming


WhatsAntiAliasing.jpeg
Anti-aliasing is a technique in computers for sacrificing clarity/contrast to give the illusion of better screen resolution.

What is Anti-Aliasing? It is using color to increase the perceived resolution on a display. It can't really alter resolution, but it can appear to, to the eye.
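Here's the trick in miniature, as a Python sketch (grayscale only, with invented values): a pixel that's only half-covered by a black shape on a white background gets painted gray, which the eye reads as a smoother edge.

```python
# Anti-aliasing sketch: an edge pixel only partially covered by a shape
# gets a blend of foreground and background color.

def blend(fg, bg, coverage):
    """Mix two grayscale values (0-255) by the fraction of pixel covered."""
    return round(fg * coverage + bg * (1 - coverage))

BLACK, WHITE = 0, 255
assert blend(BLACK, WHITE, 1.0) == 0     # fully inside the shape
assert blend(BLACK, WHITE, 0.0) == 255   # fully outside
assert blend(BLACK, WHITE, 0.5) == 128   # half-covered edge pixel: gray
```

The gray pixel trades away some contrast (the edge is no longer pure black-on-white) in exchange for an edge that looks less jagged -- exactly the clarity-for-apparent-resolution tradeoff described above.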

BASIC.png

Because I was a programmer, many people ask me, "How do I get started programming?" There are many choices, and it really depends on what you are trying to do. There is application programming, scripting, Web programming, and so on. The bad news is that each of those choices will alter which language or tools you should choose -- and most people don't know this in advance. The good news is that once you get the concepts, many of them follow from language to language and tool to tool. So the most important thing is to just show no fear, dive in, and start learning; some knowledge will be throw-away, but most you'll carry with you for years to come.

Gulliverstravels.jpg

What is Endian? How do you like your eggs? Big or little end up? If there are two equally valid ways to do something, then odds are that two different companies will choose to do those things differently. This is Murphy's law in action -- and it applied to different chip designers and how they ordered data in memory.
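Python's standard struct module can show both orderings of the same number side by side (a minimal sketch):

```python
import struct

# Endianness sketch: the same 32-bit number laid out in memory two ways.
value = 0x01020304

big = struct.pack(">I", value)     # big-endian: most significant byte first
little = struct.pack("<I", value)  # little-endian: least significant first

assert big == b"\x01\x02\x03\x04"
assert little == b"\x04\x03\x02\x01"

# It's the same number either way -- only the byte order in memory differs,
# which is exactly what the chip designers disagreed about.
assert struct.unpack(">I", big)[0] == struct.unpack("<I", little)[0] == value
```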

CountingComputerese.png

Counting in Computerese: The Magic of Binary, Octal and Hexadecimal. Computers deal in the mystical numbering systems, like Hexadecimal, Octal and Binary. People get concerned over it sounding complex, but they are really quite simple. If you can read this article, you should have a really good understanding of what they are, and how they work.

Sandwich.png
The other day, a friend and I were discussing the command-line versus a GUI. His point, which I've heard a thousand times before and for 20+ years, is that unless you understand what's going on in the command line, you don't understand what's going on in the computer. The idea being that the GUI abstracts you from what's really happening, and that some people can make command lines really fly; so they must be better overall. There are really a lot of different arguments and biases in there, so I have to break them down.
Database.png

What is a database? What are the kinds of databases? Why do you care? A database is just a place that is used to store and organize data (information). Your address book, a spreadsheet, basically anything that has a lot of similar (like) elements, is the basics of a database. In computers, they're used all over the place. And there are so many different ways of organizing that data that we invented a whole lingua franca of database terminology. This article tries to demystify some of the basic terms.
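For a concrete (toy) example, here's the address-book idea using SQLite, the small database engine built into Python -- the table, columns, and names are invented for illustration:

```python
import sqlite3

# Tiny address-book database sketch using Python's built-in SQLite.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE contacts (name TEXT, city TEXT)")
db.executemany(
    "INSERT INTO contacts VALUES (?, ?)",
    [("Alice", "Austin"), ("Bob", "Boston"), ("Carol", "Austin")],
)

# A query picks out just the similar records you care about:
rows = db.execute(
    "SELECT name FROM contacts WHERE city = ? ORDER BY name", ("Austin",)
).fetchall()
assert rows == [("Alice",), ("Carol",)]
```

The point isn't SQLite specifically: any pile of like elements plus a way to store, organize, and query them is the essence of a database.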

DigitizedSound.gif

Digitized Sound: understanding samples, rates and digital audio is really pretty simple. Sound is nothing but pressure waves traveling through the air and hitting your ear -- which your brain decodes as sound (noise or music). Computers have two basic ways of recreating sound: one is Synthesized Sound (make a waveform and tone that matches the original), and the other is to digitize the sound (sample the pressure wave very quickly, and then recreate it later). This is about how sampling is done.
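A tiny Python sketch of sampling: here we measure a 1 Hz wave a deliberately slow 8 times per second so the numbers are readable; real digital audio does exactly the same thing, just much faster (CD audio takes 44,100 samples per second).

```python
import math

# Sampling sketch: snapshot a pressure wave at regular intervals.
sample_rate = 8   # samples per second (toy value; CDs use 44,100)
freq = 1          # wave frequency in Hz

samples = [
    round(math.sin(2 * math.pi * freq * (i / sample_rate)), 3)
    for i in range(sample_rate)
]

# Eight snapshots of one full cycle of the wave:
assert len(samples) == sample_rate
assert samples[0] == 0.0    # wave starts at zero pressure
assert samples[2] == 1.0    # peaks a quarter of the way through
assert samples[6] == -1.0   # troughs at three quarters
```

Playback is just the reverse: push those stored numbers back out through a speaker at the same rate, and the pressure wave (and the sound) is recreated.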

Enterprise.jpg

Enterprise, Opensource or Commercial tools, which is better and why? Of course the answer is, "it depends": different tools are better for different things. Now I know that doesn't sound revolutionary, but that does seem to perplex some people. People don't understand the different tools or market segments they fit into, or what they are good for.

ForwardCompatibility.png
In the tech world, I hear all the time about "Backwards" compatibility. That to me is like saying, "I wish my hot new CD player would play my 8-track tapes as well." Backwards compatibility is when you create a new function or feature for your computer, but must also have a mode that works just like it did in the past. But this article is about learning to look forward.
FreeFeature.png

A free feature in software is like a free lunch: there's no such thing as a free lunch. The value of something is directly related to how much it does what you need, and how much it doesn't try to do stuff you don't need. Most "free" features are things that programmers had to add for something else, or are mostly implemented because of something else, so they figure, "hey, it's free" and just release it -- which makes it a distraction, a support cost, a potential bug magnet, and something at least a few customers will learn to use -- even if there's a better way to do it, thus giving you a legacy nightmare.

The idea of a "free feature" is proof that engineers shouldn't pretend to be product managers (and vice versa).
HackCrackPhreak.png
What are hackers, crackers and phreaks? This is the basics of how the terms evolved. They don't really mean what they meant when they were first used. So people date themselves with how they use the terms.
Hiring-programmers.jpg
Human Resources people, Managers, and general users have no idea how simple or complex computer programming is. They think that they can just throw programmers around from one task to another, and some HR people select computer programmers based on language (syntax) and not what really matters (skills and abilities). This would be like hiring an employee based on what school they attended and not what subjects they studied! This article will give non-programmers a better idea of what programming is about, and what they should be looking for when hiring programmers.
Hypercard.png
The History of Visual Basic is a bit of a history of early computers and Microsoft: how they borrowed other people's ideas (and even implementations) and then took credit.
Compression.jpeg
How does software Compression work? How do you make something smaller? In some ways you don't -- you just encode the data in a more efficient way. This article explains some basic concepts about data storage and compression: which are far simpler than people realize (it's just the implementation that can get into some hairy math). Basically, compression is achieved by looking for commonality (in shape or pattern), and then saving that smaller description rather than saving the whole thing. If you don't get it, read on, and it may become more clear.
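Run-length encoding is about the simplest example of "save the description of the pattern instead of the pattern itself". A toy Python sketch:

```python
# Run-length encoding sketch: describe repeats instead of storing them.

def rle_encode(data):
    """Turn 'AAAABBB' into [('A', 4), ('B', 3)]."""
    runs = []
    for ch in data:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)
        else:
            runs.append((ch, 1))
    return runs

def rle_decode(runs):
    return "".join(ch * count for ch, count in runs)

message = "AAAAAABBBCCCCCCCCCA"
encoded = rle_encode(message)
assert encoded == [("A", 6), ("B", 3), ("C", 9), ("A", 1)]
assert rle_decode(encoded) == message   # lossless: round-trips exactly
```

Real compressors (ZIP, PNG, etc.) find much subtler patterns than simple runs, but the principle is the same: the more repetitive the data, the smaller the description.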
Frequency.png
MHz or GHz, what does it mean? Many people ask questions about Megahertz (MHz) or Gigahertz (GHz). Worse, some people don't ask, and assume they know what those terms mean. It isn't hard to understand (it's just clock speed) - but it isn't as easy as some assume either. Many people get it wrong, and assume that MHz (or GHz) is how much work a computer can get done: but it's more like RPMs in a car than MPH; it can mean the engine is working harder, but not necessarily that you're going faster.
Programming.jpg
What's the difference between a programmer and a software engineer? Trick question. Since "Software Engineer" (or "Computer Scientist") has more prestige, virtually all programmers (software developers) call themselves by fancier titles. But in truth, there used to be a difference... since the age of agile programming (rapid application development), there are scant few software engineers. Unless you're doing genetic algorithms (or Artificial Intelligence/Machine Learning) and running experiments on your code to see what it does, you're not really a Computer Scientist. But I'll cover what it used to mean.
RISCvCISC.png
During the 80s and 90s there was a Computer Chip design war about RISC or CISC. What does that mean, and which is better? For a while, Intel (and AMD) were able to spend more on design and delay the inevitable, but once mobility (with performance per watt) became important, ARM and RISC designs have taken over for x86's older CISC design.
ImagesRaster.jpeg
Sometimes you'll hear people say, "rasterized image" (often as opposed to vector images), but what exactly does that mean? Here's the very basics of pictures or rasterized images (and a little bit about compression).
There's a joke amongst programmers, usually towards their management, "if you don't leave me alone, I'll replace you with a very small shell script". The idea is that they're so simplistic and repetitive that a few lines of code could do their job: now go away.
SoftwareConsultants.jpeg

I worked over a decade as a consultant, and used and managed them for a couple decades more. As a consultant I've worked for organizations (agencies) and as an independent. I have nothing against consultants or consulting (they're a very valuable resource), but there is an art to using consultants or consulting organizations wisely, and most companies don't quite have the artistry required. This article will try to explain some of the pitfalls and ways to use consultants better.

SDLC.jpeg
There are a lot of variants of the Software Development Life Cycle. And the methodology has changed over the years... but mostly that's about terminology or details, not the basics, which seem to remain fairly constant -- because physics, human nature, resource management, and the concepts behind writing software itself don't really change. So while companies (and some people) like to claim they're strict adherents to some methodology, if they are, they're idiots. The truth is every methodology adapts to the team/company, or that reflects management's lack of a clue (or flexibility).
SynthesizedSound.gif
Synthesized Sound is just making waves. Computers have two basic ways of recreating sound, one way is Digitized Sound (sample it, and then play it back later), the other is to synthesize it (make a waveform that approximates what you want) -- think of it like taking a picture versus sketching/drawing and painting. Synthesizing is the latter, creating pressure waves by algorithm, rather than recording it.
I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs because it was inferior to other commercial OS's of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became a de facto solution until it got near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.
MP3 logo.png
What is MP3? It's just a compressed file format used for sound. Video requires a lot of bandwidth, so they use fancy algorithms to compress it (make it as small as possible). Since video tracks also have sound with them, they did an exceptionally good job on sound as well. Thus, for doing sound compression we just use the video compression format's sound compression (the Moving Picture Experts Group format for audio), called MPEG Audio Layer III, ala MP3.
WebApp.png

What is a Web Application, and how does it vary from a traditional website? There's a joke in tech: "there is no cloud, there's just somebody else's computer". In other words, you're either using your machine, or someone else's. A traditional website is just you browsing some files (in the HTML format) on someone else's computer. And a Web Application is for things more complex than just reading files: the other computer has to be running an application (remotely) to serve up some of the stuff you're asking for -- like if you need to enter forms and have that do something, do complex lookups (searches of files), or basically interact with information in a more complex way than just reading and writing it. That's what Web Apps are for.

WhySoBuggy.jpeg
Why are programs so buggy? They're not bugs, they're features... sorry, that's an old programmer joke. Everyone has problems with their programs (software), it crashes, stalls, or does unexpected things. People ask about these "bugs", why are there so many, and what can they do about it. Hopefully this helps you understand why.




Crawlers.png

The terms bots, crawlers and spiders are likely to give arachnophobes the heebie-jeebies, but they're really just an important part of the way search engines work. The automated critters just go to the front page of a website and look at every link on that page... then go to each of those and do the same, and so on. Once they've visited every page on a site (crawled across it), they have a pretty good idea of what each article is about (by counting the most frequent words), as well as how it should rank in importance (by looking at who links to that page/website).
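A toy crawler over a fake, in-memory "site" (the page names and text are invented for illustration) shows the whole loop in a dozen lines:

```python
from collections import Counter, deque

# Fake site: each "page" has links and some text.
SITE = {
    "/":      {"links": ["/about", "/news"], "text": "welcome news news"},
    "/about": {"links": ["/"],               "text": "about the site"},
    "/news":  {"links": ["/about"],          "text": "news news today"},
}

def crawl(start):
    """Visit every reachable page exactly once by following links."""
    seen, queue = set(), deque([start])
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        queue.extend(SITE[page]["links"])  # queue up every link found
    return seen

visited = crawl("/")
assert visited == {"/", "/about", "/news"}

# Counting the most frequent word is a crude guess at what a page is about:
top_word = Counter(SITE["/news"]["text"].split()).most_common(1)[0][0]
assert top_word == "news"
```

A real search-engine crawler adds politeness delays, robots.txt handling, and vastly better ranking (like weighing who links to whom), but the visit-and-follow-links skeleton is the same.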

Cookies.jpg

If you've heard Internet slang thrown around, you probably heard someone reference "cookies". What are cookies, how do they work, and what do they mean for your privacy? This article covers the basics.
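The mechanics are simple enough to show with Python's standard `http.cookies` module. The cookie name and value below are hypothetical; the point is that the server sends a `Set-Cookie` header, and the browser echoes the cookie back on later requests -- which is how a site "remembers" you (and how trackers track you).

```python
from http.cookies import SimpleCookie

# Server side: bake a cookie into the response headers.
jar = SimpleCookie()
jar["session_id"] = "abc123"          # hypothetical session token
jar["session_id"]["max-age"] = 3600   # expire after an hour
header = jar.output()                 # the Set-Cookie header line

# Browser side: the cookie comes back on every later request to this site.
returned = SimpleCookie()
returned.load("session_id=abc123")
print(header)
print(returned["session_id"].value)
```

Privacy-wise, nothing in the cookie itself is magic: it's just a small named value the browser stores and replays.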

Email.png
The origins, history and evolution of eMail, forums and live chat: basically, we wanted ways to send things to others, and others wanted ways to not be interrupted, and to read at their convenience.
Subnets.png
What are network casting and subnets? Networks break information up into smaller chunks (packets) and send them over a shared line or radio frequency to other devices, where the parts are reassembled into the whole. Casting and subnets are ways to send to many recipients at once, but not everyone.
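Python's standard `ipaddress` module makes the subnet idea concrete. The network below is a hypothetical home LAN: a /24 subnet groups the 256 addresses sharing the `192.168.1` prefix, and a broadcast to that subnet reaches everyone inside it, but nobody outside.

```python
import ipaddress

# Hypothetical home LAN: all addresses sharing the 192.168.1 prefix.
net = ipaddress.ip_network("192.168.1.0/24")

print(ipaddress.ip_address("192.168.1.42") in net)  # True: same subnet
print(ipaddress.ip_address("192.168.2.42") in net)  # False: different subnet
print(net.broadcast_address)                        # 192.168.1.255: "everyone here"
```

Sending to `192.168.1.255` (the broadcast address) reaches every device on that subnet at once -- many, but not everyone.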
Distrust.png
Think of the internet as "the net of 1,000 lies". This is a bastion of free speech. But never forget that free does not always mean "correct"! Sometimes you get what you pay for. So trust, but verify. Heck, that's good enough advice to use it every day -- the same applies to teachers in school, and textbooks, certainly politicians, and so on. In the end, skepticism is Science and Critical thinking -- it doesn't matter what the consensus says, or any one article, it matters what you can prove to be true or false. Trust at your own peril.
WWWIcon.png
Have you ever wondered how the Web works? The majority of the Internet and computers are actually very simple to understand. The jargon and alphabet soup (acronyms) only make it sound more mysterious and complex than it really is. This article covers the basics of what happens when you go to a website.
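Those basics fit in a short sketch. When you go to a website, the browser (1) looks up the name, (2) opens a connection, and (3) sends a plain-text HTTP request much like the one built below. No network is used here (the host and path are just example strings), but this is genuinely the format of an HTTP/1.1 request.

```python
# Sketch of what a browser sends when you visit a site.
# Step 1 (DNS lookup) and step 2 (TCP connect, port 80/443) are assumed done.
host, path = "www.iGeek.com", "/"

request = (
    f"GET {path} HTTP/1.1\r\n"   # method, path, protocol version
    f"Host: {host}\r\n"          # which site on this server we want
    "Connection: close\r\n"      # hang up after one response
    "\r\n"                       # blank line ends the headers
)
print(request)
# The server replies with headers plus the HTML, and the browser renders it.
```

That's really most of the mystery: names become addresses, addresses become connections, and connections carry readable text like this.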
WebSearch.png
The basics of searching the web, or how to use Google better. Unfortunately, web searching is still not very good -- partly due to the complexities of language, mostly due to poor implementations. But sadly, since Google and others have done a really poor job of adapting to you, you're going to have to learn how to adapt to them. Thus one of the most important things in a user's "Web Experience" is learning how to search the Internet. Most users haven't spent more than 2 minutes learning how to search, and have never even clicked the "advanced" search link on their favorite search engine. No wonder so many users are frustrated: they can't find anything, or find 10,000 things they don't care about.
DNSIcon.png
How the Domain Name System (DNS) works. Networks only understand addresses, but humans want to refer to another machine (or website) by name (like www.iGeek.com). DNS is a service that translates the name you type (iGeek.com) into a sort of phone number (an IP address) that the network can understand. That is all.
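You can watch this translation happen with one call to the system resolver. The sketch uses `localhost` so it works without network access (it resolves locally to a loopback address); a real name like `www.iGeek.com` goes through the same call, just answered by actual DNS servers.

```python
import socket

# Ask the system resolver to turn a name into an IPv4 address.
# "localhost" is used so this runs without network access; a real
# hostname resolves through the same call via DNS.
ip = socket.gethostbyname("localhost")
print(ip)   # a loopback address, typically 127.0.0.1
```

That one string of digits is what the network actually routes on; the name was only ever for humans.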

As a kid, I explored the dark side of hacking, cracking and phreaking... and as an adult, I challenged myself to get a CISSP: a 5-hour test/certification on computer security. While cramming with many professionals, it was nice to find that the top 10-20% in any one domain knew more than me in that area, but I had the widest breadth of knowledge in the room.


Piracy.png

Cracking is the black art of removing copy protection from other people's programs. There are many "pirates" (people who use software without buying it) -- but far fewer crackers. Cracking requires enormous dedication and patience. It was far easier in yesteryear (systems were simpler) -- but nowadays there are better tools, so in some ways that makes it easier.

Cracking is often a battle of wits and patience, where the cracker removes all of the copy-protection code or figures out ways around it. At least it is a game for the cracker -- the companies that have their software cracked find it far from "fun" or "amusing".

Binaryeasteregg.jpg

What are easter eggs, and where do they come from? And I'm not talking about the physical ones in springtime, I'm talking about hidden features or credits in software.

Firewall1.png

What is a FireWall? In a structure or a car, the firewall is something that protects one area from another - usually in case of a fire. The firewall stops or at least slows the fire from spreading by being a physical barrier. In computers and networking it basically does the same thing, but the "fire" that it is trying to slow/stop is an intruder or security leak.

The network administrator turns on or configures this network barrier (firewall) between one network (or area of the network) and another. The firewall blocks everything, except for what it is configured to let through. This can seriously hamper intruders and increase security. To understand this better, let's get a little geeky.
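The "block everything except what's explicitly allowed" idea (default deny) can be sketched as a toy packet filter. The ruleset below is hypothetical -- made-up internal addresses and ports for illustration -- and real firewalls match on much more (protocol, direction, connection state), but the decision logic is the same shape.

```python
# Toy default-deny packet filter. Addresses/ports are hypothetical.
# Each rule: (allowed source-address prefix, allowed destination port).
ALLOW_RULES = [
    ("10.0.0.", 80),    # internal hosts may reach the web server (HTTP)
    ("10.0.0.", 443),   # ...and HTTPS
]

def allowed(src_ip, dst_port):
    """Permit only traffic matching an explicit rule; deny everything else."""
    return any(src_ip.startswith(prefix) and dst_port == port
               for prefix, port in ALLOW_RULES)

print(allowed("10.0.0.5", 443))     # True: matches a rule
print(allowed("203.0.113.9", 443))  # False: outside source, default deny
print(allowed("10.0.0.5", 22))      # False: port never opened
```

Note what makes this secure-by-default: the administrator never has to enumerate the bad traffic, only the good.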

HackCrackPhreak.png
What are hackers, crackers and phreaks? This is the basics of how the terms evolved. They don't really mean what they meant when they were first used. So people date themselves with how they use the terms.
It is not that hard to hack into a network/machine -- far easier than people realize, yet far harder than the movies make it seem. It can get very complex -- but there are usually "easier" ways in than the hard brute-force methods. There are many levels to a break-in -- from the ballsy "impersonating an employee" and just walking around a company (badges are easy to create, and don't slow most people down) -- to stealing network traffic and analyzing it. There are thousands of ways to get in, and the more complex the countermeasures, the more potential holes there are (but the harder they may be to find).
CSICyber.jpeg
Intrusion and prevention is nothing like the movies. Think months to deliver an attack and get through layers of defenses. And most counter-hacking is computer forensics -- figuring out what the intruders got, days or weeks after they're gone: following log trails, or decoding some payload. If the defenders know intruders are there, they can block them -- but usually they can only figure out someone was there long after they're gone.
HowSecure.jpg
How secure are your devices from intruders? The answer is "it depends" -- on a lot of things, like what machine you have, what you do, and so on. The short answer, in order of safety (from least secure to most): Windows, Unix, Mac, Android/Chrome, iOS. And the OSes are more secure than the apps you run -- so iOS running only apps from their Store is going to be a lot safer than a machine running software downloaded at random from the Internet.
Password1.jpeg
The reason there are so many annoying password requirements is that passwords are so instrumental to security (and human nature is so predictable). That being said, most of those annoying password requirements do it wrong, and just annoy customers.
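One thing sites should do right (and often don't) is never store your password at all -- only a salted, slow hash of it. The sketch below uses Python's standard `hashlib.pbkdf2_hmac`; the example password is obviously hypothetical. If the database leaks, the attacker gets hashes, and the random per-user salt defeats precomputed lookup tables.

```python
import hashlib, os

def hash_password(password, salt=None):
    """Store (salt, digest), never the password itself."""
    salt = salt or os.urandom(16)            # random per-user salt
    digest = hashlib.pbkdf2_hmac(            # deliberately slow key derivation
        "sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, salt, digest):
    """Re-derive with the stored salt and compare."""
    return hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, 100_000) == digest

salt, digest = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, digest))  # True
print(verify("password123", salt, digest))                   # False
```

The 100,000 iterations are the point: slowing each guess from microseconds to milliseconds makes bulk guessing of predictable human passwords vastly more expensive.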
Are you out of your phreaking mind? Phreaking is when people (hackers) figure out how to break the phone company's security to get control of the phone system -- most often to make free phone calls, or get operator powers. Network hacking is breaking in (usually looking around without doing harm). Cracking is defeating copy protection in someone else's code. But there is far less moral ambiguity about phreaking -- almost all phreaks got free phone calls, and that is something the law (and the phone company) frowned upon, seriously. The phone company dedicated resources to countering phreaking and hunting down phreaks. It became the blackest of the black computer "arts", and through improvements in security and the commoditization of long-distance call costs, it largely doesn't exist anymore.
Privacy2.jpeg

Privacy and the web: how safe is your info?

The other night I was watching a Television show that discussed computers and privacy, and like a geek, I was getting annoyed and talking back to the show; it seems that Hollywood needs to get better technology consultants instead of terrorizing the public with misinformation and calling it entertainment. If I didn't know better, I'd be paranoid too. But I'm not, so it's more mock-worthy than helpful.

OnlineShopping.jpeg

Shopping and Physical Security: one of the areas where people are most concerned about security is online shopping. I think they are often focusing on the lesser threats. People can hack your online shopping -- but it's a lot of work. It's far easier to steal your information through other means.

VirusWormsTrojans.jpg

Viruses, worms and trojans: various hacker terms/attacks explained (simply).