What is Anti-Aliasing? It is using color (intermediate shades between foreground and background) to increase the perceived resolution of a display. It can't actually alter the resolution, but to the eye it can appear to.
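As a minimal sketch of the idea (my own example, not from the article), here is one common anti-aliasing technique, supersampling: test several points inside each pixel against a diagonal edge, and shade the pixel by how much of it the edge covers. Pixels straddling the edge get in-between grays instead of a jagged step.

```python
# Hypothetical example: anti-alias the edge of the line y = x by
# supersampling each pixel and averaging coverage into a gray level.

def coverage(px, py, factor=4):
    """Fraction of a pixel's subsamples falling below the line y = x."""
    hits = 0
    for sy in range(factor):
        for sx in range(factor):
            # subsample center, in pixel coordinates
            x = px + (sx + 0.5) / factor
            y = py + (sy + 0.5) / factor
            if y < x:
                hits += 1
    return hits / (factor * factor)

# One row of pixels: fully off, fully on, and a partial shade at the edge --
# that in-between shade is the "perceived resolution" trick.
row = [coverage(px, 2) for px in range(5)]
print(row)  # [0.0, 0.0, 0.375, 1.0, 1.0]
```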
Because I was a programmer, many people ask me, "How do I get started programming?" There are many choices, and it really depends on what you are trying to do. There is application programming, scripting, Web programming, and so on. The bad news is that each of those choices will alter which language or tools you should choose -- and most people don't know this in advance. The good news is that once you get the concepts, many of them follow from language to language and tool to tool. So the most important thing is to show no fear, dive in, and start learning; some knowledge will be throw-away, but most you'll carry with you for years to come.
What is Endian? How do you like your eggs: big or little end up? If there are two equally valid ways to do something, then odds are that two different companies will choose to do those things differently. This is Murphy's law in action -- and it applies to how different chip designers ordered data in memory.
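A minimal sketch of the difference (my own example, not from the article): the same 32-bit number laid out in memory in both byte orders, using Python's `struct` module.

```python
import struct

value = 0x01020304

big = struct.pack(">I", value)     # big-endian: most significant byte first
little = struct.pack("<I", value)  # little-endian: least significant byte first

print(big.hex())     # 01020304
print(little.hex())  # 04030201

# Same number, same bytes -- just in opposite order.
assert big == bytes(reversed(little))
```

This is exactly the kind of thing that bites you when two machines with different orderings exchange raw binary data.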
Counting in Computerese: The Magic of Binary, Octal and Hexadecimal. Computers deal in mystical-sounding numbering systems, like hexadecimal, octal and binary. People get concerned that they sound complex, but they are really quite simple. If you can read this article, you should come away with a really good understanding of what they are and how they work.
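As a quick taste (my own example, not from the article): here is one ordinary number written in all three of those bases, and converted back again.

```python
n = 2024

# The same quantity, just written in different bases.
print(bin(n))  # 0b11111101000  (base 2)
print(oct(n))  # 0o3750         (base 8)
print(hex(n))  # 0x7e8          (base 16)

# And back again: int() parses a string in whatever base you tell it.
assert int("11111101000", 2) == n
assert int("3750", 8) == n
assert int("7e8", 16) == n
```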
What is a database? What are the kinds of databases? Why do you care? A database is just a place used to store and organize data (information). Your address book, a spreadsheet, basically anything that has a lot of similar (like) elements, is the basis of a database. In computers, they're used all over the place. And there are so many different ways of organizing that data that we've invented a whole lingua franca of database terminology. This article tries to demystify some of the basic terms.
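A minimal sketch of the address-book-as-database idea (my own example; the field names are hypothetical): records with the same fields, plus a tiny query over them.

```python
# Each record (row) has the same fields (columns) -- that similarity of
# elements is what makes it database-like.
contacts = [
    {"name": "Ada",   "city": "London"},
    {"name": "Grace", "city": "New York"},
    {"name": "Alan",  "city": "London"},
]

def query(rows, **criteria):
    """Return every record matching all of the given field values."""
    return [r for r in rows if all(r.get(k) == v for k, v in criteria.items())]

print(query(contacts, city="London"))  # Ada and Alan
```

A real database adds indexing, concurrency, and durability on top, but the core idea is the same: similar records, organized so you can find them.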
Digitized Sound: understanding samples, rates and digital audio is really pretty simple. Sound is nothing but pressure waves traveling through the air and hitting your ear -- which your brain decodes as sound (noise or music). Computers have two basic ways of recreating sound: one is synthesized sound (make a waveform and tone that matches the original), and the other is digitized sound (sample the pressure wave very quickly, and then recreate it later). This article is on how sampling is done.
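A minimal sketch of sampling (my own example; the rate and frequency are just illustrative): measure a 440 Hz sine wave's pressure at regular intervals -- the sample rate -- so it can be stored and played back later.

```python
import math

sample_rate = 8000   # samples per second (8 kHz, telephone quality)
freq = 440.0         # concert A, in Hz
duration = 0.01      # ten milliseconds of sound

# Each sample is the wave's amplitude at one instant in time.
samples = [
    math.sin(2 * math.pi * freq * (i / sample_rate))
    for i in range(int(sample_rate * duration))
]

print(len(samples))  # 80 samples for 10 ms at 8 kHz
```

The faster you sample (and the more bits per sample), the more faithfully you capture the original pressure wave.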
Enterprise, Open Source or Commercial tools: which is better, and why? Of course the answer is, "it depends": different tools are better for different things. Now I know that doesn't sound revolutionary, but it does seem to perplex some people. People don't understand the different tools, the market segments they fit into, or what each is good for.
There is a computer term that you hear some geeks and industry insiders use, but that many people new to computers don't know -- and should. That term is FUD. FUD means "Fear, Uncertainty and Doubt", and it was a tool large companies used to scare users away from small companies' software (or hardware). They'd sow uncertainty so customers would buy from the safest (largest) company, even if it didn't currently have the best software, or scare them into buying the biggest program for features they might someday need (but that only added complexity today).
A free feature in software is like a free lunch: there's no such thing. The value of something is directly related to how much it does what you need, and how much it doesn't try to do stuff you don't need. Most "free" features are things that programmers had to add for something else, or that are mostly implemented because of something else, so they figure, "hey, it's free" and just release it -- which makes it a distraction, a support cost, a potential bug magnet, and something at least a few customers will learn to use even when there's a better way to do it, thus giving you a legacy nightmare. The idea of a "free feature" is proof that engineers shouldn't pretend to be product managers (and vice versa).
There's a joke amongst programmers, usually aimed at their management: "if you don't leave me alone, I'll replace you with a very small shell script". The idea is that the job is so simplistic and repetitive that a few lines of code could do it: now go away.
I worked for over a decade as a consultant, and hired and managed consultants for a couple of decades more. As a consultant I've worked for organizations (agencies) and as an independent. I have nothing against consultants or consulting (they're a very valuable resource), but there is an art to using consultants or consulting organizations wisely, and most companies don't quite have the artistry required. This article will try to explain some of the pitfalls, and ways to use consultants better.
Computers have two basic ways of recreating sound. One is digitized sound (sample it, and then play it back later); the other is to synthesize it (make a waveform that approximates what you want) -- think of it like taking a picture versus sketching or painting. Synthesizing is the latter: creating pressure waves by algorithm rather than by recording.
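A minimal sketch of synthesis-by-algorithm (my own example, not from the article): instead of recording anything, build an approximate square wave by summing odd sine harmonics -- a standard Fourier-series construction.

```python
import math

def square_wave(t, freq, harmonics=9):
    """Approximate a square wave at time t by summing odd sine harmonics."""
    total = sum(
        math.sin(2 * math.pi * (2 * k + 1) * freq * t) / (2 * k + 1)
        for k in range(harmonics)
    )
    # 4/pi scales the Fourier series so the wave swings between about -1 and 1
    return total * 4 / math.pi

# At a quarter period the square wave should sit near its +1 plateau.
print(square_wave(0.25, 1.0))
```

Adding more harmonics sharpens the corners; a pure sine is just the one-harmonic case. This is "drawing" the pressure wave, versus sampling's "photographing" it.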
I'm both a big UNIX fan, and one of its detractors. UNIX is the old war-bird of Operating Systems -- which is ironic since it really isn't an Operating System any more -- but more on that later. UNIX was created as a private research project by AT&T's Bell Laboratories (Ken Thompson and Dennis Ritchie) in 1969. Since AT&T wasn't really in the computer business, they had the wise marketing plan of giving away the source code to UNIX for free. UNIX wallowed around and was basically only popular in education and research labs, because it was inferior to other commercial OSes of the time. But since Universities could modify this Operating System freely, many programmers cut their teeth (in school) using it, and researchers came from academia, so they used it too. This legacy has totally defined what UNIX is, and what it is good for -- and bad at. It wasn't good, it was free. But perfect is the enemy of good enough, and UNIX was always "good enough" that people used it, added to it, and it sort of became the de facto solution until it reached near-universal adoption. These same pragmatic compromises are similar to why TCP/IP and HTML became the Internet.
What is a Web Application, and how does it differ from a traditional website? There's a joke in tech, "there is no cloud: it's just somebody else's computer": in other words, you're either using your machine or someone else's. A traditional website is just you browsing some files (in the HTML format) on someone else's computer. A Web Application is for things more complex than just reading files: the other computer has to be running an application (remotely) to serve up some of the stuff you're asking for -- like entering forms and having that do something, doing complex lookups (searches), or basically anything where you interact with the information in a more complex way than just reading and writing it. That's what Web Apps are for.
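A minimal sketch of the distinction (my own example, stdlib only; the handler and route are hypothetical): instead of handing back a file from disk, the remote computer runs code for each request and computes the response.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs

class EchoApp(BaseHTTPRequestHandler):
    """A tiny 'web application': the response is computed, not read from a file."""

    def do_GET(self):
        # Pull a parameter out of the query string, e.g. /greet?name=Ada
        params = parse_qs(urlparse(self.path).query)
        name = params.get("name", ["world"])[0]
        body = f"hello, {name}".encode()

        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To actually serve it (blocks forever):
# HTTPServer(("", 8000), EchoApp).serve_forever()
```

A static site would return the same bytes for every visitor; here the answer depends on what you asked, which is the whole point of a Web App.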