

How can computers do what they do?

Asked by ScottyMcGeester (1897 points) April 3rd, 2013

This is a weirdly worded question but let me explain.

One thing I never understood is how, for example, right now I’m looking at this screen and I’m on the Internet and I’m using Windows as an operating system, all that good stuff, right?

How do we get that from a bunch of chips inside a box? I don’t understand the whole “translation” step, so to say, of how the physical hardware brings up the software and the programming. I mean, how did people figure out how to make this anyway?


6 Answers

gorillapaws

It’s things building on the work of others, in layers. The beauty of a system like this is that a programmer or engineer only really needs to understand the complexities of their particular layer. So the browser you’re viewing was coded by a bunch of application-level programmers at Mozilla (for example). They use what are called APIs, provided by Microsoft’s operating system, that can do things like open a new window, print, save a file to disk, etc. The Firefox programmer doesn’t need to know how the Microsoft team’s code works to open the window or save the file to disk; they only need to understand how to use the API properly. This separation of concerns lets them focus on coding the best browser they can, and not worry about what brand of hard disk the user has when they click “save image to disk.” Likewise, the folks at Microsoft have different teams and different layers that go all the way down to “the bare metal,” as it’s known. Each layer’s functionality is encapsulated.
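Here’s a rough sketch of that layering in Python; the function names are invented for illustration and don’t correspond to any real API:

def write_file(path, content):
    # “OS” layer: hides the disk details. Python’s open() itself
    # wraps the operating system’s file-handling system calls.
    with open(path, "w", encoding="utf-8") as f:
        f.write(content)

def save_page(path, html):
    # Application layer (the “browser”): it just calls the layer
    # below and never touches the hardware.
    write_file(path, html)

save_page("page.html", "<html>hello</html>")

The browser author only has to get save_page right; everything below it is someone else’s encapsulated layer.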

Nobody could trace the path of an electron from the power cord, through the processor, through the full execution of the code stack, and up into the application-level logic that produces the text you’re currently reading on your monitor. It’s simply too complex for anyone to wrap their head around.

jerv

This is why you have software people and hardware people, and each of those is split further. Each layer of abstraction is complex in and of itself; even a super-genius would have problems learning every layer in detail.

Let’s put it like this: a person hanging sheetrock walls knows a little bit about framing a house, very little about pouring the concrete foundation that the frame is erected on, and probably next to nothing about the chemistry of making concrete from raw materials. Computers are like that; many layers, each building on the one before it, with only the very smartest knowing even half the levels.

XOIIO

0 and 1

That’s how every computer, microcontroller, or other computerized system does anything, when you get down to it.

gorillapaws

To piggyback off of @XOIIO‘s answer, you can think of each 0 and 1 as a switch that’s either on or off. So the text in this answer is really hundreds of “switches” arranged in a way that we all agree represents letters of the English alphabet, according to the UTF-8 standard, and an MP3 is just millions of switches configured to a standardized format called MPEG-1 Audio Layer III.
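You can actually look at those switches in Python:

text = "Hi"
raw = text.encode("utf-8")                      # UTF-8 turns each character into bytes
print(" ".join(f"{byte:08b}" for byte in raw))  # 01001000 01101001

Those sixteen ones and zeros are “Hi”; everything else on the page is the same trick at a larger scale.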

dabbler

The design principle of abstraction lets people build on the work of others.

You can walk somewhere. Or you can abstract the function of travel and use a horse, a boat, a car, a truck, or a bus to get from one place to another.

So we have cars, and everybody knows how to use them: there’s your steering wheel, your accelerator, there are the brakes. Abstracting that functionality frees us from having to know how it works inside. People can say, hey, let’s make a truck, or a bus, or a Formula One racer.

You can do maths with numbers written on paper, and you can move information by taking a folder of papers out of a drawer in a cabinet and putting it in a different place: another drawer or cabinet, or on your desktop. Displaying information on paper takes advantage of ambient light reflected off a surface, just like the reflective LCD grids on some of the most useful e-readers.
You can do your maths in a modern processor instead, to decode a movie maybe, or to remove the red-eye from the picture of your fiancĂ©e’s party face. The data are now representations of information in number form, and the transfers go from place to place over circuits and cables. Display on most computers these days is some form of LCD grid, backlit with lamps.

Computer function is abstracted into an operating system, like MS Windows or macOS or Android. That basically gives us the abstracted abilities to move data, transform data, and display data.
On top of that abstracted facility, people can build apps: a spreadsheet program, a streaming video player, a photo editor. Or Angry Birds.
You can abstract the functionality of the spreadsheet program into a macro language, and people will build applications on top of that to run in the spreadsheet engine.
You can abstract the basics of human life, implement them in ‘Second Life’, and people will build mansions with their mice.

Specialized hardware, like an HDMI port, abstracts the capability to send a lot of data fast over a limited distance. The operating system can assume it will do what it’s supposed to, and abstract it into a data channel that a movie decoder can use to pour its output down the throat of the HDMI port and out to your TV/monitor.
On the other end, specialized hardware abstracts the ability to send large amounts of data long distances, through relay nodes, to a specific other device. That network functionality is usually abstracted behind standards like Ethernet (802.3) and Wi-Fi (802.11), so the OS doesn’t have to care whether the data goes over copper wires, Wi-Fi radio, a power line, or a fiber-optic cable.
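From a program’s point of view, that abstraction looks like this in Python: the OS hands you a byte channel, and everything below it, cable, radio, relay nodes, is invisible (example.com here is just a stand-in host):

import socket

# Ask the OS for a TCP connection. Whether the packets travel over
# Ethernet, Wi-Fi, or anything else is decided by the layers below us.
with socket.create_connection(("example.com", 80)) as conn:
    conn.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    reply = conn.recv(1024)      # first chunk of the server's reply
print(reply[:40])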

LostInParadise

It helps to see how the pieces fit together. As a computer programmer, I have only a rough understanding of how the hardware fits together, but it helps reduce the one big mystery to several smaller ones. At the risk of making a complete fool of myself, here is what I know. Any hardware engineers here who feel compelled to make corrections, jump right in.

Most computers follow what is called a von Neumann architecture, named for the mathematician and physicist John von Neumann, who first described it.

Both a computer’s instructions and the data it stores are placed in memory. There are two important things to know about memory: it consists of groups of ones and zeros, and it is addressable, so that, given the address of a memory location, the computer can retrieve its contents. The only thing memory can be used for is storing and retrieving its contents.

It is easiest to imagine a computer program as sitting in sequential locations in memory. There is a piece of hardware, referred to as the program counter, that keeps track of the location of the current program instruction. In the usual case, it is bumped by 1 each time to retrieve the next instruction. Some instructions (jumps and branches) can override this by making the counter point forward or backward.
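Here’s a toy model of that in Python, a made-up three-instruction machine; the opcodes are invented, but the addressable memory and the program-counter loop are the real idea:

# Toy von Neumann machine: instructions and data share one memory,
# and each slot has an address (its index in the list).
memory = [
    ("LOAD", 5),    # address 0: copy contents of address 5 into the accumulator
    ("ADD", 6),     # address 1: add contents of address 6 to the accumulator
    ("STORE", 7),   # address 2: store the accumulator at address 7
    ("HALT", 0),    # address 3: stop
    None,           # address 4: unused
    2,              # address 5: data
    3,              # address 6: data
    0,              # address 7: the result will land here
]

pc = 0                              # program counter
acc = 0                             # a register
while True:
    opcode, operand = memory[pc]    # fetch the instruction the counter points at
    pc += 1                         # bump the counter by 1
    if opcode == "LOAD":
        acc = memory[operand]
    elif opcode == "ADD":
        acc += memory[operand]
    elif opcode == "STORE":
        memory[operand] = acc
    elif opcode == "HALT":
        break

print(memory[7])                    # prints 5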

Each instruction is fetched into a piece of hardware known as a register and translated into a sequence of logic operations. How this is done is a great mystery to me. Typically, data is retrieved from one or two memory locations into registers, transformed by the instruction, and then put back into a memory location.

It is important to see how everything can be turned into logic operations. Suppose, for example, that you want to add two binary digits. We will take 1 as standing for true and 0 as false. We want the sum bit to be 1 if the two bits are different and 0 if they are the same; there is a logic operation, XOR (exclusive or), that performs just this operation. We want the carry to the next place to be 1 only if both bits are 1, which is an AND operation on them.
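That single-bit adder, known as a half adder, is easy to check in Python, where ^ is XOR and & is AND:

def half_adder(a, b):
    total = a ^ b    # XOR: 1 when the bits differ
    carry = a & b    # AND: 1 only when both bits are 1
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
# 0 0 -> (0, 0)
# 0 1 -> (1, 0)
# 1 0 -> (1, 0)
# 1 1 -> (0, 1)

Chain enough of those together, passing the carries along, and you can add numbers of any width, which is exactly what the processor’s arithmetic unit does.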

That, in very crude terms, is what goes on. I hope that makes it a little less mysterious.
