
View Full Version : Asgard/Ancients/Wraith/Goa'uld only base 2 number system for computers



Gargen
July 26th, 2005, 12:43 AM
Well, in several episodes there has been an encryption from another race, and someone (usually Carter) cracks it. But the cracking techniques she would know only apply to computers with base-2 number systems (0, 1). So if they are able to crack alien encryptions, that would mean these races that are far superior technologically are still using a system which is extremely limited. This also applies to the integration of computers at Atlantis. So my question is: why is this? Is it that they don't need more, or is it a much more advanced thing to add an extra digit to your systems?

Spoiler warning thingy highlight
If you don't think they are using it, look at the Wraith virus code: it's all 0s and 1s.

valha'lla
July 26th, 2005, 04:26 AM
That's because a computer has the simple on/off positions, so it can only use two representations, which are 0 and 1. It's like if you had two fingers: you can only have two digits before you need another person to count. The complexity of the program itself is the problem, as all computer systems would use binary language, 1010010 kind of thing. It's like the race with the advanced mine weapons: they realised they needed something to represent having nothing, or 0. So all races would have to do this to create that kind of tech, and all alien computers would still only have to use binary.

LiquidBlue
July 26th, 2005, 07:20 AM
I think the point is that any base-n number can be represented in base 2. I am not sure in which episodes Carter had to break an alien encryption; it seems rather incredible. Was the alien program written in some sort of interpreted language? Trying to figure out how an alien program works by looking at the machine code seems to strain belief.
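The point about bases is easy to demonstrate: any value round-trips between base 2 and any other base n without loss. A minimal sketch (the function names `to_base`/`from_base` are mine, not from the thread):

```python
def to_base(value, base):
    """Convert a non-negative integer to a list of digits in the given base."""
    if value == 0:
        return [0]
    digits = []
    while value:
        digits.append(value % base)
        value //= base
    return digits[::-1]  # most significant digit first

def from_base(digits, base):
    """Convert a list of digits back to an integer."""
    value = 0
    for d in digits:
        value = value * base + d
    return value

# The same number survives a round trip through any base:
n = 1234
for base in (2, 3, 5, 16, 256):
    assert from_base(to_base(n, base), base) == n
```

So whatever base an alien machine used internally, the information content is the same; only the digit alphabet changes.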

Speaking of nitpicks: what did McKay do so that all of a sudden he could see that the virus was written by the Wraith? Just interpreting the code into some Wraith character set does not, in and of itself, prove anything. Couldn't he have used a Japanese character set just as well and blamed it on those cunning people to the East, or couldn't he have used an Asgard character set and blamed it on a traitor within their midst?

Ohper
July 26th, 2005, 09:40 AM
It's probably for the same reason why (just about) everyone speaks English. They don't want to devote half the 40 minute show to translations/decoding.

_Owen_
July 26th, 2005, 09:46 AM
Now, actually, it is more common for a computer to use hexadecimal math as a language: base 16. However, that is not the point; many times translation subroutines are run, like in "Intruder."

Owen Macri

Gargen
July 27th, 2005, 06:53 PM
Well, at least the Wraith do have a base-2 number system; I've seen it. But the Wraith aren't very technologically advanced compared to the other races I mentioned.

Carter had to break the encryption in episode 116, "Nightwalkers."

Avatar28
July 27th, 2005, 07:29 PM
Now, actually, it is more common for a computer to use hexadecimal math as a language: base 16. However, that is not the point; many times translation subroutines are run, like in "Intruder."

Owen Macri

Nope, computers do everything in binary. Hexadecimal is used in many programming languages as shorthand because 16 is equal to 2^4, so each hex digit stands for exactly four bits, but the computer itself still runs on binary: 0, 1, off, on. It's likely that the advanced races use digital circuitry (and I see no reason why they wouldn't for most stuff), which would inherently use binary. Before anyone goes shouting about quantum computers and qubits, I will point out that quantum computers aren't the end-all/be-all of computers, and they're really not that great for most types of calculations.
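The hex-as-shorthand point is easy to check: because 16 = 2^4, every hex digit maps to exactly four bits of the same underlying binary value. A quick illustration (values chosen arbitrarily for the demo):

```python
# Each hexadecimal digit is shorthand for exactly four binary digits,
# because 16 = 2**4; the machine representation underneath is still binary.
value = 0b10100111        # the bit pattern the hardware actually stores
print(hex(value))         # -> 0xa7  (1010 = a, 0111 = 7)
print(bin(0xA7))          # -> 0b10100111: the same bits, different notation
assert 0xA7 == 0b10100111 == 167
```

Hex changes nothing about the hardware; it just lets humans read eight bits as two characters.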

_Owen_
July 27th, 2005, 11:52 PM
In "Intruder" the Wraith code was translated into binary so it could interact with the Daedalus. McKay translated it back to Wraith.

Owen Macri

_Owen_
July 27th, 2005, 11:53 PM
Nope, computers do everything in binary. Hexadecimal is used in many programming languages as shorthand because 16 is equal to 2^4, so each hex digit stands for exactly four bits, but the computer itself still runs on binary: 0, 1, off, on. It's likely that the advanced races use digital circuitry (and I see no reason why they wouldn't for most stuff), which would inherently use binary. Before anyone goes shouting about quantum computers and qubits, I will point out that quantum computers aren't the end-all/be-all of computers, and they're really not that great for most types of calculations.
Yes, ok, thank you very much, because that seems much more likely. I just remember hearing somewhere that computers did; it must have been some programming languages. Sorry about that. Binary does seem more likely to operate a computer.

Owen Macri

~Thor~
July 28th, 2005, 01:09 AM
I doubt advanced races would use binary. I think their computers would operate more like the brain, since that has a lot more processing power than any human-built computer.

valha'lla
July 28th, 2005, 01:27 AM
A computer that worked more like a brain is a possibility, but then again, making a computer like that is getting close to AI, and we all know the possible consequences of that. It would still probably have to use some form of binary as a base, as it would still only have the on/off positions to use.

LiquidBlue
July 28th, 2005, 07:46 AM
I think a visit to Wikipedia's article on Turing Machines would be useful to this thread.

http://en.wikipedia.org/wiki/Turing_machine


the thesis states that our notion of algorithm can be made precise (in the form of computable functions) and computers can run those algorithms. Furthermore, any computer can theoretically run any algorithm; that is, the theoretic computational power of each computer is the same and it is not possible to build a calculation device which is more powerful than a computer. (Note that this formulation has implicit in it the idea that memory/storage is separate from device; any actual computer has finite memory, but the formulation always assumes that memory can be added at will.)

This basically means that my calculator, given enough memory, could run any algorithm the greatest supercomputer in the world could. In fact, my calculator could emulate the supercomputer. Of course, my calculator would take a very, very, very long time.
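The Turing-machine idea above fits in a few lines of code. Here is a toy simulator with an example transition table (my own toy machine, not from the thread) that increments a binary number on the tape; any computation reduces to tables like this:

```python
# A minimal Turing machine simulator. The transition table maps
# (state, symbol) -> (symbol to write, head move L/R, next state).
def run_turing_machine(table, tape, state="start", head=0, max_steps=10_000):
    tape = dict(enumerate(tape))              # sparse tape; unseen cells are blank
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, " ")
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip()

# Example machine: increment a binary number. Walk right to the end,
# then move left adding 1 with carries.
INCREMENT = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", " "): (" ", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, keep carrying
    ("carry", "0"): ("1", "L", "halt"),    # absorb the carry
    ("carry", " "): ("1", "L", "halt"),    # carried past the left edge
}

print(run_turing_machine(INCREMENT, "1011"))  # -> 1100
```

Slow, like the calculator example, but in principle such a table-driven machine can run anything a supercomputer can.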

Quantum computers and the algorithms they use do not fall into this category precisely. (I suppose even a quantum computer could be emulated by a Turing machine; one would just need to model the particles' interactions and evolution with time. :) )

As a side note, in Arthur C. Clarke's "3001" it is revealed that humans eventually understood the mathematical properties of computers so well that they were able to create viruses and attacks that were architecture-agnostic, exploiting instead weaknesses inherent in computational devices in general.

--

My nitpick still stands: simply using a Wraith/Ancient character set to view the virus does not demonstrate that the virus was Wraith in origin, any more than viewing it with a Cyrillic character set would demonstrate that the virus was Russian in origin.

Avatar28
July 28th, 2005, 11:21 AM
I doubt advanced races would use binary. I think there computers would operate more like the brain, since that has a lot more processing power than any human-built computer.

Actually, digital computers are fairly rapidly approaching the computing power of the human brain. We're still not there yet, but I believe I've read that within the next 20 years or so we should be approaching that point.

Neural networks are good but, again, not for everything. For example, as powerful as it is, even a slow digital computer is much faster at general calculation type functions than the human brain. Neural networks are good with things like learning, pattern recognition, and the like. For most general purpose needs, digital computers are still superior. And unless you have more discrete states than off/on, you'll always be faced with the fact that they use binary.

_Owen_
July 28th, 2005, 09:44 PM
computer that worked more like a brain is a possibility, but then again making a computer like that is geting close to AI and we all know the possible consiquences of that but it still probably have to use some form of binary as a base as it still would only have the on off posstions to use.
Oh, ya, we absolutely cannot let artificial sentient life be created. COUGH Battlestar Galactica COUGH Cylons COUGH tons of other science fiction shows COUGH including I, Robot... sorry, I can't cough anymore.

Owen Macri

VirtualCLD
July 29th, 2005, 08:00 AM
Actually, digital computers are fairly rapidly approaching the computing power of the human brain. We're still not there yet, but I believe I've read that within the next 20 years or so we should be approaching that point.

I would have to respectfully disagree. I think we have a very, very long way to go before digital computers even approach the level of complexity of a human brain. While I agree that digital computers are much faster, they are still dumb machines that just do what you tell them. They cannot learn like we do, and AI is more like Artificial Stupidity than Intelligence. While many an AI can emulate intelligence, that's all they can do; they can't really think for themselves.

I remember going to a talk on bipedal walking for robots, going over the different control systems used. To us, walking comes naturally after we learn it, and we can traverse almost any surface, regardless of friction. It's all done subconsciously, without us even knowing that we are making subtle adjustments as we move from a carpeted floor, to a wood floor, to a tile floor, to dirt, and then rock, etc. But making robots do this is extremely difficult. Even ASIMO, Honda's robot, can only walk on specific surfaces and can only climb stairs with a specific step height.

And what's with this idea that all robots must turn on their masters? I mean, not all humans become criminals and killers, so why must all robots? That's always seemed like a cliché plot to me.

EDIT: Sorry, bit of an off-topic rant there. You can always go from one number system to another; that's not difficult. The basic principle of digital computers in real life (as previously stated) is that a 1 represents a switch that is on and a 0 represents a switch that is off. (Actually it could be the other way around, or any number of things; basically, a computer deals with two voltage levels on a single line, a bit: a high voltage level and a low voltage level.) For the purposes of this post, let's assume 5V = 1 and 0V = 0. This is why the base-2 number system is used.

Theoretically, an alien computer (that for our purposes runs on electricity) could use a base-3 system because it uses three voltage levels, say 5V = 2, 2.5V = 1, and 0V = 0. We could still interface with this computer by using an analog-to-digital converter, so we could detect those three levels and convert them into a base-2 system (5V = 10, 2.5V = 01, 0V = 00). It's plausible, but I would agree that the time it would take to figure out what base number system an alien computer is using, what the various machine codes are, and how it stores and retrieves data would be very long.
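The three-level interface described above can be sketched as a lookup table: quantize each measured voltage to the nearest nominal level, then re-encode the ternary digit as a two-bit pair. The voltages, thresholds, and sample readings here are just the post's assumed values:

```python
# Sketch of the described interface: map each of three nominal voltage
# levels to a two-bit code, as in the post (5V = 10, 2.5V = 01, 0V = 00).
LEVELS = {5.0: "10", 2.5: "01", 0.0: "00"}

def trit_to_bits(voltage):
    """Quantize a noisy reading to the nearest nominal level, return its bit pair."""
    nearest = min(LEVELS, key=lambda level: abs(level - voltage))
    return LEVELS[nearest]

signal = [4.8, 0.2, 2.6, 5.1]   # hypothetical noisy readings from the alien bus
print("".join(trit_to_bits(v) for v in signal))  # -> 10000110
```

This is exactly the kind of conversion an ADC front-end would do in hardware; the software cost is a table lookup per symbol.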

_Owen_
July 29th, 2005, 08:02 AM
I would have to agree with LiquidBlue. It may seem like we are getting close, but the few steps we need are more like incredibly large jumps over canyons filled with hot lava.

Owen Macri

Three PhDs
August 6th, 2005, 05:20 PM
I would have to respectfully disagree. I think we have a very, very long way to go before digital computers even approach the level of complexity of a human brain. While I agree that digital computers are much faster, they are still dumb machines that just do what you tell them. They cannot learn like we do, and AI is more like Artificial Stupidity than Intelligence. While many an AI can emulate intelligence, that's all they can do; they can't really think for themselves.

I remember going to a talk on bipedal walking for robots, going over the different control systems used. To us, walking comes naturally after we learn it, and we can traverse almost any surface, regardless of friction. It's all done subconsciously, without us even knowing that we are making subtle adjustments as we move from a carpeted floor, to a wood floor, to a tile floor, to dirt, and then rock, etc. But making robots do this is extremely difficult. Even ASIMO, Honda's robot, can only walk on specific surfaces and can only climb stairs with a specific step height.

And what's with this idea that all robots must turn on their masters? I mean, not all humans become criminals and killers, so why must all robots? That's always seemed like a cliché plot to me.

I wouldn't say so. I believe that in pure terms of processing power we've probably already exceeded the capacity of the human brain. What we are still a long way off from is an operating system with the efficiency and cunning of the human brain. That, I feel, is where binary computing lets us down.

_Owen_
August 6th, 2005, 05:22 PM
The problem is that we know so little about the human brain. It is the most complicated part of the human body; there is so much that we don't know about it.

Owen Macri

~Thor~
August 6th, 2005, 10:55 PM
The problem is that we know so little about the human brain. It is the most complicated part of the human body; there is so much that we don't know about it.

Owen Macri

I read a quote, which goes something along the lines of this:
"If the brain was simple enough to understand, we would be too simple to understand it."

_Owen_
August 6th, 2005, 11:15 PM
That is a very wise comment; who said it?

Owen Macri

Gargen
August 7th, 2005, 04:25 PM
I've actually contemplated this a bit more. Running complex systems like those of the Asgard would slow any computer at base 2, maybe even base 3; ideally they would have a base-8 system. There are programming languages that only have 8 things, brain**** being my example. It's very hard to do anything in, but it's impossible to do anything in base-2 machine code. I doubt the writers of the show really had any idea someone would even notice this, though, as it's the kind of thing a normal person would not even give a **** about.

_Owen_
August 7th, 2005, 05:45 PM
You have attempted to use profanity in your message; would you please delete the word which was not censored out, as it is against GateWorld forum rules to swear.

Owen Macri

cozzerob
August 8th, 2005, 01:14 AM
Please read this. I know it's very long, but I had to write it out twice because my PC shut down before I posted it the first time. So please take the time to read it, and ask any questions you may have.

We use the binary system (1 & 0) in our PCs for several reasons.

1) It is very simple to implement. Hardware that can check for the presence or lack of energy, i.e. a 1 or a 0, is very simple and easy to make. This makes it cost less.

2) As a format, the binary system is very portable. The two-state system is very common throughout nature and can be applied to various other things. For instance, we use floppy disks to store information; floppy disks basically have lots of little magnets inside that are aligned either north or south, indicating 1 or 0 (I know it's a simplified explanation of floppy drives, but it will suffice for the purposes of this post). CDs use pits and flats to store 1 and 0. Even digital TV uses the sending of a signal or the lack of one as 1 and 0 (again, massively oversimplified, but in essence it's right).

However, the binary system for memory does have its drawbacks.

For instance, to store each character we use 8 bits in binary, e.g. 6 = 00000110. The first five 0s are unnecessary in mathematical terms, but all computers store their information in sections called 'bytes', which are made up of 8 'bits' (each either a 1 or a 0). This is done because it gives you 256 possible characters (10 numerical, 26 alphabetical, and lots more). However, this does mean that to store one character of the decimal system, say 6, you need 8 bits in the binary system that your computer uses!
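The byte arithmetic above checks out directly; here is a quick demonstration (the string "Hi" is just an arbitrary example):

```python
# Every character occupies one 8-bit byte, giving 2**8 = 256 possible values.
# The digit 6 really is stored as the post says, as the number 00000110:
assert format(6, "08b") == "00000110"
assert 2 ** 8 == 256

# A whole string is just a run of such bytes:
for byte in "Hi".encode("ascii"):
    print(format(byte, "08b"))   # 01001000 then 01101001
```

So the "8 times the symbols" cost is real, but it buys a uniform alphabet of 256 characters per byte.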

This shows the two obvious drawbacks to the binary system.

1) It requires lots of processing power, because you need to process 8 bits to have one meaningful character, and

2) You need a lot of memory to store all this extra information.

The alternative is to develop technology that can process discrete levels of energy values, i.e. a value of 0, 1, 2, 3 or 4 gives you a five-state system. Using this system has the benefit of using less space, e.g. 6 in decimal = 110 in binary = 11 in base 5. The larger the numbers, the bigger the memory saving.
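The memory saving claimed above grows with the size of the number; a quick digit-count comparison (my own sketch, using a simple counting loop):

```python
# Digits needed to write n in a given base; the gap between base 2 and
# base 5 widens as n grows, which is the saving described above.
def digits_needed(n, base):
    count = 0
    while n:
        n //= base
        count += 1
    return max(count, 1)   # zero still takes one digit

for n in (6, 1000, 10**9):
    print(n, "-> base 2:", digits_needed(n, 2),
          "digits, base 5:", digits_needed(n, 5), "digits")
```

For 6 it's 3 digits versus 2; for a billion it's 30 versus 13, roughly the log(5)/log(2) ratio you'd expect.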

So the base-5 system has obvious advantages in the form of less memory required and less processing power required. However, this system has its own disadvantages.

1) Although the tech to create such memory exists, the cost would be significantly increased, as it is much more advanced.

2) Implementing such a system could be problematic, just from a logistical point of view. You would have some computers running base 5 and some base 2. And we all know how hard Microsoft finds it to program for base 2; how will they cope with base 5?!

3) Many of the basic computer systems would have to be redesigned to cope with this new capacity. They were designed to use base 2 and simply would not work with base 5 (without a conversion device, which defeats the whole point if you're sending info to a peripheral device such as a printer, and becomes even worse if it's on the actual motherboard of the PC: it becomes unworkable and a waste of time).

4) You lose the portability of the base-2 system. Floppy disks are out of the question, and CDs would need to be redesigned, with the lasers made much more powerful to detect the difference between a flat, a full pit, and three pit depths in between.

5) Signal degradation now becomes a problem. One main advantage of digital (i.e. base-2) signals is that they can be amplified back to full quality. How many of you have digital radios? They are better than analogue because when you amplify the signal, you are amplifying a 1 or 0 back to a 1 or 0. It is easy to tell which is which, and the sound quality returns to recording quality. With an analogue radio, however, the further you are from the transmitter, the worse your signal; and because analogue can be any value within a set range, there is no way to tell what it was meant to be in the first place. The interference is amplified along with the original signal, so you get bad-quality radio. The same is true of signals in the PC: it is much harder to check that the PC hasn't made an error with a base-5 system. Also, networking would be majorly more difficult, as you would not be able to go very far with your cables before you needed a signal restorer. This would make networking near impossible.

6) It should now be pointed out that in any system where you want one digit per character, i.e. you want to represent a as a and 6 as 6 in the computer's memory, you will need a system with a huge base, i.e. base 256, so that you can encompass all the characters. As you can see, base 5 is near impossible, so base 256 is just fantasy.

So, all in all, there are very good reasons to use the binary system, and I think that 'superior' races would still be using it.

PS: This was all my own work; I didn't copy and paste, so I'm sorry about any spelling errors.

_Owen_
August 8th, 2005, 07:21 PM
Very nice post, very detailed, good work.

As for using higher bases, of course it would be pretty much fantasy, and we definitely aren't ready for them yet; however, a compromise between the two might work. The computer would mostly run in base 256, but when encoding information to a floppy or CD, or when networking, it would transfer it to binary code. Sort of like, normally you keep your clothes in drawers, but when you go on vacation you put them in suitcases.

Owen Macri

cozzerob
August 9th, 2005, 02:38 AM
Thanx.

What you say is possible, but again, it might be quite tricky to implement properly. However, it might solve the processing power problem; it could effectively speed up our computers by 800%. That's not bad going (especially if you consider that our processing power is getting larger all the time; the 800% plus the average advance in processing power would be huge). However, a problem arises when you consider that you now have to redesign the computer (to use base 256), design translator equipment from scratch, and design from scratch a RAM system that uses base 256 (because currently the RAM uses binary). Having to translate to binary from base 256 and vice versa when using the RAM would nullify the processing benefits of using base 256. Everything goes through the RAM. You'd probably slow down the computer!

_Owen_
August 9th, 2005, 04:10 PM
Lol, 800%! I am getting one of those! Actually, I will just build one. *Sneaks away to submit patent* lol

Owen Macri

Avatar28
August 9th, 2005, 08:08 PM
I would have to respectfully disagree. I think we have a very, very long way to go before digital computers even approach the level of complexity of a human brain. While I agree that digital computers are much faster, they are still dumb machines that just do what you tell them. They cannot learn like we do, and AI is more like Artificial Stupidity than Intelligence. While many an AI can emulate intelligence, that's all they can do; they can't really think for themselves.

I remember going to a talk on bipedal walking for robots, going over the different control systems used. To us, walking comes naturally after we learn it, and we can traverse almost any surface, regardless of friction. It's all done subconsciously, without us even knowing that we are making subtle adjustments as we move from a carpeted floor, to a wood floor, to a tile floor, to dirt, and then rock, etc. But making robots do this is extremely difficult. Even ASIMO, Honda's robot, can only walk on specific surfaces and can only climb stairs with a specific step height.

And what's with this idea that all robots must turn on their masters? I mean, not all humans become criminals and killers, so why must all robots? That's always seemed like a cliché plot to me.

EDIT: Sorry, bit of an off-topic rant there. You can always go from one number system to another; that's not difficult. The basic principle of digital computers in real life (as previously stated) is that a 1 represents a switch that is on and a 0 represents a switch that is off. (Actually it could be the other way around, or any number of things; basically, a computer deals with two voltage levels on a single line, a bit: a high voltage level and a low voltage level.) For the purposes of this post, let's assume 5V = 1 and 0V = 0. This is why the base-2 number system is used.

Theoretically, an alien computer (that for our purposes runs on electricity) could use a base-3 system because it uses three voltage levels, say 5V = 2, 2.5V = 1, and 0V = 0. We could still interface with this computer by using an analog-to-digital converter, so we could detect those three levels and convert them into a base-2 system (5V = 10, 2.5V = 01, 0V = 00). It's plausible, but I would agree that the time it would take to figure out what base number system an alien computer is using, what the various machine codes are, and how it stores and retrieves data would be very long.

I said the power of the human brain, nothing about intelligence. The human brain is basically an analog neural network. Like other types of computers, it excels at certain types of things and is bad at others. It's estimated that to replicate the power of the average human brain would take a "normal" computer running at around 100 million MIPS (that is, 100 trillion instructions per second). Storage is likewise best estimated at around 100 million megabytes (again, 100 trillion bytes).

Unfortunately, supercomputers are not measured in mips any longer but rather in giga (and now tera) flops which is a measure of how many calculations they can do per second, not necessarily how many instructions.

_Owen_
August 9th, 2005, 08:33 PM
When you put it that way, you make the human brain seem simple! lol. The brain is still very complicated, and while we can guess and estimate at some aspects of it, and know others, there are still huge uncharted regions.

Owen Macri

robinmdh
August 10th, 2005, 06:22 AM
Oh, ya, we absolutley cannot let antificial sentient life be created, COUGH Battlestar Galactic COUGH Cylons COUGH tons of other science fiction shows COUGH including I, Robot... sorry, I can't cough anymore.

Owen Macri
Can't agree with you on that. Read Asimov's original "I, Robot",
and perhaps some of his other robot books.

First Law:
A robot may not injure a human being, or, through inaction, allow a human being to come to harm.

Second Law:
A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.

Third Law:
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

There is also a Zeroth Law, which is made by a robot because he thinks the three laws are not enough, and it's that law which allows the robots in the movie to hurt people; but the robot who does that is obviously malfunctioning.
Zeroth Law:
A robot may not injure humanity, or, through inaction, allow humanity to come to harm.

_Owen_
August 10th, 2005, 05:58 PM
Ya, I was just referring more to the movie and how the robots decided to kill everyone, lol.

I still think it is too dangerous to create artificial life.


Owen Macri

Gargen
August 11th, 2005, 02:35 PM
It's also almost impossible to design an algorithm that would create an AI (I've tried, lol; it made my head ache).

_Owen_
August 11th, 2005, 02:49 PM
Lol, I want to try... wait, no I don't. Well, I do, but I only want to try for the challenge, I still don't want an A.I. However it could do my homework...

Owen Macri

Three PhDs
August 11th, 2005, 04:21 PM
It's also almost impossible to design an algorithm that would create an AI (I've tried, lol; it made my head ache).
So when a neural net starts to learn... what's that?

_Owen_
August 11th, 2005, 04:42 PM
So when a neural net starts to learn... what's that?
Its first day of school! lol.

Owen Macri

robinmdh
August 13th, 2005, 09:00 AM
So when a neural net starts to learn... what's that?
Technically, you can only call it emergent behaviour.

It's not direct, intentional programming; the behaviour emerges from simple code. It's like when you see a flock of birds move as one: they don't. One moves and the others follow...
It seems intelligent but is not (not in our sense, anyway). Officially, AI is supposedly proven if you can talk to a computer for over an hour and not know you're talking to a computer (that'd rule Data out ;) ).

So a neural net is a bit of a grey area, but it isn't AI (yet); it's just a self-optimizing system.
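The "self-optimizing" point can be shown with a single perceptron: nothing in its update rule mentions the AND function, yet the behaviour emerges from repeated small weight corrections. A toy sketch (the function and constants are mine, and this is of course not a claim about real intelligence):

```python
# A single perceptron "learning" AND via the classic error-correction rule.
# The update rule never encodes AND explicitly; correct behaviour emerges
# from repeated small adjustments to two weights and a bias.
def train_perceptron(samples, epochs=20, rate=0.1):
    w0, w1, bias = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            out = 1 if w0 * x0 + w1 * x1 + bias > 0 else 0
            err = target - out            # 0 when the guess was right
            w0 += rate * err * x0
            w1 += rate * err * x1
            bias += rate * err
    return w0, w1, bias

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(AND)
for (x0, x1), target in AND:
    assert (1 if w0 * x0 + w1 * x1 + b > 0 else 0) == target
```

It "learns" in exactly the grey-area sense discussed here: the system optimizes itself toward the data, with no understanding involved.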

_Owen_
August 15th, 2005, 10:55 PM
You are right, there are situations in which a computer may learn; however, I would not call it artificial intelligence.

Owen Macri