2 Answers
I'm getting this from Intro to PC and Macintosh: hard drives and Random Access Memory (RAM) use the same units of measure, which are:
Bit: Binary digit. The smallest unit of information a computer can hold. The value of a bit (either 1 or 0) represents a two-way choice (on or off, true or false, black or white, etc.).
Byte: 8 bits = 1 byte. A byte is equal to 1 character of text.
Kilobyte = 1024 bytes, that is, 1024 characters of text stored.
Megabyte = 1024 kilobytes. Since each kilobyte is 1024 bytes, 1 megabyte = 1024 x 1024 = 1,048,576 bytes, or one million, forty-eight thousand, five hundred and seventy-six characters of text stored.
Gigabyte = 1024 megabytes, therefore 1 gigabyte = 1024 x 1024 x 1024 = 1,073,741,824 bytes. If you round a megabyte down to about 1,000,000 characters of text (letters, numbers, etc.) and multiply it by the 1024 megabytes in a gigabyte, you get a little over 1 billion characters of text stored in 1 gigabyte, because a thousand million equals 1 billion.
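As a quick sketch of the arithmetic above (the constant names here are just illustrative, not from any particular library), the units are successive powers of 1024:

    # Each binary unit is 1024 times the previous one.
    BYTE = 1                      # 8 bits, roughly 1 character of text
    KILOBYTE = 1024 * BYTE        # 1,024 bytes
    MEGABYTE = 1024 * KILOBYTE    # 1,048,576 bytes
    GIGABYTE = 1024 * MEGABYTE    # 1,073,741,824 bytes

    print(f"{KILOBYTE:,}")   # 1,024
    print(f"{MEGABYTE:,}")   # 1,048,576
    print(f"{GIGABYTE:,}")   # 1,073,741,824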
3.5 megabytes = 3.5 x 1024 = 3,584 kilobytes = 3,584 x 1024 = 3,670,016 bytes, or three million, six hundred seventy thousand and sixteen bytes.
This seems to be correct, but you can check it yourself by applying the above units of measure.
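To check it yourself, here is a minimal sketch of the 3.5-megabyte conversion, assuming the binary (1024-based) units used above rather than decimal (1000-based) ones:

    # Verify the 3.5 MB conversion step by step.
    megabytes = 3.5
    kilobytes = megabytes * 1024        # 3,584 kilobytes
    total_bytes = kilobytes * 1024      # 3,670,016 bytes

    print(f"{kilobytes:,.0f} KB")       # 3,584 KB
    print(f"{total_bytes:,.0f} bytes")  # 3,670,016 bytes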
I hope I don't have to go through all this to split the net profits at http://www.mcbikergear.com 50/50.