Summary: | In this unit we provide a general introduction to number systems and discuss how numbers are represented by computers. We start with the three main systems that occur in computing applications: decimal (base 10), binary (base 2), and hexadecimal (base 16), and describe methods for converting between these three bases. A (very) brief discussion is also presented on conversions involving other bases such as octal (base 8). We then apply the familiar techniques for adding and subtracting decimal numbers to perform these operations manually on binary numbers. The discussion moves on to how computers store and represent positive and negative numbers, and the concept of signed and unsigned binary numbers is introduced. We present different approaches used by computers for storing numbers, focusing on two’s complement representation. The unit closes with a brief look at a selection of bitwise operators, supported in programming languages such as Java and C, which operate on binary numbers at the bit level by treating them as strings of bits. |
---|---|
Creators: | |
Divisions: | Academic > School of Computing, Engineering and Built Environment > Department of Computing |
Copyright holder: | Copyright © Glasgow Caledonian University |
Viewing permissions: | World |
Depositing User: | |
Date Deposited: | 11 Feb 2019 12:02 |
Last Modified: | 20 Jun 2019 14:05 |
URI: | https://edshare.gcu.ac.uk/id/eprint/4566 |