IT Basics

Binary and UX/UI

Hello World! In this blog post I will discuss UI/UX and Information Technology basics, such as the relationship between binary and computer hardware.

Human-Computer Interaction

HCI is the study of how people interact with technology. The field arose in the 1980s, when the first personal computers started popping up on the market.

HCI has two disciplines: User Interface (UI), which covers how visually appealing and simple to use an application is, and User Experience (UX), which covers whether an app has the functionality it needs to appeal to its users.

There are four principles of HCI:

Usability

How easy is the application for new users to learn?

Effectiveness

How well does the application help with a task? What percentage of the goal is the application able to achieve?

Efficiency

How quickly does the computer complete the task, and how many resources does it use along the way? This may not seem important for a single user, but in companies hundreds of copies of the same process run, so the runtime and resources add up.

Satisfaction

How happy is the user after using the application? Were they satisfied with every step of the process?

By following these four principles you will be well on your way to designing a successful human-computer interaction.

Binary and Hexadecimal

Let's take a step back to get into computer basics and learn how they really run.

Computers use the binary system. This is because a computer's data exists as physical signals, such as electrical charges that are either present or absent, which can be represented as 1s and 0s. The computer turns the physical signal into digital data, and it can then use that digital data to perform computations and calculations.
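As a quick sketch in Python (any interpreter will do; the 0b prefix is just Python's way of writing a binary literal), you can see that a binary pattern and an ordinary number are the same data, and arithmetic works on it either way:

```python
# The prefix 0b tells Python the digits that follow are binary.
pattern = 0b1011        # the bit pattern 1011 is the number 11
print(pattern)          # 11

# Arithmetic works on the binary data directly.
total = 0b1011 + 0b0101  # 11 + 5
print(bin(total))        # 0b10000, which is 16
```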

A bit is an individual 0 or 1 in a binary sequence. A byte is a group of 8 bits and can represent values from 0 to 255.
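A short Python sketch of why a byte tops out at 255: with 8 bits there are 2^8 = 256 possible patterns, from 00000000 up to 11111111.

```python
bits_per_byte = 8

# Number of distinct patterns 8 bits can hold.
print(2 ** bits_per_byte)      # 256 values, so 0 through 255

# The all-ones byte is the largest value a byte can hold.
print(int("11111111", 2))      # 255
print(format(255, "08b"))      # 11111111
```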

Hexadecimal uses 16 digits (0-9 and A-F), and its main appeal is how cleanly it translates to binary. If you have a long binary sequence, you can split it into groups of 4 bits and get a much shorter, more readable representation, since each group maps to exactly one hex digit.
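Here is a rough Python illustration of that grouping (the 16-bit string is just a made-up example): every 4-bit group becomes one hex digit, so the whole number reads as four characters instead of sixteen.

```python
binary = "1101011110101100"   # 16 bits, hard to read at a glance

# Split the string into groups of 4 bits.
groups = [binary[i:i + 4] for i in range(0, len(binary), 4)]
print(groups)                 # ['1101', '0111', '1010', '1100']

# Each 4-bit group is exactly one hex digit.
hex_digits = [format(int(g, 2), "X") for g in groups]
print("".join(hex_digits))    # D7AC

# Converting the whole number at once gives the same answer.
print(format(int(binary, 2), "X"))   # D7AC
```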

ASCII and UNICODE

Now that we know how computers talk, we need something to translate that into languages humans can understand. That's where ASCII comes in. ASCII is a 7-bit code that maps binary values to English characters. Eventually we needed more characters, like accented letters, so people started using the full 8 bits of a byte to add more.
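A small Python sketch of the ASCII idea: every character has a numeric code, that code fits in 7 bits (0 through 127), and translating a list of codes back into text gives you readable English.

```python
# ord() gives a character's numeric code; chr() goes the other way.
print(ord("A"))                  # 65
print(chr(65))                   # A

# The code fits in 7 bits.
print(format(ord("A"), "07b"))   # 1000001

# Decoding a sequence of codes back into English text.
codes = [72, 105, 33]
print("".join(chr(c) for c in codes))   # Hi!
```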

This system worked well enough until sending files internationally became common. Unicode was invented to fix this issue. Unicode assigns a unique code point to every character, going far beyond the 128 or 256 codes a single byte allows (the original design used 16 bits, and modern Unicode covers even more), so computers around the world can access files from around the world without any issues.
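A quick Python sketch of the Unicode side: characters outside ASCII get larger code points, and an encoding such as UTF-8 stores them as more than one byte.

```python
# Unicode assigns a code point to every character, not just English ones.
print(ord("é"))                 # 233, too big for 7-bit ASCII
print(ord("日"))                 # 26085

# UTF-8 encodes larger code points as multiple bytes.
print("A".encode("utf-8"))      # b'A'             (1 byte)
print("é".encode("utf-8"))      # b'\xc3\xa9'      (2 bytes)
print("日".encode("utf-8"))      # b'\xe6\x97\xa5'  (3 bytes)
```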

In the next blog I will go into how different media types are processed into binary and how to format them.