
What is Coding/Programming: The Complete Step by Step Guide

Feb 22, 2018

I think you’ll agree that:

We use technology on a daily basis.


…but just because we use technology daily doesn’t mean we understand how it works.

Douglas Rushkoff, the author of Program or be Programmed, defines this issue perfectly.

“When human beings acquired language, we learned not just how to listen but how to speak. When we gained literacy, we learned not just how to read but how to write. And as we move into an increasingly digital reality, we must learn not just how to use programs but how to make them. In the emerging highly programmed landscape ahead, you will either create the software or you will be the software. It’s really that simple: Program, or be programmed.”

Douglas Rushkoff

Whether you agree with this quote or not, he makes a really good point.

When we were taught a skill such as language or reading, we were also taught the back end of it – no pun intended.

As much as we use technology, it’s almost crazy not to know how it works, even if it’s only a little.

And today, I’m going to show you exactly what coding is and how it works with technology.


Now, you may be wondering:

What is coding?

Well, if you google the definition of coding, you’ll find it’s pretty broad.

Going by that dictionary definition alone, coding could simply mean writing a secret message in code.

But in this instance, coding is the process of writing source code for a computer to run.

Still confused?

Well, let me explain.

But before I do, let’s define code:

Basically, code is a set of instructions for your computer.

Let me translate:

Coding is the process of writing the instructions for your computer, while the code itself is just those instructions.

Now, you may also be wondering:

What is programming?

Instead of just writing the code, programming is the process of writing the whole program.

Now, let’s quickly define what a program is:

You see?

Programming is the process of writing the program, while the program itself is the final product.

It’s the whole process of developing the program, from an empty screen to a fully working program.

Which ultimately brings us to:

What is an algorithm?

When you write a program, you’ll have different ways of telling your computer step-by-step how to run the program.

An algorithm is basically that process.

If you define it at its most basic level, an algorithm is the step-by-step process for solving a given problem.

For example:

An algorithm in everyday life could be the steps to prepare a meal.

In programming, these steps are a lot more complex, but just as you can take different steps to prepare the same meal, you can take different approaches in a program.

You can use different algorithms to accomplish the same goal, but each one has its own benefits and drawbacks.
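To make that concrete, here’s a quick sketch in Python (the list and function names are just made up for illustration) of two different algorithms that solve the same problem: finding a number’s position in a sorted list.

```python
# Two algorithms, one goal: find the position of a target number in a sorted list.

def linear_search(numbers, target):
    """Check every item, one by one. Simple, but slow for long lists."""
    for index, value in enumerate(numbers):
        if value == target:
            return index
    return -1  # not found

def binary_search(numbers, target):
    """Repeatedly cut the list in half. Much faster, but the list MUST be sorted."""
    low, high = 0, len(numbers) - 1
    while low <= high:
        middle = (low + high) // 2
        if numbers[middle] == target:
            return middle
        elif numbers[middle] < target:
            low = middle + 1
        else:
            high = middle - 1
    return -1  # not found

scores = [3, 8, 15, 23, 42, 77, 91]
print(linear_search(scores, 42))   # 4
print(binary_search(scores, 42))   # 4
```

Both return the same answer, but the trade-off is exactly what programmers weigh: linear search works on any list, while binary search is much faster on large lists but only works on sorted data.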

So why should you learn them?

There are really two reasons why programmers study algorithms:

First, they can study different types of algorithms and start recognizing patterns.

Second, they can start working out the best route to take when solving a problem in their own programs.

Now, you may be wondering: why are algorithms so important?

Well, let me ask you:

Which one would you rather develop, Google or Bing?

Obviously, Google, right?

At the moment, Google has a market share of 74.52%, while Bing only has 7.98%.

But why is this the case?

In order to explain, we need to go back to the ’90s.

Back then, before search engines were even a thing, there was NOTHING but plain websites.

To show you what it looked like, here’s a picture of the first webpage:

As you can see, it’s nothing fancy. There are no flashy pop-up images, ads, or videos.

It’s just some plain text and links.

And the internet was made up of these pages, but there was no way to search through them.

It was honestly a complete mess…

Then came AOL and Yahoo.

They were good.

But it was Google that really changed the game.

Google was the first one to use links from other sites to rank sites by popularity and relevance.

And it worked.

Since then, Google has kept refining its algorithms into the search engine we know today.


To be fair, Bing is not a bad search engine at all.

But over time, Google has consistently shown more relevant search results.

Even Bill Gates, founder of Microsoft (the company behind Bing), said back in 2004:

“We took an approach that I now realize was wrong,” he said.

“Our strategy was to do a good job on the 80% of common queries and ignore the other stuff.”

“But that’s not what counts. It’s the remaining 20% that counts… because that’s where the quality perception is.”

Google was “way better”, he said, for people investigating a rare disease, exploring a hobby, or searching for a specific restaurant.

Bill Gates

Obviously, in the decade-plus since then, Microsoft has made some major updates to its algorithms and gotten a lot better at handling specific search queries.

But by the time they did, Google had already taken over the market.

What can we learn from this?

Was it because Google used some fancy algorithm?

Yes, but…

Was it because Google saw an opportunity to improve a problem?

Yes, but…

Both are equally important.

You need to think of a problem that needs fixing, or a way to improve an existing solution.

And you need a good algorithm to support it.

Google wasn’t the first one to think of the idea of organizing the web.

They just saw an opportunity to make it more efficient.

And developed the right algorithms to support their idea.

How do coding and programming work

Now that you have a basic idea of coding, you can give yourself a pat on the back.

Not many people even know the basics.

But when you know how coding works with your computer, then…

…and only then, you’ll be able to truly grasp what coding is.

And today, I’m going to help you connect the dots.

Let’s get started!


In order to understand how it works, we first need to cover:

Programming Languages

A programming language is a unique set of syntax rules used to write code. It’s a lot like writing in a human language, but instead of following our grammar rules, each programming language has its own set of rules.

When you’re defining programming languages, you’re usually talking about a:

High-level programming language

High-level programming languages use concepts and abstractions that are easier to grasp than those in lower-level programming languages.

For example:

Python is considered to be a pretty high-level language due to its easy-to-read syntax.

But a language such as C, while still high-level, is considered lower level than Python because its syntax can be a little more complex.

In contrast:

Low-level programming languages provide few, if any, of the abstractions used in higher-level languages.

An example of a low-level programming language is machine language.

A machine language uses binary numbers to give your computer instructions directly. It’s nearly impossible for programmers to write machine code by hand because it’s all numbers.

That’s why an assembly language or a higher-level language is used to write the code instead.

An assembly language is one step above machine language. It uses the same set of instructions as machine language, but instead of numbers, it uses short names.

A computer can only understand machine-level code.

So when you write a program in a higher-level language, it has to be converted.

There are two ways to go about this:

First, a compiler is a program that converts a high-level language into a lower-level one.

When you use a compiler, it takes the high-level source code and turns it into assembly language or directly into machine code.

If the compiler generates assembly code, that code goes through an assembler to produce machine code, and a linker then combines all of the resulting code and libraries into the final program.

An assembler is a program that converts assembly code into machine code.

If the compiler generates machine code directly, it handles the jobs of the assembler and linker itself and simply produces the final machine code.

Tech Differences does a good job of explaining the main differences:

Second, an interpreter is a program that directly executes the high-level language source code without having to compile it to machine code.

Again, Tech Differences does a good job of explaining the main differences:
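To make this a little more concrete: Python is usually described as an interpreted language, but under the hood the standard CPython interpreter first compiles your source code into bytecode and then executes it. Here’s a minimal sketch using Python’s built-in compile(), the dis module, and exec() to peek at that step:

```python
import dis

# A tiny piece of "source code" stored as a string.
source = "print(2 + 3)"

# Step 1: compile the source into a bytecode object (not machine code,
# but a lower-level set of instructions for Python's virtual machine).
bytecode = compile(source, "<example>", "exec")

# Step 2: show the individual instructions the interpreter will run.
dis.dis(bytecode)

# Step 3: actually execute the compiled bytecode.
exec(bytecode)   # prints 5
```

The disassembly you see isn’t machine code, but it gives you a feel for the lower-level instructions that friendly high-level code eventually turns into.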


No coding guide would be complete without showing you a few actual lines of code.

Here’s the ‘Hello World!’ source code in Python and C.

Beginner programmers often learn this as their first program because it’s easy.

All it does is show the words ‘Hello World!’
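In Python, the whole program is typically just one line:

```python
# Python: print a message to the screen.
print('Hello World!')
```

And here’s the same program written in C:

```c
/* C: print the same message to the screen. */
#include <stdio.h>

int main(void)
{
    printf("Hello World!\n");
    return 0;
}
```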



As you can see, Python reads more like English, while C looks a little more complex.

Do you remember that:

Machine language uses binary numbers to give your computer instructions.

Well, those binary numbers consist of 1s and 0s that give instructions to your computer’s CPU.

But how does a computer’s CPU understand just a bunch of 1s and 0s?

Good question.

But before we can explain how a CPU works with code, we need to cover how:

Binary numbers work

Binary works a little differently from what we’re used to.

We use a system called decimal, or ‘base ten’, which uses the digits 0 through 9 to count.

But binary or ‘base two’ only uses 0 and 1 to count.

Let’s learn how to count in binary.

Since it uses 0 and 1 to count, we need to start with 0:

  • 0 = 0

Then you add 1.

So: 0 + 1 = 1

  • 1 = 1

Now what? We’ve run out of single digits, so we carry over and start a new column, just like going from 9 to 10 in decimal.

  • 2 = 10

Then add a 1.

  • 3 = 11

Can you guess what comes next?

If you said 100, you would be right.

  • 4 = 100

Just as 10 comes after 9 in decimal, binary goes from 1 to 10 and from 11 to 100.

Let me guess:

You’ve been saying 100 like one hundred.

Well, you actually say it as one zero zero.

Now, let’s continue to count.

  • 5 = 101
  • 6 = 110
  • 7 = 111
  • 8 = 1000
  • 9 = 1001
  • 10 = 1010
  • 11 = 1011
  • 12 = 1100
  • 13 = 1101
  • 14 = 1110
  • 15 = 1111
  • 16 = 10000

I hope you can see the pattern.

Instead of gaining a new digit every time you reach a power of 10 as in base ten, binary gains a new digit every time you reach a power of 2.

For example:

  • 2 = 10
  • 4 = 100
  • 8 = 1000
  • 16 = 10000

But how would you convert binary to decimal?

Let’s take a random binary number, such as 10111.

You take the rightmost digit and multiply it by 1. Then, each time you move one place to the left, you double the multiplier: 1, 2, 4, 8, 16, and so on.

As so:

  • 1 x 1 = 1
  • 1 x 2 = 2
  • 1 x 4 = 4
  • 0 x 8 = 0
  • 1 x 16 = 16

As you can see, the digits on the left are the same as 10111, just read from the bottom up.

Now, add up the results on the right to get the decimal number.

  • 1 + 2 + 4 + 0 + 16 = 23

For fun, let’s continue counting from 16 to 23 to see if we were correct.

  • 16 = 10000
  • 17 = 10001
  • 18 = 10010
  • 19 = 10011
  • 20 = 10100
  • 21 = 10101
  • 22 = 10110

And like clockwork:

  • 23 = 10111
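If you’d rather let code do the counting for you, Python can double-check this with its built-in int() and bin() functions. Here’s a small sketch, including a by-hand version that mirrors the multiply-and-add steps above:

```python
# Convert the binary string '10111' into a decimal number using the built-in int().
print(int('10111', 2))   # 23

# Convert 23 back into binary using the built-in bin().
print(bin(23))           # 0b10111  (the '0b' prefix just marks it as binary)

# The same conversion done by hand, mirroring the multiply-and-add steps above.
def binary_to_decimal(binary_string):
    total = 0
    place_value = 1                       # 1, 2, 4, 8, 16, ... from right to left
    for digit in reversed(binary_string):
        total += int(digit) * place_value
        place_value *= 2
    return total

print(binary_to_decimal('10111'))         # 23
```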

Okay, now that you know how to count in binary…

How does this help me understand how a computer uses them?

Well, computers can only understand two basic states: on or off.

Now that I’ve confused you even more, let me explain.

1 = on


0 = off

But wait, how does a computer understand on or off?

Well, that’s where a transistor comes into play.

A transistor controls current by acting as an electronic switch.

When the computer receives an instruction of 1 or 0, a switch gets flipped.

All About Circuits shows this perfectly in a diagram.


In the picture above, the input of the switch is at 5 volts and the output is at 0 volts. A binary 1 usually represents 5 volts.

How about when the switch gets flipped?


As you can see, the switch has been flipped. The input is now 0 volts and the output is 5 volts. A binary 0 usually represents 0 volts.

Now, I’m going to quote All About Circuits when I say this:

“What we’ve created here with a single transistor is a circuit generally known as a logic gate, or simply gate.”

They showed what a logic gate does, which brings us to:

Logic gates

Logic gates are made up of tiny transistors and control the flow of electric current in your computer.

There are seven main logic gates: AND, OR, XOR, NOT, NAND, NOR, and XNOR.

We’re only going to look at the AND gate and the OR gate today, because what matters here is understanding the basic function of a gate, not memorizing what each one does.

Here’s what the logic gate AND looks like:

Each logic gate has a truth table to show you how it functions.


The truth table shows us that if the input of the gate is 0 on A or B, the output will be 0.


The input has to be 1 on both A and B for the output to be 1.

Now, let’s cover the OR gate.


This truth table now shows us that if the input is 0 for A and B, then the output will be 0.

But if the input is 1 for A or B, then the output will be 1.
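If it helps to see this in code, here’s a tiny Python sketch (just an illustration, not how gates are physically built) that models the two gates as functions and prints their truth tables:

```python
# Model the AND and OR gates as simple functions on 0/1 inputs.

def and_gate(a, b):
    return 1 if (a == 1 and b == 1) else 0

def or_gate(a, b):
    return 1 if (a == 1 or b == 1) else 0

# Print the truth table for every combination of inputs.
print("A B | AND OR")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} |  {and_gate(a, b)}   {or_gate(a, b)}")
```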

Now that you know what logic gates are, let’s show you what combinational circuits are.

Combinational circuits are basically a combination of different logic gates.

There are different forms of combinational circuits, such as adders and subtractors.

But today, I’ll just show you an adder.

As you can see, AND and OR gates are part of the adder. So let’s look at the truth table to see how the electric current flows through it.


Since it’s a combination of different logic gates, it can get a little confusing.

So here’s a visualization of it:

As you can see, it’s a lot easier if you visualize it and break it down into smaller parts.

According to Wikipedia, “An adder is a digital circuit that performs addition of numbers. In many computers and other kinds of processors adders are used in the arithmetic logic units or ALU. They are also utilized in other parts of the processor, where they are used to calculate addresses, table indices, increment and decrement operators, and similar operations.”
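As a rough sketch of that idea, here’s a full adder modeled in Python out of XOR, AND, and OR gate functions. It isn’t necessarily the exact circuit pictured above, but it shows how combining simple gates lets a computer add two binary digits plus a carry:

```python
# A full adder built from basic gates: it adds two binary digits plus a carry-in,
# and produces a sum bit and a carry-out bit.

def xor_gate(a, b):
    return 1 if a != b else 0

def and_gate(a, b):
    return 1 if (a == 1 and b == 1) else 0

def or_gate(a, b):
    return 1 if (a == 1 or b == 1) else 0

def full_adder(a, b, carry_in):
    partial_sum = xor_gate(a, b)
    sum_bit = xor_gate(partial_sum, carry_in)
    carry_out = or_gate(and_gate(a, b), and_gate(partial_sum, carry_in))
    return sum_bit, carry_out

# Truth table: every combination of the three inputs.
print("A B Cin | Sum Cout")
for a in (0, 1):
    for b in (0, 1):
        for cin in (0, 1):
            s, cout = full_adder(a, b, cin)
            print(f"{a} {b}  {cin}  |  {s}    {cout}")
```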

As I said before, there are different types of circuits, and each one has its own function in helping the computer work.


I know that was a lot, but I hope it showed you how the different parts of a computer work together with your code.

What does it take to code or program

Have you ever wondered:

How do you develop a program?


What does it mean to be a programmer?


How many lines of code are in the huge programs that we use every day?

Well, you’re in luck.

Because I’m going to answer all of these questions in a second.

Let’s start with the first question:

How do you develop a program?

Tip #1: you’ll want to define why you want to build the program.

For example:

Do you want to automate payments?

Do you want to design a fun game?

Do you want to automate something on your computer that you’re tired of doing?

Tip #2: you’ll want to come up with a plan.

Did you know?

80% of software projects fail because they’re missing features, running late, or over budget.


30% are canceled before they’re even completed because of poor planning.

A lot can go wrong in the blink of an eye.

And I’m not exaggerating.

So it’s important to do your research.

Look at similar programs.

For example, let’s say you wanted to develop a program that took a screenshot of your computer screen.

You would first find similar programs.

Then you would go through each one of them and see what features they have.

The screenshot above was actually taken with Skitch.

Let’s compare the same screenshot in Lightshot.

It’s the same feature. But the arrows are designed differently.

It’s really the little features that you should be looking at though.

See what you like about each one.

See what you didn’t like about each one.

And then set up your own plan.

Write down what features you want your program to have or don’t want to have.

See if you can think of a new feature that can help people take better screenshots.

And ultimately, a plan will set you up for:

Tip #3: Code the program

Follow your plan and change it as you go.

You’ll want to code at least a solid foundation.

Because you can always change it later.

You can add or even delete features as you please.

As long as you don’t infringe on other people’s work, you can do whatever you like.

And once you have a solid foundation for your program…

Tip #4: you’ll want to test the program.

Try to fix as many bugs as you can before you release it.

Because as you’re going to find out, new bugs will appear, and often.

Bugs go hand in hand with developing a program.

Once you’re done coding and testing…

Tip #5: you’ll want to maintain it.

Polish it up before you make it live.

Because if you want people to use it, it has to be good.

Once it’s live…

Then you’ll want to keep updating it as much as you can.

Fix flaws that you may not have found before.

And make it a great program for people to use.

Maybe, you can even sell it and make some money off of it!

Now that you know some tips for developing a program…

Let’s move onto the second question:

What does it mean to be a programmer?

In order to be effective at programming, you need technical knowledge combined with a number of other skills to really succeed at coding.

But what are the skills you need to be good at coding?

Problem solving

When it comes to coding:

It doesn’t matter if it’s designing a whole program, putting words on a screen, or solving a problem with the code…

You’re always trying to solve a problem.

So it’s important to know how to solve problems.

It pretty much goes hand in hand with coding.

John Sonmez lists it as the number 1 skill you should have and says:

“Software development is 100% about solving problems. Without problems there wouldn’t be a need for software.”

John Sonmez

Programmers often have to think ‘why is this not working’ or ‘how can I get my program to work’.

And knowing when to google something or when to ask for help is sometimes necessary.



Patience and self discipline

You’ve spent numerous hours, maybe even days, on a part of a program.

And it breaks…

You’re going to have to restart from scratch, which is often the case the first time you run it.

All that hard work goes down the drain.

And that’s why you need patience and self discipline.

Can you even imagine how difficult it is to put all your hard work into something, only for it to break?

It’s the worst feeling ever.

But it’s worth it.

Because when it does work, you’ll get the best feeling in the world.

And you never know how that project will turn out.

Maybe, it will be the next big thing.

Ability to self learn

Coding is changing faster than ever.

What worked yesterday may not work today.

So you need the ability to self learn and have the drive to obtain knowledge.

Because even after you’ve spent hours trying to solve a problem in your code, you may still have to go and learn new techniques while you’re drained.

And let’s face it, coding is no easy task.

On top of having to solve problems and having the patience of a monk, you have to stay at the top of your game.


Read blogs, learn the newest techniques, be diligent, and become the person people go to for help.

Now for the last question:

How many lines of code are in the huge programs that we use every day?

Have you ever wondered:

What does it take to code programs such as Photoshop, Google, Google Chrome, Facebook, Windows, or even an app?


Probably not.

But here’s a graph, anyway.

It shows you how many lines of code are in each program.

I actually got the idea for this graph from the site Information is Beautiful.

I used all of the data from their infographic, which you can find right here.

Note: Some of this information is outdated, but that REALLY doesn’t matter.

Remember, programs are constantly changing, through updates, bugs, and new features.

So don’t nitpick this graph.

Use it as a rough estimate, or a guide if you will, to give you an idea of what it takes to write these programs.

As you can see, writing a heavy-duty program is no easy task. It takes millions, if not billions, of lines of code to write the programs that we use every day.

Obviously, this isn’t an overnight process. It takes years, whole teams, plenty of planning, and a lot of money for these programs to be as good as they are.

But don’t worry.

A simple app can take A LOT less code than some of these heavy-duty programs.

As it says in the chart, an app can take around 40,000 lines of code.

And for websites, there are great options for working with pre-coded templates, where knowing how to code will definitely help you build a beautiful design.

But at least you won’t have to start from scratch.

And you can learn from them.

Why should you learn to code

An online survey done by Rasmussen College randomly questioned 2,009 adults and found:


"People who think the internet was overwhelming.”


"People who can't live without the internet.”

That means there were:


"People who think the internet was overwhelming and can't live without it.”

They also found:


"18-34-year-old respondents said they find the internet scary…"

Those findings are actually kinda sad.

We are glued to our electronics for hours, and the fact that we don’t know how they work is just pure ignorance.

But if you learn how to code:

You’ll learn how a computer works.

Coding will give you a better understanding of how your computer works.

I hope you have a better idea after reading this article.

But if you get good enough, you’ll really learn the ins and outs.

You want to automate a process that usually takes you hours?

No problem, build a program.
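For example, here’s a minimal Python sketch (the folder path and file names are hypothetical) that renames a whole folder of files in one go, the kind of chore that would take ages by hand:

```python
from pathlib import Path

# Hypothetical folder full of files like 'report (1).txt', 'report (2).txt', ...
folder = Path("C:/Users/you/Documents/reports")

# Rename each text file to a tidy, numbered name such as 'report_001.txt'.
for number, old_file in enumerate(sorted(folder.glob("*.txt")), start=1):
    new_name = f"report_{number:03d}.txt"
    old_file.rename(folder / new_name)
    print(f"Renamed {old_file.name} -> {new_name}")
```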

Do you want to be more secure online?

Learning to code will teach you the best practices for staying more secure.

You see?

Programming is a way for you to be more productive and secure.

I mean, just look at this argument made in the Telegraph:

“Becoming literate in how the technical world works is equivalent to reading, writing and maths. We need to look at this fourth literacy as mainstream,” he says.

“Not just at the level of the very basics of operating a computer but actually understanding how the code and mechanics behind it work. In the same way that if all you had was oral communication and you didn’t have writing, you really wouldn’t understand the logic of our society.”

Mark Surman

Executive Director of the Mozilla Foundation

And that brings us to:

Learning how to code will make you smarter

When you’re developing a program, you need to have the end in mind, such as building a game.

But getting there:

You use a skill called computational thinking.

“Computational thinking allows us to take a complex problem, understand what the problem is and develop possible solutions. We can then present these solutions in a way that a computer, a human, or both, can understand.”

It allows us to think through the steps needed to solve a given problem.

Besides just solving the problem, a programmer has to break it down to its very core.

They have to think about how a computer will be able to understand and solve a problem, such as a math problem.

And they have to analyze and gather data for the problem at hand.

Then they build an algorithm from that data so the computer can solve the problem.

For example, if you wanted to create a video game, you’ll need some physics knowledge.

Let’s say you created a game where you have to balance a ball on a platform.

You need to take into account that if the player moves the platform X amount, the ball will move Y amount.

All while taking into account how heavy the ball is, and on and on and on.
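To give you a taste of that kind of thinking, here’s a deliberately simplified Python sketch. The numbers and the formula are made up for illustration (real game physics is more involved), but it shows how a programmer has to turn “tilt the platform, move the ball” into concrete steps:

```python
# A toy simulation: a ball rolling on a tilted platform.
# More tilt makes the ball speed up faster; in this made-up model,
# a heavier ball responds more slowly.

ball_position = 0.0     # where the ball sits along the platform
ball_velocity = 0.0     # how fast it's currently rolling
ball_mass = 2.0         # heavier ball = slower response (toy assumption)
platform_tilt = 5.0     # how far the player has tilted the platform
time_step = 0.1         # how much time passes per update

for step in range(10):
    acceleration = platform_tilt / ball_mass    # more tilt or less mass = faster
    ball_velocity += acceleration * time_step
    ball_position += ball_velocity * time_step
    print(f"step {step}: ball position = {ball_position:.2f}")
```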

If you get good at computational thinking and combine it with coding, it can improve your problem solving skills and will make you more intelligent overall.

Now, let’s move onto:

Job opportunities

Did you know?

A report by Burning Glass states that 20% of jobs paying $15 an hour or more require some kind of coding skill.

Whoa, that’s a lot!


They also state programming jobs are growing 12% faster than the market average.

These are not just programming or tech jobs either.

They say that half of all programming jobs are not in the tech industry.

But instead…

Finance, Manufacturing, Health care, and many more.

As you can see, coding is becoming a very important skill in the job market.

There may come a day where coding will be as important as reading and writing.

I mean what business doesn’t use some form of tech?

Not a SINGLE one.

And if you know how to code, you’ll be able to make more money as well.

You can start your own business by building a program and monetizing it.

You can also create a website and use it to sell products. There are also many different ways to monetize a website.

Now finally, but most importantly:

It can be a lot of fun.

Yes, it can be challenging.

But learning a new skill such as coding will give you a greater sense of accomplishment in your everyday life.


I hope this guide helped you grasp the very essence of what coding and programming are.

It’s really not that confusing when you break it down into smaller parts.

And it’s pretty interesting to see how it all works with technology.


What are your thoughts on coding?

Are you going to start learning it?

Let me know by leaving a quick comment.