I agree with the other comment that this is very open ended. I'm curious to hear more details.
[1] Computer history is a fascinating and complex topic. To be extremely general: first, mathematicians toyed around with niche ways of doing math — George Boole came up with boolean logic, and Ada Lovelace wrote what's often called the first program, for Charles Babbage's (mechanical) Analytical Engine. Later, engineers realized electricity could crunch those numbers far faster. Then Alan Turing came up with a theoretical model, the Turing machine, that describes at a low level what any computer can compute; he also helped crack the Enigma code in WW2 by designing the electromechanical Bombe (the first fully electronic digital computers, like Colossus and ENIAC, came shortly after). From the 60s through the 90s there was a boom of technology: the computer was refined into what we'd recognize today, standards were ironed out, and the internet was born. One big name from this era is Linus Torvalds, creator of the Linux kernel; you can think of Linux as a free competitor to Windows. Then we get into the modern age, where everyone gets online and buys computers, and you hopefully know the major milestones from there.
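If you want a taste of the binary and boolean logic those early mathematicians played with, here's a tiny Python sketch (the `half_adder` function is just an illustration I made up, but it's the real building block of computer arithmetic):

```python
# Binary: every number a computer stores is just base 2.
print(bin(42))            # the integer 42 written in binary
print(int("101010", 2))   # and back again

# Boolean logic (Boole's algebra): operations on true/false values.
a, b = True, False
print(a and b)  # AND: true only if both are true
print(a or b)   # OR: true if either is true
print(not a)    # NOT: flips the value

# Combining the two: a "half adder" adds two bits using only
# boolean operations (XOR for the sum bit, AND for the carry bit).
def half_adder(x, y):
    return (x ^ y, x & y)

print(half_adder(1, 1))  # carry out: 1 + 1 = binary 10
```

Every addition your CPU does bottoms out in circuits wired up exactly like that half adder.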
[2] To understand a computer you must think in levels of abstraction. You can't worry about how efficiently your hardware is running while trying to troubleshoot a YouTube video. I'd say the main layers are hardware, firmware, the operating system, application software, and networking. All of these layers work together to bring this message from me to you. I agree that taking an introductory software development course online would be helpful.
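A toy sketch of that layering in Python (the layer names and the `HDR` framing are invented for illustration — real stacks are far deeper): each function only knows about the layer directly beneath it, which is the whole point of abstraction.

```python
def network_layer(packet: bytes) -> bytes:
    # Pretend transmission: in reality NICs, routers, and cables live here.
    return packet

def os_layer(data: bytes) -> bytes:
    # The OS wraps the data in a (made-up) header before sending,
    # and strips it off whatever comes back.
    framed = b"HDR" + data
    received = network_layer(framed)
    return received.removeprefix(b"HDR")

def application_layer(text: str) -> str:
    # Your app deals only in human-readable text, nothing lower.
    return os_layer(text.encode("utf-8")).decode("utf-8")

print(application_layer("this message, from me to you"))
```

Notice the application never touches headers or bytes-on-the-wire; if the fake network layer were swapped for a real one, the top layer wouldn't change at all.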
[3] On top of learning the basics of coding from W3Schools or GeeksforGeeks (Python is an easy language to start with), I'd recommend lurking on computer/coding-focused subreddits. There are a ton of them, and you'll see the lingo used in context.