Computer Language & Terms
Jan. 26th, 2025 04:23 pm
Hiya, I'm trying to learn more about the language of computers (coding, etc.) as well as the terminology around what makes a computer work. I plan to write a story from the perspective of a computer. This device will serve as a metaphor for being neurodivergent (autism, ADHD, OCD, etc.) and for a state of anarchism.
[1.]
Do any of you know any historical figures, events, or devices that would be relevant to this subject?
[2.]
Any specific words related to computers/coding language that would be a good idea to think of/implement?
[3.]
Know of any websites or people I could get in contact with to help understand and possibly apply accurate knowledge to my story?
(no subject)
Date: 2025-01-27 01:07 am (UTC)
The first computers were basically fed code as a series of zeroes and ones (switch off, switch on), and at the very heart of things, this is still what all computer instructions consist of. Unfortunately, binary is very hard for human brains to deal with on a regular basis, so we've gradually created more and more programming languages abstracted away from it. But at the heart of things, it's all zeroes and ones; computers "see" everything in binary, because that's how they were designed.
The first step up from raw binary was assembly language, where short mnemonics stand in for individual machine instructions; programmers also group binary digits four at a time into hexadecimal (base 16), because it maps onto binary so neatly. We can still see traces of this in the various error codes computers throw at us (anything beginning with an "0x" prefix is a hexadecimal value reported from the most basic layers of our computer hardware). Among the first high-level programming languages were FORTRAN and COBOL, in which English-like words consistently represent whole groups of instructions to the computer - the easiest way of thinking of "real language" programming languages is that they're human-readable names for known combinations of binary decisions for the computer to make.
But right down at the hardware level, it's still all zeroes and ones, switching off and on, taking yes/no choices.
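To make the point above concrete, here's a minimal Python sketch (the number and the word are arbitrary examples, not anything from the thread) showing that one value is the same thing written as binary or as hexadecimal, and that even text is stored as numbers, hence as bits:

```python
# Every value a computer stores is ultimately a pattern of bits (zeroes and ones).
n = 77  # an arbitrary example number

binary = bin(n)  # '0b1001101' -- the raw on/off switch pattern
hexa = hex(n)    # '0x4d' -- the same bits, grouped four at a time into base 16
print(binary, hexa)

# Text is no different: each character is stored as a number, hence as bits.
for ch in "Hi":
    print(ch, format(ord(ch), "08b"))  # 'H' -> 01001000, 'i' -> 01101001
```

The "0x" prefix mentioned above is exactly the marker Python (and C, and most error messages) uses to say "this number is written in hexadecimal."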
(I should also note: while a great number of computer programmers and computer system engineers love to think that because they understand programming computers, they therefore know everything about how the human brain works, they are very much incorrect. It's been said by neurologists and neuropsychologists that if the human brain were simple enough for us to understand, we wouldn't be complex enough to be able to understand it. Human brains are biological systems shaped by evolution, rather than designed systems created by humans, and our brains - even the most neurodivergent of them - allow us a lot more flexibility in decision making than just "yes" or "no" options.)
(no subject)
Date: 2025-01-29 12:34 am (UTC)
Same idea:
* knitting machines, which lift/depress hooks that hold each loop of yarn
* wind-up music boxes, which use tiny steel pins on a round drum
* player pianos, which use paper punch rolls rather than cards tied together
You'll never get a player piano to play it "again, but this time with feeling" -- that's the part only a human can do -- but it will play reliably!
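The devices in that list can be sketched as data: a fixed grid of pin/no-pin decisions that the machine reads in order, reliably and without interpretation. A toy Python model (the notes and pin layout here are invented for illustration):

```python
# A toy model of a music-box drum: each row is one time step,
# each column is one note position; True means "pin present, pluck this note".
drum = [
    [True,  False, False],  # step 1: play note 0
    [False, True,  False],  # step 2: play note 1
    [False, False, True],   # step 3: play note 2
    [True,  False, True],   # step 4: play notes 0 and 2 together
]

notes = ["C", "E", "G"]

# The "machine" just reads the pins in order -- no feeling, but perfectly reliable.
for step in drum:
    played = [notes[i] for i, pin in enumerate(step) if pin]
    print(" ".join(played))
```

Each pin is a single yes/no choice, which is why these machines are such close ancestors of binary computers.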