alive_miracles: (Default)
alive_miracles ([personal profile] alive_miracles) wrote in [community profile] little_details 2025-01-26 04:23 pm

Computer Language & Terms

Hiya, I'm trying to discover more about the language of computers (coding, etc.) as well as the terminology around what makes a computer. I plan to write a story from the perspective of a computer. This device will be a metaphor for being neurodivergent (autism, ADHD, OCD, etc.) and for a state of anarchism.

[1.]

Do any of you know any historical figures, events, or devices that would be relevant to this subject?

[2.]

Any specific words related to computers/coding language that would be a good idea to think of/implement?

[3.]

Know of any websites or people I could get in contact with to help understand and possibly apply accurate knowledge to my story?
rodo: chuck on a roof in winter (Default)

[personal profile] rodo 2025-01-26 10:10 pm (UTC)(link)
I have a bit of trouble figuring out how computers could represent a state of anarchism, tbh. Could you elaborate on that, and on the premise in general? Your question is extremely open-ended.

[1.] You might want to look up von Neumann architecture as a model for basic elements of a computer and how it might conceive of itself.

[2.] Some elements that most modern programming languages share:
* variables (think x = 4)
* functions/methods (reusable blocks of code, to keep it short)
* classes and objects (basically, abstracted models of things or concepts, with attributes and functions)
* if statements (more or less what they sound like: if (condition) do this, else if (condition 2) do that, else do something else)
* several types of loops (for, while, do while, for each)
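To make those elements concrete, here's a minimal Python sketch with one of each; every name in it (double, Robot, describe) is invented purely for illustration:

```python
x = 4  # a variable


def double(n):
    """A function: a reusable block of code."""
    return n * 2


class Robot:
    """A class: an abstracted model of a thing, with attributes and methods."""

    def __init__(self, name):
        self.name = name  # an attribute

    def greet(self):  # a method (a function attached to an object)
        return f"Hello, I am {self.name}"


def describe(n):
    # an if / elif / else statement
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"


# a for-each loop and a while loop
total = 0
for value in [1, 2, 3]:
    total += value  # total ends up as 6

countdown = 3
while countdown > 0:
    countdown -= 1  # counts 3, 2, 1, then stops
```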

[3.] You could try some free courses on sites such as Codecademy or W3Schools to get a feeling for coding. There's also something called design patterns (basically, ways programmers tend to solve common problems), but that's probably way too advanced for what you're looking for.
rodo: chuck on a roof in winter (Default)

[personal profile] rodo 2025-01-27 06:48 pm (UTC)(link)
No problem! By the way, what you mentioned below - anarchism meaning that the computer prioritizes one functionality over another - actually sounds like a bug. You know, some programmer wrote the code for dealing with conflicting needs/alternatives, and either missed that case or implemented it wrong.
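A rough Python sketch of what that "missing case" bug could look like, borrowing the patient-vs-data example from elsewhere in the thread (the function and its priorities are entirely hypothetical):

```python
# Hypothetical sketch: the programmer ranked two priorities but never
# wrote the case where both apply at once, so that conflict silently
# falls through to a default nobody chose deliberately.

def choose_action(patient_needs_help, buyer_wants_data):
    if patient_needs_help and not buyer_wants_data:
        return "help patient"
    elif buyer_wants_data and not patient_needs_help:
        return "sell data"
    # Bug: the conflicting case (both True) was never handled,
    # so the code just does whatever the fallback says.
    return "idle"
```

A story-computer "choosing" one priority over another could simply be living inside that unhandled branch.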

But one thing you need to consider is that you're dealing with a fantasy/sci-fi scenario anyway. What you're describing requires that a computer be able to analyze/intuit the consequences of an action for an individual and actually understand them, and computers as they are today... don't understand anything. LLMs might seem like they do, but they're really more like very sophisticated auto-complete features.
octahedrite: elf girl with a slight smile (Default)

[personal profile] octahedrite 2025-01-26 10:40 pm (UTC)(link)

Uh, good luck trying to represent anarchism through computer architecture. You can't do that without magic.

(Anonymous) 2025-01-26 10:42 pm (UTC)(link)
I agree with the other comment that this is very open ended. I'm curious to hear more details.

[1] Computer history is a fascinating and complex topic. To be extremely general: first, mathematicians toyed with niche ways of doing math (binary numbers, George Boole's boolean logic), and Ada Lovelace wrote what's often called the first program, for Charles Babbage's never-completed Analytical Engine. Later, people realized electricity could be used to crunch big numbers. Alan Turing came up with a theoretical model, the Turing machine, to describe what any computer can do at a low level; he also helped design the electromechanical "Bombe" machines that cracked the Enigma code in WW2 (the first programmable electronic digital computer of that era was Colossus, built by Tommy Flowers' team at Bletchley Park). From the 60s to the 90s there was a boom of technology: the computer was refined into what we'd recognize today, standards were ironed out, and the internet was born. One big name from that age is Linus Torvalds, creator of Linux, which you can think of as a competitor to Windows. Then we get into the modern age where everyone gets online and buys computers, and you hopefully know the major milestones from there.

[2] To understand a computer you must think in levels of abstraction. You can't worry about how efficiently your hardware is running while trying to troubleshoot a YouTube video. I'd say the main layers are hardware, firmware, the operating system, application software, and networking. All of these systems work together to bring this message from me to you. I agree that taking an introductory software development course online would be helpful.

[3] On top of learning the basics of coding from W3Schools or GeeksforGeeks (Python is an easy language to learn), I'd recommend lurking on computer/coding-focused subreddits. There are a ton, and you'll see the lingo in context.

(Anonymous) 2025-01-28 12:10 am (UTC)(link)
I think the easiest way to represent those competing priorities (helping its patient vs selling their data) is to look at most computer sci-fi stories and invert the whole "kill SOME people to protect all people" logic (see 2001: A Space Odyssey, I, Robot, etc.)

(Anonymous) 2025-01-28 12:16 am (UTC)(link)
I Have No Mouth, and I Must Scream is a very dark story that covers a machine intelligence talking about itself and grappling with the arbitrary rules it was given.
allekha: Two people with long hair kissing with a heart in the corner (Default)

[personal profile] allekha 2025-01-26 10:42 pm (UTC)(link)
I'm also not sure how computers could be a metaphor for a state of anarchism. The first thing I think of is that computers are actually unable to act by themselves and literal in the extreme (they do exactly what you tell them to do, not what you think you tell them to do, as anyone who has spent hours fixing bugs can attest to).
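That "exactly what you tell them" literalness can be shown in a few lines of Python; the function here is made up, but the off-by-one mistake in it is a textbook example of the kind of bug that eats those hours:

```python
# The programmer *intends* to sum the numbers 1 through n, but
# Python's range() stops just before its end value, so the computer
# literally sums 1 through n - 1 instead. It did what it was told,
# not what was meant.

def sum_up_to(n):
    total = 0
    for i in range(1, n):  # bug: should be range(1, n + 1)
        total += i
    return total
```

Calling `sum_up_to(5)` gives 10, not the 15 the programmer expected.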

I would recommend that you get started with a basic programming tutorial (Python is a popular and relatively easy language with tons of available resources) and/or an introduction to computer science and architecture concepts. The definition of a Turing machine might also be good groundwork depending on your story - in one of my CS courses, we read Turing's original paper defining the concept, but there are plenty of simpler explanations around.
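For a hands-on feel for the concept, a Turing machine is small enough to sketch in a few lines of Python. This toy machine (the rule table is invented for illustration) just flips every bit on its tape and halts when it runs off the end:

```python
# A minimal Turing machine simulator: a tape, a head position, a
# current state, and a transition table mapping (state, symbol) to
# (symbol to write, head movement, next state).

def run_turing_machine(tape):
    tape = list(tape)
    state = "flip"
    head = 0
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
    }
    while state != "halt":
        symbol = tape[head] if head < len(tape) else "_"
        if (state, symbol) not in rules:
            state = "halt"  # no applicable rule: the machine stops
            continue
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape)
```

Running it on the tape `"0110"` returns `"1001"`; real Turing machines differ only in having bigger rule tables.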
lilacsigil: 12 Apostles rocks, text "Rock On" (12 Apostles)

[personal profile] lilacsigil 2025-01-27 12:38 am (UTC)(link)
Techno-anarchism is a real thing - using technology to empower self-governing communities or networks. This is a very basic and (I think) overly positive view of it, but it's a good start.

https://www.technoanarchism.org/

(Anonymous) 2025-01-27 12:54 am (UTC)(link)
For all those asking what I'd be looking for in reference to anarchism: this is a good piece of information. Stuff like this!! Tyyy!!
megpie71: Denzel looking at Tifa with a sort of "Huh?" expression (Are you going to tell him?)

[personal profile] megpie71 2025-01-27 01:07 am (UTC)(link)
It's worth noting that computer coding has been gradually getting more and more abstracted from the basics of what is actually happening down at the hardware level.

The first computers were basically fed code as a series of zeroes and ones (switch off, switch on) and at the very heart of things, this is what all computer instructions consist of. Unfortunately, this is very hard for human brains to deal with on a regular basis, so we've been gradually creating more and more programming languages which are abstracted from this. But at the heart of things, it's all zeroes and ones; computers "see" everything in binary, because that's how they were designed.

The first step up from raw binary was assembly language, where short mnemonics stand for single machine instructions; machine code is also usually written in hexadecimal (base 16) rather than binary, simply because it's more compact. We can still see hexadecimal in the various error codes that computers throw at us (anything beginning with an "0x" prefix is a hexadecimal value from the lower layers of the system). Among the first high-level programming "languages" were FORTRAN and, a couple of years later, COBOL, where English-language words were used to represent whole groups of instructions to the computer on a consistent basis - the easiest way of thinking of "real language" programming languages is that they're human-understandable complexes of known combinations of binary decisions for the computer to make.

But right down at the hardware level, it's still all zeroes and ones, switching off and on, taking yes/no choices.
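You can peek at those layers directly from Python; this tiny example shows the same number (the code a computer stores for the letter "A") written as decimal, as eight bits, and in the compact hexadecimal notation with the 0x prefix:

```python
# One value, three human-facing notations for the same bits.
letter = "A"
code = ord(letter)                   # decimal: 65
binary = format(code, "08b")         # eight bits: "01000001"
hexadecimal = format(code, "#04x")   # hex with 0x prefix: "0x41"
```

The computer only ever holds the bits; decimal and hex are conveniences for us.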

(I should also note: while a great number of computer programmers and computer system engineers love to think that because they understand programming computers, they therefore know everything about how the human brain works, they are very much incorrect. It's been said by neurologists and neuropsychologists that if the human brain were simple enough for us to understand, we wouldn't be complex enough to be able to understand it. Human brains are biological systems shaped by evolution, rather than designed systems created by humans, and our brains - even the most neurodivergent of them - allow us a lot more flexibility in decision making than just "yes" or "no" options.)

(Anonymous) 2025-01-29 12:34 am (UTC)(link)
When thinking about binary code, I like to recall that the first successful punch cards, and indeed the reason we made punch cards for 20th century computers, were the pattern cards for weaving on Jacquard looms (patented in 1804, building on 1700s experiments). The cards were huge, and all tied together to make a loop, which is how you got a repeating pattern in the weave. This was entirely mechanical, long before electricity was a useful concern -- but when you can reduce your options to a series of yes/no questions (like, which heddles are up/down on any given throw of the shuttle), you can "compute" a pattern in fabric.

Same idea:
* knitting machines, which lift/depress hooks that hold each loop of yarn
* wind-up music boxes, which use tiny steel pins on a round drum
* player pianos, which use paper punch rolls rather than cards tied together

You'll never get a player piano to play it "again, but this time with feeling" -- that's the part only a human can do -- but it will play reliably!
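The loom idea translates almost directly into code; this sketch (the pattern rows are invented) stores a weave as rows of yes/no answers and loops them the way the chained cards did:

```python
# A weaving pattern as rows of bits: 1 = heddle up, 0 = heddle down.
# The card chain is a loop, so row selection wraps around and the
# pattern repeats forever, throw after throw.

PATTERN = [
    [1, 0, 1, 0],
    [0, 1, 0, 1],
]

def heddles_for_throw(throw_number):
    return PATTERN[throw_number % len(PATTERN)]
```

Throw 0 and throw 2 raise the same heddles; like the player piano, it repeats reliably but never adds feeling.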
fred_mouse: line drawing of sheep coloured in queer flag colours with dream bubble reading 'dreamwidth' (Default)

[personal profile] fred_mouse 2025-01-27 01:44 am (UTC)(link)

I would add Grace Hopper to your list of people to investigate. Hidden Figures, the book rather than the movie, would also have useful information. And there is a kids non-fiction book about the first computer mouse, called something obvious like The First Computer Mouse, but that might well be out of print.

Upthread, someone suggested Python as a language to investigate. I would like to put in my 2c for R, as having quite a different approach. There are lots of free options for learning, but https://software-carpentry.org/lessons/ is the one I'm most familiar with. Their materials are available online. An alternative would be to look into HTML, CSS, and then, if you feel fancy, to try JavaScript. Making yourself a webpage is one way to get something that makes you feel like you're getting something out of it, and you can get free hosting from Neocities (and presumably other places, but that's the one I know about).

In terms of people to contact, no. But there are a lot of free courses available online, and they will at least give you the terminology. You can go narrow, such as one of the languages, or wide, and look for computer science. It is possible that in one of the computer science courses there will be a history of computing unit - if you can find one of those you might find a lot of ideas to follow up. But this is such a big question that it is hard to even pick what ideas to suggest.

birdylion: picture of an exploding firework (Default)

[personal profile] birdylion 2025-01-27 06:20 pm (UTC)(link)
If you want to get a feeling for how coding works without having to learn a specific language, and you like a more gamified approach, I recommend the game "Human Resource Machine". It lets you solve logical/logistical puzzles with typical coding commands and might be a good intro into how to think about information for (basic) programming.
melannen: Commander Valentine of Alpha Squad Seven, a red-haired female Nick Fury in space, smoking contemplatively (Default)

[personal profile] melannen 2025-01-30 07:29 pm (UTC)(link)
If you want a deep understanding of how computing might work well with your metaphor, you might need to wait until *after* you've learned to code.

If you just want some cool terminology and history that makes you sound like a '90s hacker, read the Jargon File and follow up on anything that seems interesting: http://www.catb.org/jargon/html/