You must be good at Math
-
Okay, meta question here: What would a 'connection' that you're willing to accept actually look like? Those I've already presented are what I would call pretty explicit connections between the two fields (and fragmenting this into an explanation of how lambda calculus relies on and expands functional mechanics is going to be a loooong diversion). It's starting to feel like you're pretty entrenched in your initial position and are just looking for an internet debate.
I wouldn't say entrenched, because I think this is honestly the first time I've seen the two come up together outside of their shared name. I was surprised, but then again sometimes reality is surprising.
Both have function composition, and expressions that contain free variables in multiple places. At the time, that was just a shorthand for what they were trying to express about slight changes. A bit later, formal analysis was axiomatised, and it's full of infinite things like Cauchy sequences and general topology. In the 20th century, substitution of a composed function into free variables became an object of study in its own right, and was found to produce the full complexity of computation without anything else being added; it's Turing complete.
All the infinite and continuous stuff that makes calculus work, at least as it's considered abstractly, doesn't really translate into a discrete system. You can numerically approximate it, and I guess you could even use a lambda-calculus-like functional language to do that, but I'm not mad it never came up in my math courses, like in your original comment.
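To make the approximation point concrete, here's a rough sketch in Python (my own toy example, not anything from a course): the "derivative" becomes a higher-order function that takes a function and returns another function, which is about as lambda-calculus-flavoured as numerical code gets, and underneath it's all just composition and substitution.

```python
# Rough sketch: a numerical "derivative" as a higher-order function that
# takes a function and returns a new function. The step size h is an
# arbitrary choice, and this only approximates the continuous limit from
# analysis; nothing here is the real thing.

def derivative(f, h=1e-6):
    """Return a function approximating f' via a central difference."""
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

def compose(f, g):
    """Function composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: f(g(x))

square = lambda x: x * x
double = lambda x: 2 * x

d_square = derivative(square)                        # approximately 2x
d_of_composed = derivative(compose(square, double))  # d/dx (2x)^2 = 8x

print(round(d_square(3.0), 3))       # ~6.0
print(round(d_of_composed(3.0), 3))  # ~24.0
```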
If there's nothing more to add to that, I am sorry for wasting your time.
-
If you want to know how philosophy works, do sociology...
It's kind of like a horseshoe with philosophy and math at the ends.
If you want to no longer want to know how anything works, do biochemistry
-
If you want to no longer want to know how anything works, do biochemistry
Too real
-
I mean, I graduated over 20 years ago now, but I had to take a number of EE courses for my CS major. Guess that isn't a thing now, or in a lot of places? Just assumed some level of EE knowledge was required for a CS degree this whole time.
In my uni they kinda just teach Java. There's one mandatory class that's in C and one that's in MIPS assembly, though.
Everyone used AI when I took those classes. By the end of the year they were still having trouble with syntax stuff in the group chat.
-
Maybe for dev knowledge, but computer science? The science of computers?
Is that not the difference between a computer science and a computer engineering degree?
-
Well, computer science is not the science of computers, is it? It's about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things
My BS in CS took its roots down to CMOS composition of logic gates and basic EE, on the hardware side, and down to deriving numbers and arithmetic from Boolean logic / predicate calculus, on the philosophy side. Then tied those up together through the theoretical underpinnings of computation and problem solving, like a trunk, and branched back out into the various mainstream technologies that derived from all that. It obviously all depends on the program at the school of choice, I suppose, and I'm sure it's evolved over the years, but it still seems important to have at least some courses that pull back the wizard's curtain to ensure their students really see how it's all just an increasingly elaborate, high-tech version of conceptually simple (in function) machinery carrying out fundamental building blocks of logic.
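If anyone wants a toy version of the "arithmetic from Boolean logic" rung of that ladder, here's a rough sketch (Python purely for readability, and my own illustration rather than anything from that program): a ripple-carry adder where the only primitives are single-bit AND, OR, and XOR.

```python
# Toy sketch of "arithmetic out of Boolean logic": a ripple-carry adder
# built from nothing but AND, OR, XOR on single bits. Python is just for
# illustration; the point is that the only primitives are logic gates.

def half_adder(a, b):
    """Add two bits: returns (sum_bit, carry_bit)."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add(x_bits, y_bits):
    """Ripple-carry add two equal-length little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out + [carry]

# 6 + 7 = 13, as little-endian bit lists
print(add([0, 1, 1, 0], [1, 1, 1, 0]))  # [1, 0, 1, 1, 0] -> 13
```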
Anyway, I'm going to go sniff my own cinnamon roll scented farts while gazing in the mirror, now.
-
Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude's like "what's a transistor?" ~_~#
I was partnered with that guy for one class in grad school. We were working on a master's degree in software engineering, and the assignment was analysis and changes to an actual code base, and this mofo was asking questions and/or blanking on things like what you mention. I can't remember the specifics but it was some basic building block kind of stuff. Like what's an array, or what's a function, or how do we send another number into this function. I think the neurons storing that info got pruned to save me the frustrating memories.
I just remember my internal emotional reaction. It was sort of "are you fucking kidding me" but not in the sense that somebody blew off the assignment, was rude, or was wrong about some basic fact. I have ADHD and years ago I went through some pretty bad periods with that and overall mental & physical health. I know the panic of being asked to turn in an assignment you never knew existed, or being asked about some project at work and just have no idea whatsoever how to respond.
This was none of those. This was "holy shit, this guy has never done anything, how did he even end up here?"
-
What kind of CS degree did you get where you learned about electrical circuits? The closest to hardware I've gotten is logic circuit diagrams and Verilog.
I learned about transistors in Informatics class in high school. Everything from the bottom up: from the material that makes a transistor possible, to basic logic circuits (SR flip-flops, AND, OR, XOR, addition), to the von Neumann architecture, a basic microprocessor, machine code, and assembly.
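As a toy illustration of that bottom-up chain (my own sketch, not anything from that class), you can treat NAND as the one primitive the transistors hand you, then build the other gates and a simple latch out of nothing else:

```python
# Toy version of the bottom-up chain: pretend NAND is the one primitive the
# transistors give you, then build the other gates and an SR latch from it.
# (My own illustration, not anything from a class.)

def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    return and_(or_(a, b), nand(a, b))

def nor(a, b):
    return not_(or_(a, b))

def sr_latch(s, r, q=0):
    """Cross-coupled NOR latch; iterate a few times until it settles."""
    for _ in range(4):
        q_bar = nor(s, q)
        q = nor(r, q_bar)
    return q

q = sr_latch(s=1, r=0)       # set:   q becomes 1
q = sr_latch(s=0, r=0, q=q)  # hold:  q stays 1
print(q)                     # 1
q = sr_latch(s=0, r=1, q=q)  # reset: q becomes 0
print(q)                     # 0
```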
-
This post did not contain any content.
"Engineer of Information", please
-
Had a graduate Dev who did not have a fucking clue about anything computer related. How tf he passed his degree I have no idea.
Basic programming principles? No clue. Data structures? Nope.
We were once having a discussion about the limitations of transistors and dude's like "what's a transistor?" ~_~#
I've met people like that too.
It's called cheating; lots of people do it.
The most worthless dev I've met was a comp sci graduate who couldn't hold a candle to a guy who did a dev boot camp.
The best dev I've met so far didn't have any credentials whatsoever; the second best did a 2-year associate's.
Third place is a tie between someone with an associate's and someone with a 4-year degree.
-
If you want to know how philosophy works, do sociology...
It's kind of like a horseshoe with philosophy and math at the ends.
A horseshoe capped off by Computer Science
-
My BS in CS took its roots down to CMOS composition of logic gates and basic EE, on the hardware side, and down to deriving numbers and arithmetic from Boolean logic / predicate calculus, on the philosophy side. Then tied those up together through the theoretical underpinnings of computation and problem solving, like a trunk, and branched back out into the various mainstream technologies that derived from all that. It obviously all depends on the program at the school of choice, I suppose, and I'm sure it's evolved over the years, but it still seems important to have at least some courses that pull back the wizard's curtain to ensure their students really see how it's all just an increasingly elaborate, high-tech version of conceptually simple (in function) machinery carrying out fundamental building blocks of logic.
Anyway, I'm going to go sniff my own cinnamon roll scented farts while gazing in the mirror, now.
We did the same thing, going so far as to "build" a simple imaginary CPU. It was interesting but ultimately dead knowledge.
I built an emulator for that CPU, which the university took over and used in the course for a few years. But after that I never did anything with logic gates or anything like that.
I got into DIY electronics later on as a hobby, but even then I never used logic gates and instead just slapped a cheap microcontroller on to handle all my logic needs.
I do use transistors sometimes, e.g. for amplification, but we didn't learn anything about that in university.
In the end it feels like learning how to theoretically mine sand when studying to become an architect. Interesting, but also ultimately pointless.
-
PID control is the classic example, but at a far enough abstraction any looping algorithm can be argued to be an implementation of the concepts underpinning calculus. If you're ever doing any statistical analysis or anything in game design having to do with motion, those are both calculus too. Data science is pure calculus, ground up and injected into your eyeballs, and any string manipulation or regex is going to be built on lambda calculus (though a very correct argument can be made that literally all computer science is built on lambda calculus, so that might be cheating to include it).
Does it apply to interpolation for animation and motion?
-
Does it apply to interpolation for animation and motion?
Motion yes, but I have no idea about the mathematics of animation (sorry)
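For the motion side, the connection is basically just this kind of loop; here's a bare-bones sketch (made-up gains and time step, Python only for illustration) of the discrete calculus hiding in a PID-style update: the I term is a running sum (a discrete integral) and the D term is a finite difference (a discrete derivative).

```python
# Bare-bones sketch of the calculus hiding in a control/motion loop:
# the I term accumulates like a discrete integral and the D term is a
# finite difference, i.e. a discrete derivative. The gains and time step
# are made-up numbers, just to show the shape of the thing.

def pid_step(error, prev_error, integral, dt, kp=1.0, ki=0.1, kd=0.05):
    integral += error * dt                   # accumulate: discrete integral
    derivative = (error - prev_error) / dt   # difference: discrete derivative
    output = kp * error + ki * integral + kd * derivative
    return output, integral

# Drive a 1D position toward a target, using the controller output as a velocity.
position, target, dt = 0.0, 10.0, 0.1
integral, prev_error = 0.0, target - position

for _ in range(500):
    error = target - position
    velocity, integral = pid_step(error, prev_error, integral, dt)
    position += velocity * dt                # integrate velocity into position
    prev_error = error

print(round(position, 2))  # settles close to the target, ~10.0
```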
-
The typical holder of a four-year degree from a decent university, whether it's in "computer science", "datalogy", "data science", or "informatics", learns about 3-5 programming languages at an introductory level and knows about programs, algorithms, data structures, and software engineering. Degrees usually require a bit of discrete maths too: sets, graphs, groups, and basic number theory. They do not necessarily know about computability theory: models & limits of computation; information theory: thresholds, tolerances, entropy, compression, machine learning; foundations for graphics, parsing, cryptography, or other essentials for the modern desktop.
For a taste of the difference, consider English WP's take on computability vs my recent rewrite of the esoteric-languages page, computable. Or compare WP's page on Conway's law to the nLab page which I wrote on Conway's law; it's kind of jaw-dropping that WP has the wrong quote for the law itself and gets the consequences wrong.
I'd honestly be interested in where you're from and how it is in other parts of the world. In my country (or at least at my university), we have to learn most of what you described during our bachelor's. For us there's not much focus on programming languages, though, and more on concepts. If you want to learn programming, you're mostly on your own. The theory we learned is a good base, though.
-
I got my BS in CSci about 15 years ago and it was 100% about programming in java. We didn't learn a fucking thing about hardware and my roommate was an EE major and we had none of the same classes except for calculus.
By the time I graduated java was basically dead. Thanks state college.
Yeah, EE and CS had a lot of crossover where I went, at least in undergrad; grad school saw them diverge a lot more, but they never fully disentangled, and parts of each were important to both. Hell, we had stuff like A+ labs and shit.
-
If you want to know how computers work, do electrical engineering. If you want to know how electricity works, do physics. If you want to know how physics works, do mathematics. If you want to know how mathematics works, too bad; the best you can do is go over to philosophy and think about the fact that it works.
all roads lead to philosophy
-
This post did not contain any content.
You're right, man.
-
I got my BS in CSci about 15 years ago and it was 100% about programming in java. We didn't learn a fucking thing about hardware and my roommate was an EE major and we had none of the same classes except for calculus.
By the time I graduated java was basically dead. Thanks state college.
My CS program had virtually no programming outside a couple of courses where C was used to implement concepts. Had one applications-type course where mostly Java was used.
CS is and should be a specialized math curriculum IMO. Teaching specific programming languages is time that would be better spent teaching theory that can't be taught by dev docs or code bootcamps, as exemplified by your anecdote. Unfortunately nowadays people tend to see degrees as glorified job training programs.
-
Well, computer science is not the science of computers, is it? It's about using computers (in the sense of programming them), not about making computers. Making computers is electrical engineering.
We all know how great we IT people are at naming things
"Computational theory" would be a better name, but that term already refers to a more specific subset of what's normally called CS.