Coding (or programming) is a hot topic of conversation lately. MOOCs are offering programming classes for free [1] while industry titans urge everyone to code [2]. Others, meanwhile, object to the notion that everyone should become a programmer [3]. I think the truth lies somewhere between these two extremes.
I personally think coding (or scripting) is similar to writing. Not all of us are good enough, motivated enough, or would even enjoy writing for a living. But because written and spoken language drives much of the world today, it's important to have a strong grasp of language, to be able to see how it is being used for or against you, and to be able to use it to your advantage.
The ability to read and write code is similar. Familiarity with how code moves machines and software is the foundation for understanding how the most powerful man-made forces in the world today are controlled and built. Being able to write even simple scripts or simple data filters can multiply your productivity or give you a new perspective on how to frame problems and approach solutions. Without a basic understanding of code, one will be left behind in today's world.
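To make "simple data filter" concrete, here is a minimal Python sketch; the file name sales.csv and its columns are invented for illustration, not a real dataset:

    # Keep only the rows worth looking at -- a one-off filter that would
    # take hours by hand in a large file.
    import csv

    with open("sales.csv", newline="") as f:
        big_sales = [row for row in csv.DictReader(f)
                     if float(row["amount"]) > 1000]

    for row in big_sales:
        print(row["name"], row["region"], row["amount"])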
Without a strong grip on the written and spoken language, we are at the mercy of others who wield it better than us. Code is in many ways similar. I may never be a master writer or a master programmer (or even a 'good' one at either), but knowing the basics is invaluable.
Of course, we should note that writing and coding will have "diminishing returns" for almost everyone. We should exercise discretion in how far we take the pursuit and have good judgment about what we expect to get from either discipline.
I personally feel that most people probably won't benefit from learning how to code. Coding is fun and rewarding, but there are few practical uses for it outside of certain professions. I mean, anyone can learn how to do simple I/O and calculations with Python, but what are they going to use it for? Fact is, most programs you might want to make are already available for free online, made by people who have coded for a long time.
It's like with most forms of manufacturing. You could go out into the woods, carve some wood, and make your own bow... or you could just go buy a cheap bow that will probably outperform your homemade one easily. And when it comes to coding, you don't even buy anything; everything is supplied for free.
I feel like we should focus on making sure that everyone is very comfortable with computers. I've seen many a horror story from tech support; it seems like that would do a lot of good.
On March 05 2013 02:47 Tobberoth wrote: I personally feel that most people probably won't benefit from learning how to code. Coding is fun and rewarding, but there are few practical uses for it outside of certain professions. I mean, anyone can learn how to do simple I/O and calculations with Python, but what are they going to use it for? Fact is, most programs you might want to make are already available for free online, made by people who have coded for a long time.
It's like with most forms of manufacturing. You could go out into the woods, carve some wood, and make your own bow... or you could just go buy a cheap bow that will probably outperform your homemade one easily. And when it comes to coding, you don't even buy anything; everything is supplied for free.
I disagree. If you use a computer at all (which I think nearly everyone does), learning to code even at a basic level will improve your ability to function in the computer realm. It will allow you to understand more about how the computer works and become more technologically literate. Everyone shouldn't be out there building huge programs and rewriting websites, because there are others out there doing it better, as you point out. But basic knowledge of how computers work and how programs run allows someone to work better. Surely you aren't arguing that no one needs to learn math because a calculator can do it for them?
The bigger point is that (at least in the US; I'm not sure about the EU or the rest of the world) there is currently a huge deficit of people with programming knowledge in the private sector. Companies are increasingly moving online, and they need -competent- people with these skills. Key word being competent. These skills are not being taught or emphasized in US schools, and thus the current situation arises. If more people were exposed to coding, perhaps some would be interested and enter a career in software. Planting the seed is as important as, if not more important than, providing immediately useful real-life skills.
Hm... first of all, the problem we have right now is not that we have too many people venturing into coding and getting frustrated. There is this notion that coding is something elitist, that you have to be some kind of guru to do it. Suggesting coding is not for everyone is like saying that reading and writing are not for everyone, an attitude used in medieval times to discourage people from learning to read and write.
The other misconception is that coding is really only necessary in certain professions, i.e. for "Coders". The reality is: coding helps you in a lot of situations. There is an entire industry of jobs where coding would be beneficial, but it is not taught. The best examples are business analysts, who sift through tons of data every day as their day job, and the many secretaries who use software such as Excel daily. These tools can be scripted, and should be.
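As a hedged sketch of what that scripting could look like, assuming a hypothetical report.csv export with department and hours columns (Python here, though Excel itself would use VBA):

    # Total a column grouped by category -- the sort of thing an analyst
    # might otherwise redo by hand every week.
    import csv
    from collections import defaultdict

    totals = defaultdict(float)
    with open("report.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["department"]] += float(row["hours"])

    for dept, hours in sorted(totals.items()):
        print(dept, round(hours, 2))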
Tobberoth, your example unfortunately doesn't hold up. A lot of manufactured things never require tweaking. Tell me, how many of you have ever tweaked your computer? Perhaps you only downloaded a tool to do it for you, and that's fine. But computers are incredibly flexible tools; tweaking them is more or less necessary to use them at 100% efficiency. And there's a much better reason your 'bow' example does not work: for a long time, manufacturing was hard and time-consuming. Now that that's changing (think 3D printers, etc.), people do manufacture a lot themselves, and for precisely that reason they don't like the manufactured version.
I sometimes cringe when I hear gamers talk about their favourite game software (I'll assume that's SC, what else?) and demonstrate an incredible lack of knowledge. That's another reason I support this claim: if more people could code, more people would have an understanding and appreciation of how software works internally.
Of course not everyone will end up actually working as a programmer. But that's not what this campaign wants to achieve; it wants awareness.
'Cuz they are going to spend 80 hours a week working there.
edit: I was a computer science major, so this doesn't fall on deaf ears. I know how to code. But knowing how to code doesn't mean you can look at someone else's code and know what the heck it does. It can take a long time to understand.
I think that, in addition to the obvious uses for coding, it is an important skill for any scholar or academic. Not just math professors; even experts in non-technical areas need to use computers to automate processes. You can often get by with what already exists, but that's not always enough anymore.
I am also a programmer. I graduated from one of the top universities in my country for computer science. In my personal experience, coding is easier than debugging. I have seen people just choke when their code wouldn't work and the error wasn't obvious.
Programming is more about knowing how to handle the situation when the code doesn't work the way you want (in 99% of cases you will have to debug your code) than about the actual ability to write code.
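A toy Python illustration of that point (the function and data are invented): the skill being described is localizing which input breaks the code, not typing the code in.

    def average(xs):
        return sum(xs) / len(xs)  # hides a ZeroDivisionError for empty input

    batches = [[1, 2, 3], [], [4, 5]]
    for i, xs in enumerate(batches):
        try:
            print(i, average(xs))
        except ZeroDivisionError:
            # The debugging habit: pin down the offending input first,
            # then decide how the code should handle it.
            print("batch", i, "is empty -- the case the code never handled")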
I recently posted on my fb an offer to help anyone interested in getting started with coding. Only 2 people have expressed interest, and of them, neither has committed. Maybe the next generation will have the time?
People should at least hear about coding, but I think that knowing how a computer functions is actually more important. Everybody should have at least one class in school covering how a computer works and what makes it tick.
Programming, in my experience, is about knowing what's available to you (language-specific) and knowing how to use it well. Also, polymorphism! The first class you design for a professional project should NEVER be the final implementation of what you want, unless you understand that writing the implementation class leads to understanding the requirements better and being able to abstract accordingly!
On March 05 2013 03:53 RoyGBiv_13 wrote: I recently posted on my fb an offer to help anyone interested in getting started with coding. Only 2 people have expressed interest, and of them, neither has committed. Maybe the next generation will have the time?
People are lazy and programming is "hard". The biggest problem, though, is that people have no connection to programming; they just don't see the point. If someone has no interest in it, it'll be a waste of time trying to teach them. Would you be interested in learning how to ride horses, or about interior design, if someone posted that on Facebook?
I'm a research assistant at my university in the psych department, and my boss had me do this very mundane but annoying task: looking at two lists of data and seeing how much time the two lists had in common for a certain variable. For example:
"a" 05 05
"a" 03 03
"b" 05 10
"b" 02 05
and so on. So she expected me to calculate it by hand, over and over again. And there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that, and I could simply input the filename and I'd have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
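For the curious, one plausible reconstruction of that program in Python. This is a sketch under assumptions: that each line holds a quoted label plus a start and end time, and that "time in common" means interval overlap; the original format isn't fully spelled out.

    def load(path):
        # Parse lines like: "a" 05 10  ->  {"a": [(5, 10)]}
        spans = {}
        with open(path) as f:
            for line in f:
                label, start, end = line.split()
                spans.setdefault(label.strip('"'), []).append((int(start), int(end)))
        return spans

    def overlap(a, b):
        # Overlap of two (start, end) intervals; zero if they don't intersect.
        return max(0, min(a[1], b[1]) - max(a[0], b[0]))

    def common_time(path1, path2):
        s1, s2 = load(path1), load(path2)
        return {label: sum(overlap(x, y) for x in s1[label] for y in s2.get(label, []))
                for label in s1}

    # Run once per file pair instead of calculating 300 by hand:
    # print(common_time("list1.txt", "list2.txt"))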
On March 05 2013 03:53 RoyGBiv_13 wrote: I recently posted on my fb an offer to help anyone interested in getting started with coding. Only 2 people have expressed interest, and of them, neither has committed. Maybe the next generation will have the time?
People are lazy and programming is "hard". The biggest problem, though, is that people have no connection to programming; they just don't see the point. If someone has no interest in it, it'll be a waste of time trying to teach them. Would you be interested in learning how to ride horses, or about interior design, if someone posted that on Facebook?
You're looking at it from an employment perspective, even though millions of people use computers and aren't employed in the IT field. I think he would liken it to being interested in reading, or knowing how a car works, i.e., more general things that can be applied outside of a livelihood context.
On March 05 2013 03:45 CreationSoul wrote: I am also a programmer. I graduated from one of the top universities in my country for computer science. In my personal experience, coding is easier than debugging. I have seen people just choke when their code wouldn't work and the error wasn't obvious.
Programming is more about knowing how to handle the situation when the code doesn't work the way you want (in 99% of cases you will have to debug your code) than about the actual ability to write code.
Completely agree. I graduated from the most difficult theoretical computer science program in France. While all of us thus know our way around the most complex algorithms, few can actually implement them efficiently, and more importantly, debug and troubleshoot code. Debugging most of the time requires a deep understanding of system architecture and software engineering practice, not just algorithms, language syntax, and libraries, which is all that schools teach.
On March 05 2013 04:24 Roe wrote: and so on. So she expected me to calculate it by hand, over and over again. And there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that, and I could simply input the filename and I'd have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
This is a great example of what I was talking about in my prior post. Coding is invaluable in all areas of academia.
I think everyone should learn some amount of coding. Even if all you are going to use is Excel, knowing a few basics of scripting can go a long way. On the other hand, not everyone is cut out to be a programmer, just like not everyone is cut out to be a mathematician. However, everyone learns how to do arithmetic in school.
That video just sounds like a whole bunch of tech guys saying: "please, we need a greater supply of people who code out of love so we can work them harder and pay them less"
On March 05 2013 04:24 Roe wrote: and so on. So she expected me to calculate it by hand, over and over again. And there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that, and I could simply input the filename and I'd have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
This is a great example of what I was talking about in my prior post. Coding is invaluable in all areas of academia.
I saw that too... my wife is a teacher, and so much of the redundancy involved in her work just breaks my heart... being able to code even some of the simpler routines they do would save so much time.
On March 05 2013 03:53 RoyGBiv_13 wrote: I recently posted on my fb an offer to help anyone interested in getting started with coding. Only 2 people have expressed interest, and of them, neither has committed. Maybe the next generation will have the time?
People are lazy and programming is "hard". The biggest problem, though, is that people have no connection to programming; they just don't see the point. If someone has no interest in it, it'll be a waste of time trying to teach them. Would you be interested in learning how to ride horses, or about interior design, if someone posted that on Facebook?
You're looking at it from an employment perspective, even though millions of people use computers and aren't employed in the IT field. I think he would liken it to being interested in reading, or knowing how a car works, i.e., more general things that can be applied outside of a livelihood context.
I have no idea what you are trying to say. I meant teaching them at a basic level, which most people simply don't have an interest in. I've tried. It's the same thing with cars, just that more people are interested in cars and therefore know more about them than they do about computers.
My sister is going through her PhD program in neurobiology (or something along those lines), and one of the requirements is for her to take some programming classes. She has taken some Java so far, I believe; not sure if she's taken more.
I am learning to code a little right now while taking all of my basics in college. I will continue our family's line of programmers.
My grandpa coded (with punch cards, no less), my mom codes, and I will code.
A WHOLE FAMILY OF PROGRAMMERS
EDIT: BTW, my dad tried programming in college with my mom; it wasn't for him. He had a hard time grasping some of the logic involved, and it just didn't work with the way that he thinks.
Programming definitely teaches you a lot of invaluable skills. Logic and problem-solving, to name a couple, and it definitely enhances creativity, though that last one won't be so obvious to people who don't program. ("Durrr...mindless key punching in front of a computer, why not be a musician or dancer like me?")
Personally, I love programming as a hobby because it's another medium to create in. Just like you can make a painting that you really love and would like to hang on your wall, you can create a game or a video downloader or something like that which you can take pride in. You can make it in any unique way you want. Also, call me insane, but I actually find the coding process itself quite enjoyable.
I took one programming class as an undergrad, in Scheme of all languages. I can't stress enough how useful this has been throughout my career, not because I have ever used Scheme again, but because knowing how to learn a coding language has meant that when confronted with interesting Excel issues, it is trivially easy for me to devise the right expression or formula to solve them. Without the CS class I probably would have looked at all the formulas and been like, "this looks hard, let me ask someone else to do this for me."
Anyway, if comfort with Excel and regular expressions is the only thing you get out of CS, it is totally worth it. Not to mention being able to learn basic HTML easily (back when websites were actually hand-coded) and lots of other things.
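A small example of the kind of transferable trick meant here; the pattern is deliberately simplified, and real email validation is far messier:

    # Pull every email-like string out of a messy text export.
    import re

    text = "contact: alice@example.com, bob@example.org; see attached"
    print(re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text))
    # -> ['alice@example.com', 'bob@example.org']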
Learning to program is a great way of bolstering your critical thinking skills.
Programming something useful beyond personal growth requires years of study and commitment. Most people can't even commit to a diet or their New Year's resolutions. Hardly anyone can be a professional programmer, due to a lack of resolve.
When people say everyone should learn to program, I really see a few skills that we want to teach, and they are skills that I think programming CAN teach, which is also important.
1. Boolean logic, operators, and their application. Being able to really understand if, and, or, if and only if, xor, and not in a formal setting is a valuable skill, applicable to a wide array of problems. Likewise with the core control constructs like for and while: it's a way of thinking about the steps of a solution that's pretty handy.
2. Breaking down systems. A big part of programming, after all, is how to structure and build complex systems, or solve complex problems one piece at a time. Knowing how to do this is going to apply to a lot of things in life.
3. Understanding at some level how computers work. Knowing the basic structure of how a computer works is going to be helpful in today's world. It's pretty worthwhile to, say, know the difference between a browser, the World Wide Web, and the Internet, and knowing something about comp sci/programming is going to help with that.
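A minimal Python illustration of point 1; the umbrella scenario is invented, but the formal connectives map straight onto code:

    raining = True
    forecast_rain = False
    have_hood = True

    # "or", "and", "not" read almost like the informal versions:
    take_umbrella = (raining or forecast_rain) and not have_hood

    # Python has no xor/iff keywords, but != and == on booleans do the job:
    exactly_one_says_rain = raining != forecast_rain  # xor
    sources_agree = raining == forecast_rain          # if and only if

    print(take_umbrella, exactly_one_says_rain, sources_agree)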
Anyway, recently I've been reading a book on drawing where the author makes a pretty compelling point about learning to draw. Her argument is that drawing is a function of the 'right half' of your brain (not necessarily the actual right hemisphere, just the type of actions we associate with the right half of the brain). By learning to draw, which is obviously a skill that's not widely applicable to many people, we learn to perceive better and to problem-solve using the right half of our brain, a skill that is useful. I think you could make a similar argument about programming: it's NOT useful to most people to be able to program, but the skills you learn by learning to program are widely applicable.
There's also definitely a difference between being a programmer and being a 'software engineer'. Just like all of us can write without being authors or journalists, anyone can program, but not everyone is going to become a software engineer.
If someone comes up with another way to teach the skills programming teaches without having to teach programming, then that would be just as good. In either case, I think the first two skills I mentioned are not being taught well in schools, so something like programming should be used to teach them.
On March 05 2013 02:41 thedeadhaji wrote: The ability to read and write code is similar. Familiarity with how code moves machines and software is the foundation for understanding how the most powerful man-made forces in the world today are controlled and built. Being able to write even simple scripts or simple data filters can multiply your productivity or give you a new perspective on how to frame problems and approach solutions. Without a basic understanding of code, one will be left behind in today's world.
He won't. Not if design matters, and it does. More than anything else.
See, the coders on my team at work always jokingly lament that people only ever see and praise (or not) what I do as a designer, and that the only time they hear from anybody is when things are not working / bugging / crashing. And I always, equally jokingly, reply that someday they will be replaced by machines anyway.
I respect the art of coding, and coming from a Java background before I switched to design, I know my fair share about it. I personally resent coders who do design, as I once did. Because I know that while coding helps me in fields that require logic and abstract thinking, it hurts my creativity. And before you gasp and shout that coding requires a great deal of creativity, let me assure you that I know it does. But only in the realm of coding. Still a bold statement, indeed.
Over the years I've met and worked with quite a few coders who have great skills and very sharp minds, but almost all of them lack the ability to think outside the box when the box is not meant for them. They maneuver with great flexibility and a wealth of ideas, but are almost always terribly restricted when it comes to seeing beyond the ends of their noses. And it's not their fault; it's simply their way of approaching a problem. They can perfectly explain a program, with all its details and obstacles, because they solved it; most of the time they are the only ones who understand all of it, and they are then baffled, almost ignorant of why another mind completely unlike theirs wouldn't want it.
Well, obviously because that other mind doesn't understand what it all does! But that's precisely the problem with most coders. Because it's not what it does, it's what it does for them. And by that measure, the inner workings are irrelevant. A great many of you might say users need to understand what it actually does. I say we need to make it so that it doesn't matter anymore.
The OP seemed to be saying that knowing how to code even just a little will help people who do repetitive tasks increase their output, along with improving the way they think critically about things. This conclusion follows from the video he posted of "industry titans" who know that even the simplest program can increase a worker's output dramatically.
I thought your argument about having the ultimate "facade" that does everything the user wants, without requiring the user to know anything about the program, was inapplicable because it didn't address the OP's point. It's not realistically feasible either; you cannot predict who your end users will be or what needs they will have. If you had a program the user could interact with minimally and still get their expected output, you would have basically created artificial intelligence.
thedeadhaji is right: "Without a basic understanding of code, one will be left behind in today's world." Take Roe's post, for example. How many times had his boss, or the people who had worked for her, done that same kind of task by hand? How smart and badass did he seem when he got the job done in record time? How likely is this performance to weigh in on what jobs he gets in the future? I think it's quite likely that his boss will prefer her workers to have a cursory amount of programming knowledge going forward, thus leaving others behind.
EDIT: Having a work structure that separates "designers" and "coders" is also pretty archaic; the best coders are the ones who think about the design of their applications while doing the coding. If you separate the two, each side ends up with a knowledge deficit in its own role, because the two disciplines inform each other.
I agree. I started trying to learn to code seriously for 2-3 weeks or so, and I just got bored. I was doing Project Euler, but it took so damn long to solve a problem that I kind of gave up. Is it normal that it takes like 6 hours to solve one of the first Project Euler problems =.=? I hope I'm not bad at logic; I find math very interesting. On that note, does someone have a good first math book to work through over my vacation? I stumbled upon this one: http://www.trillia.com/zakon1.html. But I don't have any answers for the problems, sadly.
Man I wish I got programming, or math in general. Even basic arithmetic.
I originally left high school here in Ireland and headed to SFU in Vancouver to do Interactive Arts and Design, a mix of programming and design that was tailored to getting into the gaming industry (but not limited to that). Sadly, as soon as it came to my programming class, I realized that I just couldn't handle it. After some very basic hello-world type stuff in Python I got lost. I mean, programming and logic aside, I'm a guy who couldn't remember the door code to the apartment block I lived in for a year (just the shape my hand was meant to move in; I always got mixed up between 7402, 7042, 7204, or even other combinations), who constantly misremembers phone numbers, and who really struggles with counting out change quickly.
The entire framework of thinking when it came to coding just completely left me at a loss, causing much keyboard banging, frustration, and wasted hours figuring out how to get my coded mouse to find the cheese in my coded maze. Then came the midterm, where I walked out 20 minutes in and dropped out of the class, essentially ending my dream of making da vidya games. When I think of the careers and opportunities that I could take advantage of with a CS degree, I cry inside. Much respect to the code monkeys who actually make things work around these here internet parts.
On March 05 2013 03:53 RoyGBiv_13 wrote: I recently posted on my fb an offer to help anyone interested in getting started with coding. Only 2 people have expressed interest, and of them, neither has committed. Maybe the next generation will have the time?
People are lazy and programming is "hard". The biggest problem, though, is that people have no connection to programming; they just don't see the point. If someone has no interest in it, it'll be a waste of time trying to teach them. Would you be interested in learning how to ride horses, or about interior design, if someone posted that on Facebook?
You're looking at it from an employment perspective, even though millions of people use computers and aren't employed in the IT field. I think he would liken it to being interested in reading, or knowing how a car works, i.e., more general things that can be applied outside of a livelihood context.
I have no idea what you are trying to say. I meant teaching them at a basic level, which most people simply don't have an interest in. I've tried. It's the same thing with cars, just that more people are interested in cars and therefore know more about them than they do about computers.
The biggest response I get is that people are afraid that programming is "one of those skills where the more you learn, the more you realize how much more you have to know." I cannot dispute that, but I did point out that programming also has immediate impact, so you don't have to learn it all before you begin using it. In fact, your second or third program is usually the first time you "scratch your own itch" and solve a problem you actually had.
I'll agree that people just aren't interested in learning it as a "basic" skill. Maybe we should change that? Anyone else want to make their own offer to their social graph?
As someone who has no background in coding at all (i.e., thinks Ruby and Python are things you find in a store and a zoo), but nevertheless is fascinated by the whole tech startup culture, I feel like an outsider with his nose pressed against the window looking in. There's a guy at the door smiling and telling me to come in, that it's not scary at all and it'll be great fun!
I really like the OP's analogy comparing coding to writing. But we all know how high school English is like a gas chamber where the desire to write goes to die. As someone who is interested but hesitant to learn to program, I've taken a look at most of the free "learn to code" options out there (i.e., Codecademy, Udacity, Khan Academy, etc.). What I want to know from actual programmers is: if you were to learn from scratch, or had to teach a relative from scratch, would you use those options? Are there better ones? Would you teach them yourself instead, so you could accelerate their learning and keep them interested with better projects than "how to make Pong"?
On March 05 2013 02:41 thedeadhaji wrote: I personally think coding (or scripting) is similar to writing. Not all of us are good enough, motivated enough, or would even enjoy writing for a living. But because written and spoken language drives much of the world today, it's important to have a strong grasp of language, to be able to see how it is being used for or against you, and to be able to use it to your advantage.
The ability to read and write code is similar. Familiarity with how code moves machines and software is the foundation for understanding how the most powerful man-made forces in the world today are controlled and built. Being able to write even simple scripts or simple data filters can multiply your productivity or give you a new perspective on how to frame problems and approach solutions. Without a basic understanding of code, one will be left behind in today's world.
I really like how you wrote that. It's almost like visiting another country and knowing nothing of the native language, not even such things as "where is the bathroom" or "may I buy some food".
Nice crosspost from HN, but I really would have liked it if you had included a quote/blurb from the "Programming is not for Everybody" post... YouTube embeds have a way of suppressing everything else in a post (people won't look at anything else), for better or worse.
I feel that a lot of comments, and the video itself, rely on too many generalizations. What is "coding"? A two-line bash script that saves you 10 seconds of typing out some commonly used commands? A 1000-line, two-week project in C for your operating systems class? A web page or document written in [insert markup language here]? The next great video game that takes years of development by a team of thousands? Sure, getting people to "code" is great. But what is "coding"? Does everyone need to learn every kind of "coding" there is? I'm sure Chris Bosh had a lot of fun "coding in college."
Getting people to "code" is an empty goal. Teaching people to be curious and work on projects that require an understanding of programming languages is much more meaningful.
I'm deeply troubled by this idea that coding is suddenly the most important thing that anyone could learn.
I suppose this is not really different from my general beef that our cultural attitude towards education has tunnel vision for STEM.
On March 05 2013 13:21 Loser777 wrote: Getting people to "code" is an empty goal. Teaching people to be curious and work on projects that require an understanding of programming languages is much more meaningful.
I don't think everyone can learn to code, or that everyone even wants to. It goes a bit further than just knowing addition and your multiplication tables if you want to compete with the kids who started early. While I'm sure a lot of programmers go on to management positions eventually and never code again, they would have made it to management regardless.
It is true that all of those things exist, but they're definitely the exception, not the norm, for a programmer. Most offices come equipped with a coffee machine and a fridge to put your own food in, often crowded with other people's food that has been sitting there all week. They are in office buildings with gray cubicles, and they are fueled by stringent bureaucracy.
On March 05 2013 13:21 Loser777 wrote: Getting people to "code" is an empty goal. Teaching people to be curious and work on projects that require an understanding of programming languages is much more meaningful.
well said
I've never heard anyone say coding is the most important thing anyone could learn. Even within the comp sci department they say to keep your view on the whole picture of what you're designing and think critically about what you're implementing from an algorithmic approach, rather than focusing on learning one specific language. In fact, all the teaching at my school is done language-independently, so that you grasp the concepts and can then apply them to what you like. It's not philosophy, but it's a useful and generalizable thing to learn and add to your skill set. But then again, I go to a small liberal arts school.
Besides the fact that it lets you practice your problem-solving skills...
For the most part, programming in general is useless for most people, especially those outside the fields of mathematics, engineering, computing, and science. However, it's one of the friendlier ways to turn an idea or concept into a consistent and logical message, compared to, say, learning proofs in mathematics or working through a myriad of algebraic equations in physics.
It's easy to think of something that seems logical to yourself. It's harder to convey that it's logical to others, inanimate or not. It's hardest to convince yourself that sometimes that seemingly logical idea isn't logical after all.
Programming is one of the ways to practice such manifestation of logical and consistent ideas.
On March 05 2013 13:22 sam!zdat wrote: I'm deeply troubled by this idea that coding is suddenly the most important thing that anyone could learn.
I suppose this is not really different from my general beef that our cultural attitude towards education has tunnel vision for STEM.
On March 05 2013 13:21 Loser777 wrote: Getting people to "code" is an empty goal. Teaching people to be curious and work on projects that require an understanding of programming languages is much more meaningful.
well said
I've never heard anyone say coding is the most important thing anyone could learn. Even within the comp sci department they say to keep your view on the whole picture of what you're designing and think critically about what you're implementing from an algorithmic approach, rather than focusing on learning one specific language. In fact, all the teaching at my school is done language-independently, so that you grasp the concepts and can then apply them to what you like. It's not philosophy, but it's a useful and generalizable thing to learn and add to your skill set. But then again, I go to a small liberal arts school.
But all of that is coding though. Coding isn't just about learning a language.
On March 05 2013 13:21 Loser777 wrote: I feel that a lot of comments, and the video itself, rely on too many generalizations. What is "coding"? A two-line bash script that saves you 10 seconds of typing out some commonly used commands? A 1000-line, two-week project in C for your operating systems class? A web page or document written in [insert markup language here]? The next great video game that takes years of development by a team of thousands? Sure, getting people to "code" is great. But what is "coding"? Does everyone need to learn every kind of "coding" there is? I'm sure Chris Bosh had a lot of fun "coding in college."
All of it. Just like how arithmetic is math, high school algebra is math, calculus is math, and partial differential equations are still math. Not everyone knows all of these, but everyone is expected to learn up to at least high school algebra. (Personally I want schools to stress logic/probability/statistics more, as they are more useful, but that's another story altogether.)
I would like the CS classes in schools to provide more understanding of how computer systems work. This would save a lot of nerves for the people who wreak havoc when they can't use some device properly (blaming it all on the device, of course).
Teaching coding might be a good start.
On the other hand, I can't see how programming skills would help, e.g., a cashier in a supermarket (assuming she started working as a cashier and then signed up for the classes). She probably won't see any connection between the code and the thing she does every day (because there is a hell of a long road between simple code and a working cash register system).
The idea looks fine, but the problem is in the complexity of the whole computer world.
Coding is awful and boring if all you do is file I/O or math.
However, once I learned computer graphics, suddenly I could create anything I wanted. Starcraft? League of Legends? Halo? Computer graphics is the empowering factor.
When I was a beginner, I was learning C++ and I wanted to put a pixel on the screen, not in the stupid console window. A year later, a couple hundred lines of DirectX later (no beginner is going to know how to link to a library without help, let alone understand what the fuck it's doing), and I drew a model from Assassin's Creed. Talk about a barrier to entry.
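For contrast, a sketch of how low the barrier can be with higher-level tools: putting a single "pixel" on screen with Python's built-in tkinter. (Canvas has no true pixel primitive, so a 1x1 rectangle is the usual trick.)

    import tkinter as tk

    root = tk.Tk()
    canvas = tk.Canvas(root, width=200, height=200, bg="white")
    canvas.pack()
    # Draw one black "pixel" at (100, 100).
    canvas.create_rectangle(100, 100, 101, 101, fill="black", outline="black")
    root.mainloop()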
Maybe it is me, but I see 3 pages of comments that can be summed up as "Coding is useful", and I don't really count that as a genuine discussion of the subject.
Speaking another language is useful, and so is knowing how to build a fire if you are lost in the woods overnight. Does that mean that everyone should be introduced to these skills?
In this regard, yes, coding is useful, but should everyone be introduced to coding? Of course this discussion has been one-sided, as it is being held among computer-literate folks (this being TeamLiquid), but I'm surprised that I'm not seeing the other side of the argument yet.
Anyhow, being older than the crowd here, I will pull the "time is finite" card, and simply say that not all skills are worth the time being invested in them. Sure there have been examples of using coding to decrease the time to complete a project 10-fold (I have had my share at work too), but ultimately if the company has a group of programmers to tackle these issues, the part of the project which needs that work will be thrown to them.
The flip side is that if you are working at a start-up with 10 people, then yeah, it would be nice to have more self-sufficient workers. But then again, if you have 2 programmers in the group, why would the other 8 need the skill? Likewise, if 2 people knew how to speak 3-4 languages, why would the others need those skills?
My point is that it is not feasible to be specialized in a skill set (which is a requirement in society today) and also have working knowledge of all other skills. Time is finite, and it just isn't worth it.
(looking forward to seeing a more in depth discussion of this)
-EDIT- For background, I'm an engineer and I code often.
Being able to code allows you to see solutions to problems other people blissfully ignore or futilely endure. If more people could code, more problems might get solved and the world might be a better place.
Imagine a world where everyone could fix bugs rather than just filing them.
On March 05 2013 04:32 Shady Sands wrote: That video just sounds like a whole bunch of tech guys saying: "please, we need a greater supply of people who code out of love so we can work them harder and pay them less"
Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. Following that logic, only people who write machine code are real programmers... because all other languages are just simplified versions of it. Or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... Ah well, I'm not a programmer, so I might be wrong.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
JavaScript and PHP are programming languages. HTML and CSS are not.
When you use a programming language, you are a programmer.
These are facts.
In my experience, "web designer" can span anything from HTML, CSS, and JavaScript (plus visual design, creation of assets, etc.), while "web developer" often means you can handle a wider set of responsibilities, including any server-side development. However, as with all job titles, there is no "real" boundary.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. Following that logic, only people who write machine code are real programmers... because all other languages are just simplified versions of it. Or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... Ah well, I'm not a programmer, so I might be wrong.
He's right when he says that as a web developer he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages have control over.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platforms.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, etc., since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. Following that logic, only people who write machine code are real programmers... because all other languages are just simplified versions of it. Or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... Ah well, I'm not a programmer, so I might be wrong.
He's right when he says that as a web developer he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages have control over.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platforms.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, etc., since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
I agree with #2-3. #1 I couldn't really comment on, because (a) I'm a recent graduate and (b) I'm actually a librarian with web development experience, so I handle the website for the library. In the hierarchy of the university I'm a librarian (professional), not a web developer; I just happen to be both in reality.
#4 I have to disagree with, but that might be more of my personal experience or having to work within libraries in general. I need to understand database structure as well as language (I do primarily SQL databases) in order to work in the environment I work in. But again, that might be more unique to libraries. You could probably get away with web development in general without that knowledge/experience.
As I see it, coding teaches two things, and does so very well. If you can find something else that gets across the same points as effectively, then by all means push for that. The first is linear problem thinking: given a set of steps, can you follow the logic and work out what the result is? In the other direction, given a goal, can you create sub-goals and a set of steps to reach each one, and eventually accomplish the larger goal?
The second big idea, which one post called "polymorphism", I prefer to call abstraction: given an individual problem, can you find a more general class of problems it belongs to, and instead of solving each one individually, solve them as a group, or at least come up with a general recipe for the particular solutions?
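A tiny Python sketch of that abstraction step (the names are invented): instead of writing sum_of_squares and sum_of_cubes separately, solve the whole class of problems once.

    def sum_of(f, xs):
        # Apply f to each element and total the results.
        return sum(f(x) for x in xs)

    nums = [1, 2, 3, 4]
    print(sum_of(lambda x: x * x, nums))   # 30, the "squares" instance
    print(sum_of(lambda x: x ** 3, nums))  # 100, the "cubes" instance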
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thinking is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others have a more mixed approach. In the broadest sense, it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do any heavy lifting like "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
I view even something like writing Excel scripts as coding, so web development definitely falls under coding as well. On the other hand, I don't think of web designers as programmers, as they require a somewhat different skill set, including artistic talent. Perhaps one can view it as the difference between an engineer and a mathematician? Both of them use plenty of math, but you won't ever call them the same job.
On March 05 2013 19:13 dontgonearthecastle wrote: On the other hand, I can't see how programming skills would help, e.g., a cashier in a supermarket (assuming she started working as a cashier and then signed up for the classes). She probably won't see any connection between the code and the thing she does every day (because there is a hell of a long road between simple code and a working cash register system).
It might help her find work when the store switches to self-checkout or some RFID-based system in a few years and lays off all the cashiers.
On March 05 2013 19:13 dontgonearthecastle wrote: On the other hand, I can't see how programming skills would help, e.g., a cashier in a supermarket (assuming she started working as a cashier and then signed up for the classes). She probably won't see any connection between the code and the thing she does every day (because there is a hell of a long road between simple code and a working cash register system).
It might help her find work when the store switches to self-checkout or some RFID-based system in a few years and lays off all the cashiers.
but why is "learn to code" the only "useful" thing anyone can think of to learn about these days? It's depressing. I don't want us to become a "nation of coders", that's super lame. Anyway, how many coding jobs can there really be? You gonna soak up all the technological unemployment with coding? I'm skeptical.
I feel like my CS professor wants to discourage us from delving into programming. Not literally, but I think he means to filter out the people who were just curious about computer science from those who have a real knack for it. Some of the problems we tackle for homework give me so much grief. I literally sit around for hours trying to debug my program, only to have three more lined up behind it that I need to complete before the night is over.
On March 05 2013 19:13 dontgonearthecastle wrote: On the other hand, I can't see how programming skills would help, e.g., a cashier in a supermarket (assuming she started working as a cashier and then signed up for the classes). He/she probably won't see any connection between the code and the thing he/she does every day (because there is a hell of a long road between simple code and a working cash system).
It might help her find work when the store switches to self-checkout or some RFID-based system in a few years and lays off all the cashiers.
But why is "learn to code" the only "useful" thing anyone can think of to learn about these days?
You will have to ask someone who agrees with that statement.
Anyway, how many coding jobs can there really be?
No idea. Probably not many in the traditional sense of being a "programmer." But I could see a future where most jobs require some coding, or close cooperation with someone who can code.
You gonna soak up all the technological unemployment with coding? I'm skeptical.
I'm not gonna do anything. These things are happening, for better or worse. I suspect the people who claim to know whether the net effect will be positive or negative (and many people are making very confident predictions of a coming golden age or impending social collapse) are actually making stuff up.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thought is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others take a more mixed view. In the broadest sense it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do the heavy lifting that "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. By that logic, only people who write machine code are real programmers, because all other languages are just simplified versions of it... or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... ah well, I'm not a programmer, so I might be wrong.
He's right when he says that, as a web developer, he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages can control.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platform.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, and so on, since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
Your reasons make no sense...
1) Your pay doesn't certify your capability. I'm sure there are web developers in the world who earn 30K more than you.
2) Since when do your tools define your role? Especially in the modern age, where 99% of programming is done within application frameworks (either in-house or standard) with managed memory.
3) Agreeable to a point, but if you disqualify people based on complexity then you would need to disqualify just about everyone in the programming scene except maybe the top 10%. Especially with web technologies being so essential and prevalent in today's IT landscape.
4) What? No design patterns/unit tests/data structures? What the? I'm not even a web dev, but I know this is not true; the quality of the web people you come across must be very low indeed...
When I first started programming in high school, it was intimidating. Seeing all the weird notation, numbers, and whatnot - like #include, while(condition) {, if(condition), and blah blah blah - made me think programming was hard because I had to remember so much crap. And then I'd have to worry about the actual programming afterwards! God, this will kill me.
Not the case.
As a college student, I spend more time figuring out why the flip my program doesn't work. I spend more time breaking down what I need to do, working towards it, and putting everything together.
The programming is easy but long; it just takes a lot of time. It's the thinking and problem solving behind the programming that's challenging.
And when you do figure out the problem, you just feel so Einstein.
On March 05 2013 02:47 Tobberoth wrote: I personally feel that most people probably won't benefit from learning how to code. Coding is fun and rewarding, but there are few practical situations outside of certain professions. I mean anyone can learn how to do simple I/O and calculation with python, but what are they going to use it for? Fact is, most programs you might want to make are already available for free online, made by people who have coded for a long time.
It's like with most forms of manufacturing. You could go out into the woods and carve some wood and make your own bow... or you could just go out and buy a cheap bow which will probably outperform your homemade bow easily.. and when it comes to coding, you don't even buy anything, everything is supplied for free.
This is like saying people shouldn't try to learn algebra and calculus in school if they aren't going to become physicists or economists or whatever. Or that they shouldn't study philosophy if they aren't going to be a humanities major/lawyer/stuffy academic.
Learning to program teaches people problem solving and language comprehension in the abstract.
Almost everything you could want is ready-made if you can beg, borrow, or steal. But if you ever want to make or do something of your own, it helps to have fundamental skills that go beyond computing.
On March 05 2013 02:41 thedeadhaji wrote: But because the written and spoken language drives much of the world today, it's important to have a strong grasp of language, be able see how it is being used for or against you and be able to use it to your advantage.
The ability to read and write code is similar.
I'm having trouble with this. Programming languages allow us to define rigorous abstract machines, while human language is often used "for or against us" in very wishy-washy, inexact ways (think legalese). One of the first things you will realize when working as a professional programmer is that you are bound by things like truth and practicality, but the marketing people who direct your company are not. C can't lie but English can.
One example would be the capacity to understand and think critically about how "this thing in my hand" (a phone) can record, track, and send data to some third party against my will: what data we're giving up, and what recourse we have if we disagree with it. At some point you'd want to think about how the data is handled internally, and that gets down to the code level of the device. Even without programming knowledge you can get to the point of "hey, I don't like what this thing is doing or could potentially do," but to think about "what can I do against it? What could be done differently?" you start to need working knowledge of what is going on inside the guts of the machine.
For instance, I deal with Android a lot in my day job, and you start to realize that while the current system is built a certain way, there are many other ways the same or similar functionality could be realized through alternate implementations, from the hardware layer up through the driver, HAL, framework, and application layers.
Totally agree. I really believe that we'll see programming as a pretty basic set of skills in future schooling.
I program even though I'm a graphic designer. It empowers me because I can make rapid prototypes, etc. And I do believe that anyone can benefit from learning it: it's a certain logical way of thinking that gives people a mental model for dealing with all the digital stuff in the world today. And I find that it is a very good teacher for learning how to problem-solve in general.
Plus you're using a computer for its intended purpose! Hooray for universal machines.
As a technical writer, most of our writing these days is produced as XML, with a publishing server and stylesheets for easier output to different formats. Aside from the fact that knowing how to code would make it easier for me to decipher customer/dev/QE needs, I actually need it as a technical skill for producing/fixing stylesheets, improving our tooling, and knowing how the server works. You don't NEED to know it, but obviously if you know it and someone else doesn't, one of you is more likely to be laid off than the other; it both broadens your horizons in terms of work and makes you more of an expert in your field. This is just how our world is - technology is progressing, and you need to know how to use it to its fullest if you want to get ahead...
On March 05 2013 04:24 Roe wrote: I'm a research assistant at my university in the psych department, and my boss had me do this very mundane but also annoying task of looking at two lists of data and seeing how much time each of the lists had in common for a certain variable:
"a" 05 05
"a" 03 03
"b" 05 10
"b" 02 05
and so on. So she expected me to calculate it by hand, over and over again, and there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that: I could simply input the filename and have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
STATA, SAS, R, JMP - any statistics program really - can do that for you. Or perhaps I am not understanding the task you were given.
And no, writing commands in any of the above-mentioned programs is, imo, not coding. I am able to do it without any formal training, nor have I had any IT courses. I am your run-of-the-mill M.D. who admittedly grew up with a computer from age 5.
EDIT: To briefly comment on the OP: it would be a waste of time for someone like me to learn actual coding. I treat humans, not machines. What good would it do me to be able to write C? Coding seems to be the new buzz-education, and whilst I do believe it will be more widespread in the future, I honestly doubt we are going to see any coding language taught in primary school any time soon. Why do you think Apple is so popular? It is because of the intuitive use - allowing any dimwit to pick up an iPhone and navigate it fairly successfully.
On March 05 2013 04:24 Roe wrote: I'm a research assistant at my university in the psych department, and my boss had me do this very mundane but also annoying task of looking at two lists of data and seeing how much time each of the lists had in common for a certain variable:
"a" 05 05
"a" 03 03
"b" 05 10
"b" 02 05
and so on. So she expected me to calculate it by hand, over and over again, and there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that: I could simply input the filename and have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
STATA, SAS, R, JMP - any statistics program really - can do that for you. Or perhaps I am not understanding the task you were given.
And no, writing commands in any of the above-mentioned programs is, imo, not coding. I am able to do it without any formal training, nor have I had any IT courses. I am your run-of-the-mill M.D. who admittedly grew up with a computer from age 5.
EDIT: To briefly comment on the OP: it would be a waste of time for someone like me to learn actual coding. I treat humans, not machines. What good would it do me to be able to write C? Coding seems to be the new buzz-education, and whilst I do believe it will be more widespread in the future, I honestly doubt we are going to see any coding language taught in primary school any time soon. Why do you think Apple is so popular? It is because of the intuitive use - allowing any dimwit to pick up an iPhone and navigate it fairly successfully.
And what format would those programs need your data to be in? That's something someone with little coding experience would likely forget to think about. In some cases, even though there are plenty of programs that do the same thing, you may need to completely reformat your data before those programs can do their thing. For small jobs like that, it's often easier to write your own program than to figure out how to reformat your entire collection of files into a format some other program can handle.
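For a sense of scale, the throwaway script being described is only a few lines. A hedged sketch in Python: the file layout (a label and two time columns per line, as in Roe's sample) and the exact "time in common" measure are assumptions, as is the directory name.

```python
# Hypothetical sketch of the comparison task described above. Assumes each
# file holds lines like:  "a" 05 10   (label, time in list 1, time in list 2).
import glob

def time_in_common(path):
    total = 0
    with open(path) as f:
        for line in f:
            if not line.strip():
                continue  # skip blank lines
            label, t1, t2 = line.split()
            total += min(int(t1), int(t2))  # overlap between the two times
    return total

for path in sorted(glob.glob("data/*.txt")):  # the ~300 files, no hand work
    print(path, time_in_common(path))
```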
Like many people have said, programming has benefits across various domains. On one hand, programming will teach you more about the hardware you are working with, whether it's a phone, a laptop, or a specialized piece of equipment. Beyond that, though, programming has significantly improved my problem-solving skills.
The real core of what programming teaches you is not how to get from point A to point B in some arcane language but how to break a problem down into manageable pieces, and how to approach a problem that at first glance seems far too complex or difficult to solve. It's critical thinking in its most distilled form.
Programming's problem solving is very useful with computers in general. Once you've learned a bit, all the mysterious errors and confusing instructions for that new program you just bought suddenly make sense. And if they don't, you have a framework of knowledge you can rely on to figure them out.
On March 05 2013 04:24 Roe wrote: I'm a research assistant at my university in the psych department, and my boss had me do this very mundane but also annoying task of looking at two lists of data and seeing how much time each of the lists had in common for a certain variable:
"a" 05 05
"a" 03 03
"b" 05 10
"b" 02 05
and so on. So she expected me to calculate it by hand, over and over again, and there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that: I could simply input the filename and have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
STATA, SAS, R, JMP - any statistics program really - can do that for you. Or perhaps I am not understanding the task you were given.
And no, writing commands in any of the above-mentioned programs is, imo, not coding. I am able to do it without any formal training, nor have I had any IT courses. I am your run-of-the-mill M.D. who admittedly grew up with a computer from age 5.
EDIT: To briefly comment on the OP: it would be a waste of time for someone like me to learn actual coding. I treat humans, not machines. What good would it do me to be able to write C? Coding seems to be the new buzz-education, and whilst I do believe it will be more widespread in the future, I honestly doubt we are going to see any coding language taught in primary school any time soon. Why do you think Apple is so popular? It is because of the intuitive use - allowing any dimwit to pick up an iPhone and navigate it fairly successfully.
And what format would those programs need your data to be in? That's something someone with little coding experience would likely forget to think about. In some cases, even though there are plenty of programs that do the same thing, you may need to completely reformat your data before those programs can do their thing. For small jobs like that, it's often easier to write your own program than to figure out how to reformat your entire collection of files into a format some other program can handle.
Like many people have said, programming has benefits across various domains. On one hand, programming will teach you more about the hardware you are working with, whether it's a phone, a laptop, or a specialized piece of equipment. Beyond that, though, programming has significantly improved my problem-solving skills.
The real core of what programming teaches you is not how to get from point A to point B in some arcane language but how to break a problem down into manageable pieces, and how to approach a problem that at first glance seems far too complex or difficult to solve. It's critical thinking in its most distilled form.
Programming's problem solving is very useful with computers in general. Once you've learned a bit, all the mysterious errors and confusing instructions for that new program you just bought suddenly make sense. And if they don't, you have a framework of knowledge you can rely on to figure them out.
What are you trying to argue? The guy stated that no program could compare two lists of data, which is quite obviously not correct. Yes, of course you should think about compatibility when creating a database, but I surely do not expect that to be an issue at a research institute.
Whilst programming is a nice tool set, you are hyping it up to be way more useful than it actually is - a classic fault when one has specialized in (or has a huge interest in) a field. Are you, for instance, going to study cellular chemistry because it will teach you more about how your body works? Let me answer that for you: the vast majority of the population does not - same as with programming.
Critical thinking is present in every logically rooted subject out there and is by no means limited to programming, nor is programming an example of a higher level of critical thinking than, for instance, math. If you want to use improved critical thinking as an argument, you are going to have to argue that one should study EVERYTHING - a notion that I, as an academic person, do not disagree with, but on a practical level I realize it is impossible. The reality of the modern world is that every field is getting so specialized that you have to put in a disproportionate amount of time for what are very often marginal gains. There is a reason we no longer see polymaths like Aristotle.
Please note that I am not denying that programming has merits. I am simply arguing that it is not worth everyone's time, and that the gains can be marginal to non-existent in a lot of cases.
EDIT: For clarification: I am essentially reiterating the point made by thedeadhaji - the diminishing returns exist. And furthermore, seeing how hard a time we have teaching today's kids reading and writing, I think teaching them coding is of lesser concern.
I code because I have no artistic talents, I can't draw/paint/model worth shit.
Writing code is easy; deciding how to architect the software is the hard part. Working in teams and dealing with others can be tricky too. Many programmers/coworkers have inflated egos; there are just too many badasses in this world...
On March 06 2013 12:09 TheSwedishFan wrote: I think we should learn how to cook instead. A more valuable skill to have.
I kind of agree. Sure, coding can be fun and a good hobby, with great gains and benefits, especially if it sparks an interest in pursuing the profession, but it's not magically a better skill to learn than many others. I don't think understanding how computers work, or how to program, makes you a lot better at using them or their programs. On average, learning it to a decent level consumes a lot of time as well, especially if you don't already use computers proficiently (I would think; just a personal opinion). If you're studying economics and are choosing between the basics of programming and a good course on oral communication skills, I would recommend the latter (though both are good if you have enough time).
Coding isn't for everyone. But I think you guys are reading too much into the YouTube video. I see it as an attempt to fight some of the social stigmatization (at least here in the States) of showing interest in computing at a young age.
On March 05 2013 04:32 Shady Sands wrote: That video just sounds like a whole bunch of tech guys saying: "please, we need a greater supply of people who code out of love so we can work them harder and pay them less"
I agree, though I don't think Bill Gates really hides his motives in public anyway; he has spoken about the labor dynamics several times in the past. That being said, I think this is an innocent enough video for encouraging kids to become more interested in programming.
Propaganda like this (ultimately similarly motivated) pops up all the time on TV for other fields, like math and the hard sciences.
The profit motives are "offset" by the social "good" of these types of videos. God knows we need more engineers - software or otherwise. There are enough lawyers, bankers, economists, and consultants in this country.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thought is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others take a more mixed view. In the broadest sense it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do the heavy lifting that "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. By that logic, only people who write machine code are real programmers, because all other languages are just simplified versions of it... or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... ah well, I'm not a programmer, so I might be wrong.
He's right when he says that, as a web developer, he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages can control.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platform.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, and so on, since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
Your reasons make no sense...
1) Your pay doesn't certify your capability. I'm sure there are web developers in the world who earn 30K more than you.
2) Since when do your tools define your role? Especially in the modern age, where 99% of programming is done within application frameworks (either in-house or standard) with managed memory.
3) Agreeable to a point, but if you disqualify people based on complexity then you would need to disqualify just about everyone in the programming scene except maybe the top 10%. Especially with web technologies being so essential and prevalent in today's IT landscape.
4) What? No design patterns/unit tests/data structures? What the? I'm not even a web dev, but I know this is not true; the quality of the web people you come across must be very low indeed...
Sure they do.
1) Across the board, back-end devs make more than front-end... if there is a web dev who makes more than me, there is a back-end dev at his work who makes more than him; that's the comparison to be made.
2) Analogy: the training/education that goes into designing cars is more advanced than the training/education that goes into constructing them. Furthermore, the training/education that goes into constructing them is more advanced than the training/education that goes into driving them. It is very much the same with programming; the knowledge requirement for web development sits somewhere between the construction and driving parts of the car analogy.
3) You agreed for the most part, but to what you stated let me say this: poorly designed web applications still work, because the basic structure has already been set up (i.e., the browser already knows how to draw frames; you just have to hook yours up to it). Most of the time, the people designing these poor applications don't do it in a way that allows them to change or scale easily, and it's usually due to not knowing how to design the application (a lack of design-pattern understanding).
4) All the devs I work with are very qualified and competent; in fact, a few are no longer with us because they couldn't cut it. I stand by my statement, though: you don't need to know much of #4 to be a successful web developer. Knowing it makes you a good web developer, but it's simply a fact that there are fewer UI design patterns (stuff a web dev should know) than the vast array of creational, structural, architectural, behavioral, etc. design patterns (stuff a back-end programmer should know). It's mostly because you can only do so much with scripts and parsers; object-oriented design, which is not even touched by web devs, carries with it a bunch of design patterns that simply aren't applicable in a scripting world. I am aware that in some web languages you can mock domain objects, but in truth you are basically told what the model is and therefore aren't applying any sort of design to it.
You will have to explain to me what type of web dev you are referring to; it looks like you are talking about the type that builds static or basic dynamic eCommerce sites using Drupal/PHP.
Web applications like Stack Overflow / Reddit / Facebook / Twitter are built by web developers as well, and I have a hard time believing those mammoth applications can be done without the knowledge you listed.
On March 05 2013 04:24 Roe wrote: and so on. So she expected me to calculate it by hand, over and over again, and there were about 300 files I had to compare. So, thinking like a true programmer (being lazy), I wrote up a program that did exactly that: I could simply input the filename and have my values instantly. To my knowledge there wasn't any program that could do this, or that could do it as precisely and specifically as I needed, so luckily I could write one myself.
This is a great example of what I was talking about in my prior post. Coding is invaluable in all areas of academia.
Yeah, my boss wanted a graph showing a collection of data that was spread across a ton of separate files. This would've taken a long time to do manually, so I simply wrote a little program that read all the files and showed the data in a nice graph.
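Presumably something along these lines - a minimal sketch assuming one numeric value per line in each file, an invented directory name, and that matplotlib is installed:

```python
# Minimal sketch of "read a pile of files, plot the combined data".
import glob
import matplotlib.pyplot as plt

values = []
for path in sorted(glob.glob("results/*.txt")):
    with open(path) as f:
        values.extend(float(line) for line in f if line.strip())

plt.plot(values)
plt.xlabel("sample")
plt.ylabel("value")
plt.title("All files combined")
plt.show()
```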
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thought is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others take a more mixed view. In the broadest sense it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do the heavy lifting that "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. By that logic, only people who write machine code are real programmers, because all other languages are just simplified versions of it... or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... ah well, I'm not a programmer, so I might be wrong.
He's right when he says that, as a web developer, he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages can control.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platform.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, and so on, since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
Your reasons make no sense...
1) Your pay doesn't certify your capability. I'm sure there are web developers in the world who earn 30K more than you.
2) Since when do your tools define your role? Especially in the modern age, where 99% of programming is done within application frameworks (either in-house or standard) with managed memory.
3) Agreeable to a point, but if you disqualify people based on complexity then you would need to disqualify just about everyone in the programming scene except maybe the top 10%. Especially with web technologies being so essential and prevalent in today's IT landscape.
4) What? No design patterns/unit tests/data structures? What the? I'm not even a web dev, but I know this is not true; the quality of the web people you come across must be very low indeed...
Sure they do.
1) Across the board, back-end devs make more than front-end... if there is a web dev who makes more than me, there is a back-end dev at his work who makes more than him; that's the comparison to be made.
2) Analogy: the training/education that goes into designing cars is more advanced than the training/education that goes into constructing them. Furthermore, the training/education that goes into constructing them is more advanced than the training/education that goes into driving them. It is very much the same with programming; the knowledge requirement for web development sits somewhere between the construction and driving parts of the car analogy.
3) You agreed for the most part, but to what you stated let me say this: poorly designed web applications still work, because the basic structure has already been set up (i.e., the browser already knows how to draw frames; you just have to hook yours up to it). Most of the time, the people designing these poor applications don't do it in a way that allows them to change or scale easily, and it's usually due to not knowing how to design the application (a lack of design-pattern understanding).
4) All the devs I work with are very qualified and competent; in fact, a few are no longer with us because they couldn't cut it. I stand by my statement, though: you don't need to know much of #4 to be a successful web developer. Knowing it makes you a good web developer, but it's simply a fact that there are fewer UI design patterns (stuff a web dev should know) than the vast array of creational, structural, architectural, behavioral, etc. design patterns (stuff a back-end programmer should know). It's mostly because you can only do so much with scripts and parsers; object-oriented design, which is not even touched by web devs, carries with it a bunch of design patterns that simply aren't applicable in a scripting world. I am aware that in some web languages you can mock domain objects, but in truth you are basically told what the model is and therefore aren't applying any sort of design to it.
As a primarily back-end developer and founder of a software company (with experience in the industry before founding the company), I wholeheartedly disagree. You are probably talking about people who do nothing but produce static HTML/CSS or skin PSDs.
I do know that most back-end devs think 1, 2, and 4 are true. They are wrong, and they have over-inflated egos about how hard architecting back-end systems is. I don't know if you understand some of the challenges that web developers face (the ones who work on hugely used applications). They have to be full-stack developers, and they are paid accordingly. Chumps, on the other hand, are compensated as chumps, whether they are back-end developers who do nothing but follow a spec in a waterfall-style shop or front-end developers who are given menial design-focused tasks.
I'm actually thinking of learning a language. I see mixed reviews on which language to learn... the argument always seems to be between Ruby and Python, but others say to learn Java, and so forth. I'm most interested in building a few apps for personal use and possibly some simple scripts for work. Anyone have any suggestions?
On March 05 2013 22:17 Smoot wrote: Maybe it is me, but I see three pages of comments that can be summed up as "coding is useful," and I really don't chalk that up to a real discussion of the subject.
Speaking another language is useful, and so is knowing how to build a fire if you are lost in the woods overnight. Does that mean that everyone should be introduced to these skills?
In this regard, yes, coding is useful, but should everyone be introduced to coding? Of course this discussion has been one-sided, as it is being had by computer-literate folks (this being TeamLiquid), but I'm surprised I'm not seeing the other side of the argument yet.
Anyhow, being older than the crowd here, I will pull the "time is finite" card and simply say that not all skills are worth the time invested in them. Sure, there are examples of coding cutting a project's time tenfold (I have had my share at work too), but ultimately, if the company has a group of programmers to tackle these issues, the part of the project that needs that work will be thrown to them.
The flip side is that if you are working at a start-up with 10 people, then yeah, it would be nice to have more self-sufficient workers. But then again, if you have 2 programmers in the group, why would the other 8 need the skill? Likewise, if 2 people knew how to speak 3-4 languages, why would the others need those skills?
My point is that it is not feasible to be specialized in a skill set (which is a requirement in society today) and also have working knowledge of all other skills. Time is finite, and it just isn't worth it.
(looking forward to seeing a more in depth discussion of this)
-EDIT- As background: I'm an engineer and I code often.
Well, a skill's value is determined by how relevant it is in society and everyday life. Making a fire is badass but pretty useless, because how many times do you have to survive in a forest nowadays? Turn that around: how many people are in contact with and work with computers on a daily basis? I agree that it's not for everyone. But it's not only about the actual programming per se; it's more than that - it's a way of thinking that definitely improves certain mental capabilities.
btw, I personally see it more as scripting, since "real" programming often takes too much technical knowledge, whereas scripting is accessible to most people and still provides the same mental exercise.
On March 07 2013 12:43 Darpa wrote: I'm actually thinking of learning a language. I see mixed reviews on which language to learn... the argument always seems to be between Ruby and Python, but others say to learn Java, and so forth. I'm most interested in building a few apps for personal use and possibly some simple scripts for work. Anyone have any suggestions?
Do Python. It's a scripting language, but it has a very wide range of uses. Don't go down the road of something like Java; you won't utilize its power if all you want is something simple.
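For the "simple scripts for work" part of the question, a first Python script often looks something like this hypothetical one (the file name and column are invented for the example):

```python
# Hypothetical starter script: total one column of a CSV report.
import csv

total = 0.0
with open("sales_report.csv", newline="") as f:
    for row in csv.DictReader(f):
        total += float(row["amount"])  # assumes an "amount" column exists

print(f"Total: {total:.2f}")
```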
On March 07 2013 12:43 Darpa wrote: I'm actually thinking of learning a language. I see mixed reviews on which language to learn... the argument always seems to be between Ruby and Python, but others say to learn Java, and so forth. I'm most interested in building a few apps for personal use and possibly some simple scripts for work. Anyone have any suggestions?
Do a language that fits whatever you'll spend time on. Something you have to do for school or work anyway is always a good choice.
On March 07 2013 12:43 Darpa wrote: I'm actually thinking of learning a language. I see mixed reviews on which language to learn... the argument always seems to be between Ruby and Python, but others say to learn Java, and so forth. I'm most interested in building a few apps for personal use and possibly some simple scripts for work. Anyone have any suggestions?
Do Python. It's a scripting language, but it has a very wide range of uses. Don't go down the road of something like Java; you won't utilize its power if all you want is something simple.
Can you build basic apps with Python? At this point I'm so clueless about the differences that I have no idea. The interwebs suggest Ruby for apps but Python for working in the computer world... The amount of scripting I'll do will be almost none; it will be more about whether I can figure something out at work, although it's not required by my job. It's just something I'm interested in.
Of course it can. You will have to define what you mean by apps, though: is it a web application? A mobile application? A desktop GUI application? A background service?
Look at the list of interest areas that Python is supported in.
Ruby is good also, but there is not much traction if you want to build a basic app with a GUI (it can be done, but it is not newb-friendly); the most popular usage for Ruby is the Ruby on Rails framework (websites, web applications).
You will want to match the language to what you are doing.
Scratch is a language by MIT for beginners to learn. I watched a TED talk about it, but it looks like it's not very powerful. As in, it can't do shit. Python can be used for desktop apps, but I think Objective-C is used more for GUI apps on the PC, and C# on mobile devices. Afaik Ruby is used for websites, as haduken says.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thought is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others take a more mixed view. In the broadest sense it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do the heavy lifting that "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. By that logic, only people who write machine code are real programmers, because all other languages are just simplified versions of it... or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... ah well, I'm not a programmer, so I might be wrong.
He's right when he says that, as a web developer, he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages can control.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platform.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, and so on, since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
Your reasons make no sense...
1) Your pay doesn't certify your capability. I'm sure there are web developers in the world who earn 30K more than you.
2) Since when do your tools define your role? Especially in the modern age, where 99% of programming is done within application frameworks (either in-house or standard) with managed memory.
3) Agreeable to a point, but if you disqualify people based on complexity then you would need to disqualify just about everyone in the programming scene except maybe the top 10%. Especially with web technologies being so essential and prevalent in today's IT landscape.
4) What? No design patterns/unit tests/data structures? What the? I'm not even a web dev, but I know this is not true; the quality of the web people you come across must be very low indeed...
Sure they do.
1) Across the board, back-end devs make more than front-end... if there is a web dev who makes more than me, there is a back-end dev at his work who makes more than him; that's the comparison to be made.
2) Analogy: the training/education that goes into designing cars is more advanced than the training/education that goes into constructing them. Furthermore, the training/education that goes into constructing them is more advanced than the training/education that goes into driving them. It is very much the same with programming; the knowledge requirement for web development sits somewhere between the construction and driving parts of the car analogy.
3) You agreed for the most part, but to what you stated let me say this: poorly designed web applications still work, because the basic structure has already been set up (i.e., the browser already knows how to draw frames; you just have to hook yours up to it). Most of the time, the people designing these poor applications don't do it in a way that allows them to change or scale easily, and it's usually due to not knowing how to design the application (a lack of design-pattern understanding).
4) All the devs I work with are very qualified and competent; in fact, a few are no longer with us because they couldn't cut it. I stand by my statement, though: you don't need to know much of #4 to be a successful web developer. Knowing it makes you a good web developer, but it's simply a fact that there are fewer UI design patterns (stuff a web dev should know) than the vast array of creational, structural, architectural, behavioral, etc. design patterns (stuff a back-end programmer should know). It's mostly because you can only do so much with scripts and parsers; object-oriented design, which is not even touched by web devs, carries with it a bunch of design patterns that simply aren't applicable in a scripting world. I am aware that in some web languages you can mock domain objects, but in truth you are basically told what the model is and therefore aren't applying any sort of design to it.
As a primarily back-end developer and founder of a software company (with experience in the industry before founding the company), I wholeheartedly disagree. You are probably talking about people who do nothing but produce static HTML/CSS or skin PSDs.
I do know that most back-end devs think 1, 2, and 4 are true. They are wrong, and they have over-inflated egos about how hard architecting back-end systems is. I don't know if you understand some of the challenges that web developers face (the ones who work on hugely used applications). They have to be full-stack developers, and they are paid accordingly. Chumps, on the other hand, are compensated as chumps, whether they are back-end developers who do nothing but follow a spec in a waterfall-style shop or front-end developers who are given menial design-focused tasks.
Fair enough, though it seems you're giving me a wishlist of what a web dev is supposed to be. The reality is quite different everywhere but in small startups, which I'm quite familiar with, where most developers assume multiple development roles. Also, I'm sure most web devs aren't good web devs. There are tons of chumps out there, and it's easier to be a chump hired as a web dev than a chump hired as a back-end developer.
1. I'm not going to send you a link comparing web-dev salaries with, say, Java developer median salaries. Just google it. There's the reality of the situation, and then there's your opinion of it.
2. I don't know how you can argue this. Which is easier to get hired as, a web dev or a systems architect? Which requires more schooling/knowledge to get hired?
3. I see you didn't dispute this.
4. This has absolutely nothing to do with ego. It's simple reasoning: web devs don't compile or build, they don't deploy, they don't manage repositories, they don't manage database structures, they don't model, they don't design meat-and-potatoes services, and they don't use a language that models data structures as objects.
Some web devs are very good at their job and are capable of "programming across the stack" (a common job-posting phrase), but in general I'm pretty sure my statements are true and not anecdotal.
On March 05 2013 05:58 Logo wrote: When people say everyone should learn to program, I really see a few skills that we want to teach, and they are skills that I think programming CAN teach, which is also important.
1. Boolean logic: operators and their application. Being able to really understand if, and, or, if-and-only-if, xor, and not in a formal setting is a valuable skill, and it's applicable to a wide array of problems. Likewise with the core concept of the other standard operators, like for and while. It's just a way of thinking about the steps of a solution that's pretty handy.
2. Breaking down systems. A big part of programming, after all, is how to structure and build complex systems, or solve complex problems, one piece at a time. Again, knowing how to do this is going to apply to a lot of things in life.
3. Understanding, at some level, how computers work. Knowing at some level how a computer is structured is going to be helpful in today's world. It's pretty worthwhile to know, say, the difference between a browser, the World Wide Web, and the Internet, and knowing something about comp sci/programming is going to help with that.
Another one I feel like you learn is "don't be afraid to look up how to do what you want to do [on the internet]." Since so much of programming is figuring out HOW you want to approach a problem (and since it's generally done on a computer anyway), it avoids the stigma that "looking up an answer is cheating": once you have the approach to the problem down, you've done the important thinking part, and you're just looking up the execution.
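Point 1 in particular maps directly onto code. A small illustrative Python snippet (the rainy-day scenario is made up):

```python
# The formal connectives from point 1, written out in Python.
raining = True
have_umbrella = False

print(raining and not have_umbrella)  # and/not: caught in the rain -> True
print(not raining or have_umbrella)   # "if raining then umbrella" as not/or
print(raining != have_umbrella)       # xor: exactly one is true -> True
print(raining == have_umbrella)       # "if and only if" -> False

# ...and the standard loop operators mentioned alongside them:
for day in range(3):          # a bounded "for"
    print("day", day)
while not have_umbrella:      # a conditional "while"
    have_umbrella = True      # condition flips, so the loop ends
```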
I think there is a distinction between learning what coding and programming is and learning how to code and program. In much the same way that in this day and age, people need to know how to use a computer to be employable, but they don't have to understand how and why the computer works the way it does.
Software construction, architecture, and engineering are deep fields that many people will never need to know how to deal with. But, just like someone should learn how to change a flat tire or change the oil of their car, people should probably learn what programming is and how to interface with it in appropriate contexts.
Programming really gets interesting when you are actually coding stuff instead of just inputting shit or copying what other people have coded, and if you are memorising a lot of syntax etc., it gets boring fast as well.
On March 07 2013 23:51 obesechicken13 wrote: Scratch is a language by MIT for beginners to learn. I watched a TED talk about it, but it looks like it's not very powerful. As in, it can't do shit. Python can be used for desktop apps, but I think Objective-C is used more for GUI apps on the PC, and C# on mobile devices. Afaik Ruby is used for websites, as haduken says.
You've got it the other way around: C# is used more for GUIs on the PC (WPF/WinForms), but it isn't limited to that. If you are doing C#, you are going to leverage .NET and its vast number of frameworks and libraries. And you don't have to do it in C#; you can use any language on the .NET CLR.
There is almost no presence of C# on mobile, unless we are counting Windows Phone or Ximian/Mono.
If you know C syntax and need to build a GUI app for the PC quickly, C# will probably get you there very fast.
Objective-C, for all intents and purposes, is exclusive to the Apple ecosystem.
On March 08 2013 04:12 Takkara wrote: I think there is a distinction between learning what coding and programming is and learning how to code and program. In much the same way that in this day and age, people need to know how to use a computer to be employable, but they don't have to understand how and why the computer works the way it does.
Software construction, architecture, and engineering are deep fields that many people will never need to know how to deal with. But, just like someone should learn how to change a flat tire or change the oil of their car, people should probably learn what programming is and how to interface with it in appropriate contexts.
Correct me if I'm wrong, but I took the OP's statement to mean "if you know how to code at a very basic level, you can accomplish more in whatever you do." I don't think anyone is advocating that everyone learn to code and start changing/deploying business programs or taking coder jobs!
On March 07 2013 18:18 haduken wrote: Of course it can. You will have to define what you mean by apps, though: is it a web application? A mobile application? A desktop GUI application? A background service?
Look at the list of interest areas that Python is supported in.
Ruby is good also, but there is not much traction if you want to build a basic app with a GUI (it can be done, but it is not newb-friendly); the most popular usage for Ruby is the Ruby on Rails framework (websites, web applications).
You will want to match the language to what you are doing.
Thanks for the info; I was looking at mobile apps for the most part.
On March 06 2013 01:58 HardlyNever wrote: Sorry if this is a bit off-topic, but I was wondering what the general thought is on where web development falls. I know some programmers who don't consider anything web-based to be programming or coding, while others take a more mixed view. In the broadest sense it is programming, in that you are telling a machine (a web browser) what to do, but it doesn't really do the heavy lifting that "real" programming does. To further muddy the waters, there are things like PHP that are entirely web-based but can be used to do "real" programming if needed.
I was wondering what everyone else's take on it was. As a web developer/designer I don't consider myself a "programmer" (even though I have taken programming courses), as that title can get pretty elitist, but I'm not sure what else to call it besides "web developer," if anything.
Bullshit. By that logic, only people who write machine code are real programmers, because all other languages are just simplified versions of it... or maybe I should say that when you write C++ you are scripting, because it is all just simplified machine code :D... ah well, I'm not a programmer, so I might be wrong.
He's right when he says that, as a web developer, he doesn't consider himself a "programmer." There are a few reasons why:
1. Basic reason: look at how much a Java programmer makes compared to a front-end developer... I know I make at least 30,000 (no joke) more than the web devs at my work.
2. Web languages aren't compiled languages, meaning there is a limit on how much of an individual machine those languages can control.
3. You don't need to know hardcore algorithms to solve problems as a web dev, and you aren't concerned with machine performance, since the big-company web browsers are your platform.
4. Similarly, you don't need to know design philosophy, design patterns, unit testing, application deployment, database structure (normalization, etc.), application integration, and so on, since all of that is pre-defined as a robust back-end application + database (on most big-time front-end applications).
Your reasons make no sense...
1) Your pay doesn't certify your capability. I'm sure there are web developers in the world who earn 30K more than you.
2) Since when do your tools define your role? Especially in the modern age, where 99% of programming is done within application frameworks (either in-house or standard) with managed memory.
3) Agreeable to a point, but if you disqualify people based on complexity then you would need to disqualify just about everyone in the programming scene except maybe the top 10%. Especially with web technologies being so essential and prevalent in today's IT landscape.
4) What? No design patterns/unit tests/data structures? What the? I'm not even a web dev, but I know this is not true; the quality of the web people you come across must be very low indeed...
Sure they do.
1) Across the board, back-end devs make more than front-end... if there is a web dev who makes more than me, there is a back-end dev at his work who makes more than him; that's the comparison to be made.
2) Analogy: the training/education that goes into designing cars is more advanced than the training/education that goes into constructing them. Furthermore, the training/education that goes into constructing them is more advanced than the training/education that goes into driving them. It is very much the same with programming; the knowledge requirement for web development sits somewhere between the construction and driving parts of the car analogy.
3) You agreed for the most part, but to what you stated let me say this: poorly designed web applications still work, because the basic structure has already been set up (i.e., the browser already knows how to draw frames; you just have to hook yours up to it). Most of the time, the people designing these poor applications don't do it in a way that allows them to change or scale easily, and it's usually due to not knowing how to design the application (a lack of design-pattern understanding).
4) All the devs I work with are very qualified and competent; in fact, a few are no longer with us because they couldn't cut it. I stand by my statement, though: you don't need to know much of #4 to be a successful web developer. Knowing it makes you a good web developer, but it's simply a fact that there are fewer UI design patterns (stuff a web dev should know) than the vast array of creational, structural, architectural, behavioral, etc. design patterns (stuff a back-end programmer should know). It's mostly because you can only do so much with scripts and parsers; object-oriented design, which is not even touched by web devs, carries with it a bunch of design patterns that simply aren't applicable in a scripting world. I am aware that in some web languages you can mock domain objects, but in truth you are basically told what the model is and therefore aren't applying any sort of design to it.
As a primarily back-end developer and founder of a software company (with experience in the industry before founding the company), I wholeheartedly disagree. You are probably talking about people who do nothing but produce static HTML/CSS or skin PSDs.
I do know that most back-end devs think 1, 2 and 4 are true. They are wrong, and they have overinflated egos about how hard architecting back-end systems is. I don't know if you understand some of the challenges that web developers face (the ones who work on hugely used applications). They have to be full-stack developers, and they are paid accordingly. Chumps, on the other hand, are compensated as chumps, whether they are back-end developers who do nothing but follow a spec in a waterfall-style shop or front-end developers who are given menial design-focused tasks.
Front-end web developers can easily make more than their back-end counterparts, especially when it comes to mobile. Developing rich browser-based mobile applications is a complex task, and it DOES involve object-oriented design, a complete understanding of the back end powering it, and everything in between. In the mobile world, front-end development is not simply taking a PSD and turning it into HTML; it's taking a few dozen PSDs, a UX designer's notes, API documentation, etc... and creating a rich interactive application, all within the confines of the browser, that looks and works exactly the same across multiple platforms. Similar job titles are JavaScript engineer or front-end engineer; companies looking for someone to do this work always ask for full-stack development experience, whereas back-end positions don't typically require front-end experience.
Most back-end development is fairly menial: changing database schemas, building a simple AJAX API, solving problems that have been solved hundreds of times before, where anyone who can code could look up and implement one of the available solutions very easily. Front-end development hasn't been figured out nearly as much, because browsers have been changing so much lately; new features such as Canvas and WebSockets have yet to be fully explored, so if you run into a problem that would have been impossible 9 months ago, chances are it hasn't been solved yet, or the current solution isn't optimized or adapted to the currently available tools. That means a lot more creative problem solving and a lot less implementation of already-solved problems.
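For concreteness, the kind of "simple AJAX API" being called menial here is often little more than an endpoint that returns JSON. A minimal sketch in Python, using Flask purely as an arbitrary example (neither the framework nor the endpoint name comes from this thread):

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/items")
def list_items():
    # A real back end would query a database here; a static list
    # stands in for it in this sketch.
    items = [{"id": 1, "name": "example"}]
    return jsonify({"items": items})

if __name__ == "__main__":
    app.run()

The front end would then fetch /api/items via AJAX and render the result, which is where the less-solved problems live.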
As far as the analogies go, I'd say they're pretty flawed. Web development (either front- or back-end) doesn't involve visual design at all; that's the web designer's job. If you actually think about it, building a car to spec is pretty freaking easy: if the schematics have been provided to you, all you have to do is put the parts together. The two jobs also require completely different skill sets: you don't need to understand how the thing you're building works, but you do need to know how to operate the machinery, build the parts to spec, deal with quality control, design an assembly-line system to ensure efficiency, etc... etc...
If you were to transfer this to web development, that's like saying the web designer hands you a Photoshop mockup that has all the information required to complete the development. It doesn't; it's simply a picture and maybe some notes, just like someone drew the car and gave you some notes on how fast they want it, how they want it to handle, etc. You have to figure out what parts are needed to take that little image and bring it to life: take the design mockups and the notes on animations and user interactions, build out the HTML and CSS to visually represent it, then add the JavaScript to make it interactive on the front end, then more JavaScript to interact with the back end, and then even more for animations, logging user interaction for later analysis (Google Analytics custom events), etc...
I guess what this all comes down to is what you're doing with a language, and how much you're actually doing yourself versus simply implementing solutions that are already available (e.g. pulling in a database sanitization library versus trying to create your own system to sanitize input). The language you do these things in is up to you; you could write a back end in C++ if you wanted to, but I tend to favor Python for this, simply because most web applications are iterated frequently. You're not building to one spec, you're constantly pushing out revisions (e.g. a new version each week with new features/changes). Python lets me quickly and easily add new features. It might not be the fastest solution at large scale, but it would take more time to write a faster-executing solution, so from a production point of view the money you save on engineers' time can be spent on a more powerful hosting solution. You get the same end result for less, and the code doesn't turn into a giant mess because people hacked in messy solutions every time they were under a deadline.
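To make the sanitization contrast concrete, here is a minimal sketch using Python's built-in sqlite3 module (the table and the hostile input are made up for illustration):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

user_input = "Robert'); DROP TABLE users;--"  # classic hostile input

# Hand-rolled approach: pasting input straight into the SQL string.
# This is exactly the injection bug a sanitization library prevents.
# conn.execute("INSERT INTO users VALUES ('%s')" % user_input)

# Library approach: the driver escapes the value for you.
conn.execute("INSERT INTO users VALUES (?)", (user_input,))
print(conn.execute("SELECT name FROM users").fetchall())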
The language you use is fairly meaningless. I have written assembly simply to inline some functions because my compiler wasn't optimizing properly; that task wasn't very challenging, just some simple find-and-replace (with the help of a debugger). I've also written complex object-oriented web applications entirely in interpreted languages (JavaScript and Python). I'm doing the same things I do in C++: creating a class system, interacting with libraries and APIs, writing some algorithms, maybe using some linear algebra, using a graphics library to build a dynamic and interactive UI. My graphics library in this case is the rendering engine in the browser.
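The claim that interpreted languages support the same class systems you would build in C++ is easy to show. A trivial Python sketch (the shape classes are just a stand-in example):

class Shape:
    # Base class: subclasses provide their own area calculation,
    # the same virtual-method dispatch you would write in C++.
    def area(self):
        raise NotImplementedError

class Rectangle(Shape):
    def __init__(self, w, h):
        self.w, self.h = w, h

    def area(self):
        return self.w * self.h

class Circle(Shape):
    def __init__(self, r):
        self.r = r

    def area(self):
        return 3.14159 * self.r ** 2

shapes = [Rectangle(2, 3), Circle(1)]
print(sum(s.area() for s in shapes))  # polymorphic calls, no casts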
TL;DR: Programming these days involves a lot of implementing existing solutions rather than creating your own; in front-end web development there are a lot of genuinely new problems because browser standards are constantly changing. Writing a rich web application in JavaScript and Python is similar to writing a desktop app in any other language, just with a web browser instead of the Win32 API and a graphics library.
On March 07 2013 18:18 haduken wrote: Of course it can. You will have to define what you mean by "apps" though: a web application? A mobile application? A desktop GUI application? A background service?
Look at the list of application areas that Python is used in.
Ruby is also good, but there isn't much traction if you want to build a basic app with a GUI (it can be done, but it's not newbie-friendly); the most popular usage for Ruby is the Ruby on Rails framework (websites, web applications).
You will want to match the language to what you are doing.
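As a quick illustration of the "basic app with GUI" case above being newbie-friendly in Python: tkinter ships with the language, so a working window takes a handful of lines (the window contents here are made up).

import tkinter as tk  # bundled with Python; no extra install needed

root = tk.Tk()
root.title("Hello")
tk.Label(root, text="A basic GUI app in a few lines").pack(padx=20, pady=10)
tk.Button(root, text="Quit", command=root.destroy).pack(pady=10)
root.mainloop()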
Thanks for the info; I was looking at mobile apps for the most part.
Mobile applications are a very dynamic space, evolving very fast.
Basically we have the Android ecosystem and the Apple ecosystem (the only two that matter right now if you want your app to have any relevance).
On Android, it's mostly Java (however, Android has its own stack of libraries and frameworks, so it's not exactly the same as standard Java).
On iPhone / iPad, it's Objective-C.
On Windows Phone 8, it's C# targeting the Windows Phone SDK.
On BlackBerry, is it HTML5 / JavaScript? Not so sure, but they haven't even been released to market yet lol.
There are third-party solutions (not from the manufacturers themselves) such as Xamarin (cross-platform, using C#/Mono) and plenty of others that utilize HTML5/JavaScript.
The trend at the moment is HTML5/JavaScript: basically designing a web app that displays appropriately in a mobile browser.
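Sticking with this thread's Python examples: the "web app in a mobile browser" approach is mostly ordinary HTML plus a viewport hint so phones scale the page sensibly. A toy sketch using only the standard library (the page content is made up):

from http.server import BaseHTTPRequestHandler, HTTPServer

PAGE = b"""<!doctype html>
<html><head>
<meta name="viewport" content="width=device-width, initial-scale=1">
<title>Mobile web app</title></head>
<body><h1>Renders sensibly on a phone browser</h1></body></html>"""

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the same page for any path in this toy example.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(PAGE)

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()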
My roommate is a chemical engineer, and we took the same general first year together. He said he HATED programming and anything to do with a deep understanding of computers, because he found it extremely repetitive and frustrating when you could not figure out what to do. Unlike written language, you do not need it for everyday life; all it is is a way to handle machines. That's it. However, the same could be said of rural areas where some people still do not know how to read or write because it is not a part of their everyday life. Another thing is that programming languages come and go very quickly. From what I have heard, Ruby used to be very useful a decade ago, but now it's almost dead. For someone who won't be working with computers that much, learning a programming language will lose its value very quickly.
All in all, I say that programming is not like written language, because it loses its value to the student VERY quickly.
On March 09 2013 13:03 WikidSik wrote: My roommate is a chemical engineer, and we took the same general first year together. He said he HATED programming and anything to do with a deep understanding of computers, because he found it extremely repetitive and frustrating when you could not figure out what to do. Unlike written language, you do not need it for everyday life; all it is is a way to handle machines. That's it. However, the same could be said of rural areas where some people still do not know how to read or write because it is not a part of their everyday life. Another thing is that programming languages come and go very quickly. From what I have heard, Ruby used to be very useful a decade ago, but now it's almost dead. For someone who won't be working with computers that much, learning a programming language will lose its value very quickly.
All in all, I say that programming is not like written language, because it loses its value to the student VERY quickly.
If all the student learns is the syntax and keywords of programming languages, then yeah, it will lose value, but that's just the rudimentary stuff; I don't see how that's different from any other discipline. Most professional programmers learn new syntax on the job.
If you want to learn cooking but only ever learned how to cut veggies, no shit you are not going to cook anything decent. If all you want is to speak French and you only ever learned how to say bonjour, no shit you suck at French.
However, the problem solving skills you developed will stay with you forever.
On March 07 2013 23:51 obesechicken13 wrote: Scratch is a language from MIT for beginners to learn. I watched a TED talk about it, but it looks like it's not very powerful. As in, it can't do shit. Python can be used for desktop apps, but I think Objective-C is used more for GUI apps on the PC and C# on mobile devices. AFAIK Ruby is used for websites, as haduken says.
You've got it the other way around: C# is used more for GUI on PC (WPF/WinForms), but it isn't limited to that. If you are doing C#, you are going to leverage .NET and its vast number of frameworks and libraries. You don't have to do it in C#; you can do it with any language that targets the .NET CLR.
There is almost no presence of C# on mobile unless we are counting Windows Phone or Xamarin / Mono.
If you know C syntax and need to put a GUI app together quickly for PC, C# will probably get you there very quickly.
Objective-C, for all intents and purposes, is exclusive to the Apple ecosystem.
Right. I shouldn't have said anything.
Don't be so self-deprecatory lol, just a little mistake ><
You're right about Scratch, I'm pretty sure it exists basically just to teach coding logic to beginners or kids who know very little (if anything) about coding.
Within this forum most people would seem to be coders. I worked on games for quite a while before I became a teacher, and the discussion, especially in Europe, is coming from education: should programming be taught in schools? As a teacher of computing I can tell you right now that programming would only captivate 5% of any year group from 11-16 (5 kids per year group); that's what I've seen over the last 10 years. Kids want games NOW, so they use those stupid programs like Scratch, RPG Maker and, to some extent, Flash, by downloading code from somewhere else, compiling it and then running it. I can tell you I tried it with 136 11-year-olds last year: 3 managed to COPY a document.write function into Small Basic and then add an input variable... 3!!!!! Out of 7 classes.
Most will say above that the programs for building things quickly are NOT stupid. I'm telling you they are, because the kids can't even recognize the if statement they create in them and then do one in Excel. I'm going to go all out and say that under-16s don't have the patience and maturity to PROPERLY code, and given the 2 hours a fortnight I see each class in total, there is not enough time being allocated to get people interested.
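For anyone outside teaching: the exercise described above is about as small as programming gets. A rough Python equivalent of that copy-a-line-of-output-and-add-an-input-variable task (the prompt text is made up):

# One input variable, one line of output: the scale of the task.
name = input("What's your name? ")
print("Hello, " + name)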
On March 10 2013 04:03 StatixEx wrote: Within this forum most people would seem to be coders. I worked on games for quite a while before I became a teacher, and the discussion, especially in Europe, is coming from education: should programming be taught in schools? As a teacher of computing I can tell you right now that programming would only captivate 5% of any year group from 11-16 (5 kids per year group); that's what I've seen over the last 10 years. Kids want games NOW, so they use those stupid programs like Scratch, RPG Maker and, to some extent, Flash, by downloading code from somewhere else, compiling it and then running it. I can tell you I tried it with 136 11-year-olds last year: 3 managed to COPY a document.write function into Small Basic and then add an input variable... 3!!!!! Out of 7 classes.
Most will say above that the programs for building things quickly are NOT stupid. I'm telling you they are, because the kids can't even recognize the if statement they create in them and then do one in Excel. I'm going to go all out and say that under-16s don't have the patience and maturity to PROPERLY code, and given the 2 hours a fortnight I see each class in total, there is not enough time being allocated to get people interested.
I've been sitting through a college business course on how to develop an app. Right now we're using Visual Studio to put shit together. These motherfuckers think this is coding. I wish they could debug my goddamn computer science homework... I shoulda just taken accounting or something easy... On some days my homework assignments are just brutal. If I don't plan properly and say "fuck it" and start coding on a blank slate, baaad things are to come... Hours are wasted trying to fix my program, and then if I ever do fix it, it comes out as some hideous creature, like that homunculus from Fullmetal Alchemist when Ed and Al tried to resurrect their mother, except in my case the professor discreetly comments on how terrible it is when he reviews it. Thank god he's Russian, so I can imagine him commenting on my work in a Russian accent.
Holy shit snuggles, u sound like me when I did my game software development degree... leaving that shit till the last minute is like sliding down a razor blade on ur balls and then landing in a jug of vinegar. And then doing it again.
On March 09 2013 13:03 WikidSik wrote: My roommate is a chemical engineer, and we took the same general first year together. He said he HATED programming and anything to do with a deep understanding of computers, because he found it extremely repetitive and frustrating when you could not figure out what to do. Unlike written language, you do not need it for everyday life; all it is is a way to handle machines. That's it. However, the same could be said of rural areas where some people still do not know how to read or write because it is not a part of their everyday life. Another thing is that programming languages come and go very quickly. From what I have heard, Ruby used to be very useful a decade ago, but now it's almost dead. For someone who won't be working with computers that much, learning a programming language will lose its value very quickly.
All in all, I say that programming is not like written language, because it loses its value to the student VERY quickly.
Ruby is far from dead, it's just not really taking off. Unless you want to specifically use Ruby on Rails, there's not all that much going on in the Ruby world; Python is just so much more popular (and faster). It's even more annoying when you use Ruby and go looking for libraries etc., and every single resource just gladly assumes you're using it for Rails applications, since every other use is so rare.