|
Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace Facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
On May 31 2012 22:27 sluggaslamoo wrote: ...snip...
These 3 languages are what I dub the "Stroustrup" stream of OO, the one most people know and understand. The less understood form of OO comes from Alan Kay, the inventor of OO and Smalltalk. It's no surprise that OOP makes a ton more sense in Smalltalk than in C++. What Java is to C++, Ruby is to Smalltalk. Matz was inspired by a lot of Kay's principles when he invented Ruby.
I don't rate a language by its features; I rate it by its foundation. When a language has a good foundation, it doesn't need a million features to make it good. You can polish a turd, but that still doesn't stop it from being a turd.
I think you're being a little unfair (or too fair to Ruby, maybe?). All these languages are descendants of Smalltalk in one way or another. If we look at Ruby's inheritance model, we find it very similar to C++/Java/et al. The truth of the matter is that Ruby sits largely at the intersection of Perl and C++, trying to combine the flexibility we find in dynamic languages with the more traditional structure we see in Java. What's worse is that Ruby's foundation is so shaky that "monkey patching" has become commonplace. Furthermore, there are large stretches of Smalltalk which both C++ and Ruby avoid. It would be a stretch to deem it the heir to Smalltalk's legacy.
I'm not saying it's a bad language (it's a fantastic language, one of my favorites), but I think you're being a little disingenuous with your criticism of C++ and Java. All languages have problems; the value of a language is not where its problems come from, but what it does to solve them.
|
On May 31 2012 22:27 sluggaslamoo wrote: ...snip... Where do you think Python fits in with all this?
|
On June 01 2012 03:38 Millitron wrote: ...snip... Where do you think Python fits in with all this?
Python is a high-level scripting language, a different branch on the tree of languages. Think along the lines of the Perl or Scheme evolutions. It is actually pretty good as a glue language, and can functionally do as much as Java/C++, though because it is not compiled it loses many advantages that Java and C++ have.
|
On May 31 2012 22:27 sluggaslamoo wrote: ...snip... I read in the meantime about Alan Kay's definition of OO. Can you explain which of the six points he uses to describe an OO language don't apply to Java?
|
After much bragging about how helpful this thread is, I have a question from a friend:
"I have hopefully a simple question regarding javascript for webpages. I am using a news ticker that scrolls messages across the screen that was developed using js. In Dreamweaver when I click the live option the ticker works as expected and the messages scroll. But when I check the actual site address as what I assume Dreamweaver is using the ticker no longer works. Any advice is much appreciated "
|
On June 01 2012 06:29 PachaL wrote: After much bragging about how helpful this thread is, I have a question from a friend:
"I have hopefully a simple question regarding javascript for webpages. I am using a news ticker that scrolls messages across the screen that was developed using js. In Dreamweaver when I click the live option the ticker works as expected and the messages scroll. But when I check the actual site address as what I assume Dreamweaver is using the ticker no longer works. Any advice is much appreciated " i don't think anyone can answer such a unspecific question. You'll have to provide some material like source code snippets and URLs.
|
On June 01 2012 01:24 tzenes wrote: ...snip...
All these languages are descendants of Smalltalk in one way or another.
The elephant in the room is extreme late binding. Ruby has duck typing, ad-hoc inheritance, open classes, and method_missing. Java has none of these. Polymorphism is the closest Java gets, but it is also an unnecessary extra layer of complexity that the programmer has to think about. Extreme late binding is what makes my code base 1/4 of the size of a Java code base of similar complexity, and amazingly readable.
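To make the late-binding point concrete, here is a minimal sketch (the AuditedProxy class and all of its names are hypothetical, made up for illustration): nothing about the proxy's interface is declared up front; every message is resolved when it arrives.
[code]
# Hypothetical sketch: "extreme late binding" via method_missing.
# AuditedProxy declares no interface ahead of time; messages are
# resolved at call time and forwarded to the wrapped object.
class AuditedProxy
  def initialize(target)
    @target = target
    @calls  = []
  end

  # Intercepts any message the proxy itself doesn't define,
  # records it, and forwards it to the target.
  def method_missing(name, *args, &block)
    @calls << name
    @target.public_send(name, *args, &block)
  end

  def respond_to_missing?(name, include_private = false)
    @target.respond_to?(name, include_private)
  end

  attr_reader :calls
end

proxy = AuditedProxy.new("hello world")
proxy.upcase   # => "HELLO WORLD" (String#upcase, resolved at runtime)
proxy.length   # => 11
proxy.calls    # => [:upcase, :length]
[/code]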
Remember I talked about foundation: C++/Java/C# will never be able to achieve something like this, because their foundations are based on static/strict typing, JIT compilation, and compile-time checks.
The truth of the matter is that Ruby sits largely at the intersection of Perl and C++, trying to combine the flexibility we find in dynamic languages with the more traditional structure we see in Java
Not really the truth of the matter. I don't see how it sits at the intersection of Perl and C++; if it does, then the same could be said of every general-purpose dynamic language. It has some Perl influence in its syntax, but that's about it.
If we look at Ruby's inheritance model, we find it very similar to C++/Java/et al
Not at all. It uses mixins for a start, which IMO is a much, much better way of avoiding the cyclic inheritance problems of multiple inheritance than simply disallowing multiple inheritance like Java does (see the sketch below). Multiple inheritance is still a necessary feature for code re-use, which is why Java creates so much code bloat in non-trivial projects. Another way of working around this is traits, which Scala uses. Ruby also has ad-hoc inheritance, and much more flexible anonymous classes.
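For illustration, a minimal mixin sketch (the module and class names are hypothetical): each module contributes behavior, but the class keeps a single superclass chain, so there is no diamond to resolve.
[code]
# Hypothetical sketch: mixins give multiple inheritance of behavior
# without a class ever having two superclasses.
module Walkable
  def walk
    "#{name} is walking"
  end
end

module Swimmable
  def swim
    "#{name} is swimming"
  end
end

class Duck
  include Walkable   # both behaviors are mixed in, yet the
  include Swimmable  # ancestry stays linear: Duck < Object
  attr_reader :name

  def initialize(name)
    @name = name
  end
end

donald = Duck.new("Donald")
puts donald.walk  # => "Donald is walking"
puts donald.swim  # => "Donald is swimming"
[/code]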
I have done a lot of Java coding and Ruby coding. I even used to be somewhat of a Java elitist before I expanded my horizons to lots of other languages. I have to tell you that the process for OO is so different in Ruby and Java that I have a completely different mindset when coding in each language. Between C++/Java/C#, however, I am using the same cookie-cutter OO mindset.
What's worse is that Ruby's foundation is so shaky that "monkey patching" has become commonplace
Shaky? What?
This just tells me you have barely coded any Ruby. Open classes are what make Ruby my favorite language. In my experience monkey patching has never, ever been a problem; the elegance that monkey patching enables is well worth the "risk".
http://peepcode.com/blog/2010/what-pythonistas-think-of-ruby
RSpec is a great example of why monkey patching is good. In fact, a lot of libraries these days do it the way RSpec does: instead of adding methods to their own library, they provide modules that patch the existing classes you already use (see the sketch below). The result is much simpler to read and understand, and there is a lot less learning required.
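Roughly, the pattern looks like this (a toy sketch; the Expectations module and should_equal method are made up for illustration and are not RSpec's actual API):
[code]
# Hypothetical sketch of the library-ships-a-module pattern:
# the behavior lives in a module, which is then mixed into a
# core class the user already works with every day.
module Expectations
  def should_equal(expected)
    raise "expected #{expected.inspect}, got #{inspect}" unless self == expected
    true
  end
end

class Object
  include Expectations  # opt-in patch: every object can now be asserted on
end

(2 + 2).should_equal 4             # => true
"ruby".upcase.should_equal "RUBY"  # => true
[/code]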
Open classes are what make extremely easy DSL creation possible, which in turn promotes elegant syntax and tiny code bases.
http://www.sinatrarb.com/ https://github.com/btakita/rr http://sunspot.github.com/ http://robots.thoughtbot.com/post/7176629856/factory-girls-new-look
The other reason monkey patching is good is that whenever I think of some really cool or necessary feature I want to add to a library, I can simply patch it in; once I've tested it, I might create a pull request on the author's git repository so other people can use it. With other languages I would have to download the whole library, put in my patch, recompile it, and then tell everybody to use that library. Very often you will want to patch the String class: I am always patching in a "slug" and a "human_name" method, and one time I patched in a method so that search strings would work better with Solr.
"hello world".search_string is much more elegant than String.searchString("hello world")
Monkey patching is probably my favorite feature in Ruby.
Furthermore, there are large stretches of Smalltalk which both C++ and Ruby avoid. It would be a stretch to deem it the heir to Smalltalk's legacy.
Like I said, Java is to C++ as Ruby is to Smalltalk. Some people will tell you that Java is nothing like C++; even I would agree with that. However, there is a ton of Smalltalk influence in Ruby's design.
"I always thought that Smalltalk would beat Java, I just didn't know that it would be called 'Ruby' when it did" - Kent Beck
On June 01 2012 03:38 Millitron wrote: Where do you think Python fits in with all this?
Python is a functional/imperative language with its own "pythonic" way of doing things; it doesn't really fit in here at all. Python's OO is very light.
On June 01 2012 04:25 ForgottenOne wrote: I read in the meantime about Alan Kay's definition of OO. Can you explain which of the six points he uses to describe an OO language don't apply to Java?
Don't get me wrong, Java is still very much OO. It's just a different style of OO, derived from the C++ way of doing things rather than the Smalltalk way. Look up what Kay meant when he said C++ was not what he had in mind; there's a string of emails where he explains what he means by that and gives his definitions of what he believes OO is. You might have read that already, not sure, but I think it should help answer your question somewhat.
Here's a short discussion by some random people on the topic that should provide some food for thought in the meantime. http://news.ycombinator.com/item?id=2335754
I will get back to you soon, when I have time to actually sit down and think, when I'm not at work and not going out.
|
On May 30 2012 08:15 RoyGBiv_13 wrote: On May 30 2012 07:38 Millitron wrote: Speaking of heated debate topics, why is GO TO considered such a bad feature for a language to have?
I mean, I get that if you are completely careless with it, you can access memory that hasn't been allocated for your program, but it's extremely hard to screw up that bad. You almost have to force it to happen.
The thing is though, GO TO provides an effective way of exiting complex control structures. It is a gigantic pain to try to get a large group of nested for-loops to terminate early without GO TO, but with GO TO, it can be done with the addition of a single if-statement.
I was making a Turing Machine simulator for a class I was taking. The simulator was written in Java. The state of the machine ended up being stored in a 4D array, meaning it had to be accessed by a nesting of for-loops. It was such a pain to break out of those loops without GO TO. Let's talk performance. I'll start with GOTO because it's an easy target: back in ye olden BASIC times, GOTOs were the go-to (heh) method for program flow. This was alright, and did not typically affect the speed of computation or the code size in any significant manner, because the code assembled almost directly into its literal machine-code translation, with no optimizations. Then came Dennis Ritchie and C. A compiled language will look almost nothing like itself when turned into machine code. Most notable is the inclusion of the stack and functions. Function calls replaced the LABEL A -> GOTO B -> GOTO A spaghetti by pushing the return address onto the stack, branching, and popping the stack upon returning from a function. This is incredibly faster, as a processor optimized for branch prediction would almost always have the next instruction loaded into the cache, compared to the GOTO method.
Whatchu talkin bout. A GOTO is just a JMP (unconditional branch), and a function CALL is logically a PUSH EIP + JMP. A RET is just a POP EIP (in cdecl anyway). Any type of branch prediction would apply equally to all of them.
I like to use GOTO for things like cleanup code at the end of a function so that I can bail out mid-function in error cases and only write the free()s and whatnot in one place. Yeah I could write the cleanup code as a function but that is harder to follow than the GOTO. Also if I add another malloc for something in the function, I have to go change the cleanup function parameters AND every single place where the cleanup function is called, so I am back to where I started in terms of having to remember to update the code in multiple places.
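As a side note on Millitron's nested-loop problem, in the thread's other main language: Ruby's catch/throw gives a structured non-local exit with no goto at all. A minimal sketch (the loop bounds and target value are illustrative, not from the thread):
[code]
# Sketch: catch/throw as a structured early exit from nested loops.
found = catch(:done) do
  (0...4).each do |a|
    (0...4).each do |b|
      (0...4).each do |c|
        # bail out of all three loops at once, returning the triple
        throw :done, [a, b, c] if a + b + c == 6
      end
    end
  end
  nil  # value of the catch block if nothing was thrown
end

p found  # => [0, 3, 3]
[/code]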
|
On June 01 2012 12:18 sluggaslamoo wrote: ...snip... I think you mean interfaces are the closest Java has, as it doesn't implement C++ style multiple inheritance (C# actually has duck typing and mixins, so I'm not sure where you're going there...). It sounds like your big complaint here has nothing to do with OOP, but rather with dynamic vs static languages. Java style interfaces allow polymorphism without inheritance (just like Ruby/Python's dynamic typing), thus alleviating the same problem. The only difference is that you have to declare them at compile time instead of runtime (though you could argue that generics provide a runtime level of support, but that's really neither here nor there). It sounds like your real complaint is that you don't like static languages.
I will side step the conversation for a second to point out that the JIT is perhaps the shrewdest thing Sun ever did for Java, and one of the things I strongly hope the Ruby community adopts; it would dramatically improve performance at runtime.
Not really the truth of the matter. I don't see how it sits at the intersection of Perl and C++; if it does, then the same could be said of every general-purpose dynamic language. It has some Perl influence in its syntax, but that's about it.
Ruby's ideas of "many ways to do the same thing" and "convention over configuration" both put it strongly as a successor to Perl (as opposed to Python which rejects both ideas). Matz even said: "Ruby inherited the Perl philosophy of having more than one way to do the same thing. I inherited that philosophy from Larry Wall, who is my hero actually." To suggest that it had only minor influence is less than accurate.
Not at all. It uses mixins for a start, which IMO is a much, much better way of avoiding the cyclic inheritance problems of multiple inheritance than simply disallowing multiple inheritance like Java does. Multiple inheritance is still a necessary feature for code re-use, which is why Java creates so much code bloat in non-trivial projects. Another way of working around this is traits, which Scala uses. Ruby also has ad-hoc inheritance, and much more flexible anonymous classes.
I have done a lot of Java coding and Ruby coding. I even used to be somewhat of a Java elitist before I expanded my horizons to lots of other languages. I have to tell you that the process for OO is so different in Ruby and Java that I have a completely different mindset when coding in each language. Between C++/Java/C#, however, I am using the same cookie-cutter OO mindset.
Again, Java doesn't have multiple inheritance, so I'm not sure where you're getting this from.
Shaky? What? This just tells me you have barely coded any Ruby. ...snip... Monkey patching is probably my favorite feature in Ruby. I have also heard Gary Bernhardt's talk on Python vs Ruby. For those reading along, you can watch it here: http://vimeo.com/9471538. But that would be appropriate for a discussion of Python vs Ruby, and less so for a discussion of Java's similarities to Smalltalk vs Ruby's similarities to Smalltalk. Listening to it again, he doesn't laud Ruby for this approach; he just suggests that what it creates can be ideal. He even criticizes monkey patching earlier in that same talk, when referring to how ActiveRecord changes the Symbol type to support the "pretzel" operator, and how that caused portability and legacy coding problems when upgrading from Ruby 1.8.6 to 1.8.7. What if the implementation had been different in some cases? What if it had broken the code of everyone upgrading who used this operator? You could cause bifurcation in your community (*cough* like Python 3 *cough*).
This is a bad thing. If you're monkey patching core types in your language, you have to ask yourself: why didn't the language support that in the first place? And what's more: are you really still writing the same language? Don't get me wrong, I love ActiveRecord (and all of Rails) and RSpec, but how far can they bend the rules before we start to see things break?
Furthermore, there are large stretches of Smalltalk which both C++ and Ruby avoid. It would be a stretch to deem it the heir to Smalltalk's legacy. Like I said, Java is to C++ as Ruby is to Smalltalk. Some people will tell you that Java is nothing like C++; even I would agree with that. However, there is a ton of Smalltalk influence in Ruby's design. I think what I'm trying to communicate is not that Ruby doesn't borrow from Smalltalk or that Java doesn't borrow from C++, but that all three borrow a ton from Smalltalk (as does C#). I think to suggest that any one of them borrows more or less is unfair (and if Kent Beck wants to be unfair, that's his prerogative).
|
On June 01 2012 14:00 tzenes wrote:
I think you mean interfaces are the closest Java has, as it doesn't implement C++ style multiple inheritance (C# actually has duck typing and mixins, so I'm not sure where you're going there...). It sounds like your big complaint here has nothing to do with OOP, but rather with dynamic vs static languages. Java style interfaces allow polymorphism without inheritance (just like Ruby/Python's dynamic typing), thus alleviating the same problem. The only difference is that you have to declare them at compile time instead of runtime (though you could argue that generics provide a runtime level of support, but that's really neither here nor there). It sounds like your real complaint is that you don't like static languages.
It sounds like you need to brush up on your OO definitions. You are trying to correct me because you don't [fully] understand a lot of the words you are using, which is why you are misunderstanding what I am saying.
Like I said the first time, polymorphism is the closest Java has. C# does not have duck typing or mixins (version 4 has a really shitty version of duck typing and traits, which isn't really duck typing, and the implementation of traits is downright ugly; my analogy of polishing a turd comes to mind). Do you know what binding means? Polymorphism is the only kind of late binding in Java, but it's still not "extreme" late binding.
Java style interfaces allow polymorphism without inheritance (just like Ruby/Python's dynamic typing), thus alleviating the same problem.
I just covered this so explicitly that I don't see how you could have missed it.
Again, removing multiple implementation inheritance is the worst thing Java could have done to fix cyclic inheritance problems. OO is supposed to promote code re-use, and this "fix" just flies in the face of that principle. Your definition of inheritance is wrong: Java style interfaces must be inherited to allow classes to bind to a different type. There are two kinds of inheritance in Java, implementation inheritance and interface inheritance. Both are inherited.
It sounds like your real complaint is that you don't like static languages
Isn't that what I partly meant when I talked about foundation?
I will side step the conversation for a second to point out that the JIT is perhaps the shrewdest thing Sun ever did for Java, and one of the things I strongly hope the Ruby community adopts; it would dramatically improve performance at runtime.
JRuby already does this, and you can even inline Java. It's still not that much faster; MacRuby on the other hand ...
I have also heard Gary Bernhardt's talk on Python vs Ruby. ...snip... Don't get me wrong, I love ActiveRecord (and all of Rails) and RSpec, but how far can they bend the rules before we start to see things break?
That article was posted in the context of my argument. There is a hint of anti-Ruby in there as well, which should mean it is on the more objective end of the spectrum from my standpoint. I was implying that RSpec could not have been made without monkey patching, and it could not be ported to a language unless that language allows it. RSpec is THE most popular testing library within Ruby; it is so popular, even, that it is often used for non-Ruby apps along with Cucumber.
I have NEVER EVER EVER had an issue with monkey patching, EVER. So those problems don't even exist, even if they have the potential to. The pros HEAVILY outweigh the cons; again, you probably won't get this until you spend a couple of years with Ruby.
I completely understand the potential risks, and I was even unsure at first, but after using Ruby for a long time I have never had a problem with it. Use the language as part of your job for 2 years, and then come back and tell me monkey patching is bad. Practical experience > theory.
but how far can they bend the rules before we start to see things break?
Still trying, and nothing seems to be "broken". In fact, these libraries let me get the job done about 1000x faster.
http://cukes.info/ http://slim-lang.com/
I think I should explain why nothing breaks. Professionals should employ BDD and TDD as part of their SDLC. With 100% test coverage, error rates are extremely low, if not zero. If I monkey patch something, I simply run the tests and check that nothing is broken. That's why things don't "break".
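For example, a spec guarding a patch might look like this (a self-contained sketch using the RSpec 2-era "should" syntax, which is itself a module mixed into Object; the String#slug body repeats the earlier hypothetical patch, not any real library's code):
[code]
require "rspec"

# The hypothetical patch under test, inlined so the spec is self-contained.
class String
  def slug
    downcase.strip.gsub(/[^a-z0-9]+/, "-").gsub(/\A-|-\z/, "")
  end
end

describe String do
  describe "#slug" do
    it "downcases and hyphenates" do
      "Hello, World!".slug.should == "hello-world"
    end

    it "strips leading and trailing separators" do
      "  Solr Search  ".slug.should == "solr-search"
    end
  end
end
[/code]
Run it with the rspec command; if a later patch breaks #slug, these examples fail immediately.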
I think what I'm trying to communicate is not that Ruby doesn't borrow from Smalltalk or that Java doesn't borrow from C++, but that all three borrow a ton from Smalltalk (as does C#). I think to suggest that any one of them borrows more or less is unfair (and if Kent Beck wants to be unfair, that's his prerogative).
Something is unfair when it's true? What?
Ruby borrows more from Smalltalk than C++ does. It's so obvious I dunno why we are arguing about it o_O.
|
Ruby's ideas of "many ways to do the same thing" and "convention over configuration" both put it strongly as a successor to Perl (as opposed to Python which rejects both ideas). Matz even said: "Ruby inherited the Perl philosophy of having more than one way to do the same thing. I inherited that philosophy from Larry Wall, who is my hero actually." To suggest that it had only minor influence is less than accurate.
Yeah, but if you actually use Ruby, you realise it's not a lot like Perl. "Convention over configuration" and "many ways to do the same thing" are so general they could be applied to almost any language.
|
@PachaL about using a webserver for Excel files and an iPad app: I think the easiest way to start web programming is with PHP. There are also Excel libraries for PHP. Some other alternatives are Ruby on Rails and C#/ASP.NET MVC.
I have to warn you, though, that it is a lot of extra work if you have never done any web programming before. You need a webserver (probably paid), you need to learn a web programming language (PHP, for example), and you need to handle communication between the app and the website, security, etc.
If your comfort zone is more C/C++/Objective-C, it may be easier to look in the following direction: it seems it is possible to use the OpenOffice API with Objective-C (http://stackoverflow.com/questions/3587004/is-there-a-library-or-example-for-creating-excel-xlsx-files). Googling for "objective c excel" finds some possibilities, but I can't comment on which is the best option.
Also, you could first finish v1 of your app with CSV files as output; then you have a finished "product" and a good milestone. Then you can look at XLSX output for version 2. By splitting up the work like this, you have clear goals and accomplishments that keep you motivated.
|
On June 01 2012 16:05 sluggaslamoo wrote:
It sounds like you need to brush up on your OO definitions. You are trying to correct me because you don't [fully] understand a lot of the words you are using, which is why you are misunderstanding what I am saying.
Like I said the first time, polymorphism is the closest Java has. C# does not have duck typing or mixins (version 4 has a really shitty version of duck typing and traits, which isn't really duck typing, and the implementation of traits is downright ugly; my analogy of polishing a turd comes to mind). Do you know what binding means? Polymorphism is the only kind of late binding in Java, but it's still not "extreme" late binding.
I think you're being a little offensive now, though I did enjoy where you said C# doesn't have duck typing and then contradicted yourself. C# has both duck typing and mixins (go look at re-mix; it's part of the re-motion library). I'm sorry you find having to add the keyword "dynamic" so ugly (though this is a static language, which means you do have to be explicit, but you hate static languages, so there is that...). However, strangest of all is your constant assertion that Java is so strongly linked to C++ in the context of mixins. Of all the languages we've discussed, Java is the only one not to support multiple inheritance; Ruby and C++ implement it natively (and C# via libraries).
Java style interfaces allow polymorphism without inheritance (just like Ruby/Python's dynamic typing), thus alleviating the same problem. I just covered this so explicitly that I don't see how you could have missed it. Again, removing multiple implementation inheritance is the worst thing Java could have done to fix cyclic inheritance problems. OO is supposed to promote code re-use, and this "fix" just flies in the face of that principle. Your definition of inheritance is wrong: Java style interfaces must be inherited to allow classes to bind to a different type. There are two kinds of inheritance in Java, implementation inheritance and interface inheritance. Both are inherited.
Interface inheritance is not classical inheritance; it does not follow the C++ conventions. Instead, it's a description of a common set of methods or properties that different objects implement. Sound familiar? It should, because it's very close to the definition of duck typing (though it is not dynamic/late like duck typing is). They both serve the same purpose: to allow us to call the same method across multiple objects even if they produce differing results. The real difference between the two approaches is that in Java the programmer has to be explicit and in Ruby he does not (in C# the programmer has to say "this might happen later, I don't know, I'm going to go back to coding").
Isn't that what I partly meant when I talked about foundation? If you want to have a discussion of static vs dynamic languages, I'm all for it, but that is a very different discussion. If we were talking about that, I'd discuss the differences between static and strong typing, and talk about languages like Haskell (I don't find Java's typing to be strong enough). But that's a very different discussion than their relationships to Smalltalk.
I will side step the conversation for a second to point out that the JIT is perhaps the shrewdest thing Sun ever did for Java, and one of the things I strongly hope the Ruby community adopts; it would dramatically improve performance at runtime. JRuby already does this, and you can even inline Java. It's still not that much faster; MacRuby on the other hand ... also has a JIT? I'm not sure where you were going with that sentence. JRuby (and MacRuby) have just-in-time compilation for Java and Objective-C (respectively), not for Ruby. Much like just-in-time compilation for Assembly wouldn't provide much benefit to C, a JIT for something that isn't Ruby isn't going to provide as much benefit to Ruby as a Ruby JIT would (though JRuby did outperform the 1.8 interpreter, until YARV provided a great leap forward).
I have also heard Gary Bernhardt's talk on Python vs Ruby. ...snip... I have NEVER EVER EVER had an issue with monkey patching, EVER. So those problems don't even exist, even if they have the potential to. The pros HEAVILY outweigh the cons; again, you probably won't get this until you spend a couple of years with Ruby. Here is what I read: "I have never had an issue, so those problems don't even exist." There are so very many ways to prove such a statement wrong that I'll leave them as an exercise for the reader.
I completely understand the potential risks, and I was even unsure at first, but after using Ruby for a long time I have never had a problem with it. Use the language as part of your job for 2 years, and then come back and tell me monkey patching is bad. Practical experience > theory.
An ad hominem attack? I suppose I shouldn't be surprised. If it helps you out, I've been writing Ruby code since '07 (I even had the joy of porting apps from Rails 1 to Rails 2; that was a fun nightmare...). The truth is monkey patching has bitten me a number of times (though not as many times in Ruby as in JavaScript). I'm happy for you that you've never run into this problem (I don't know the size of the code bases you've worked with), but I've definitely run into it. I don't think "method_missing" is my least favorite part of Ruby (and Groovy), but it has to be top ten. Gary Bernhardt has another great talk (in a more Ruby-positive light) where he shows that Ruby can implement bare words via method_missing, and everyone has a good laugh... except I once had to work with code that did that. It's always fun when a typo in your code doesn't throw an error but rather breaks things 1000 function calls later, and you have to slowly walk through every line of code to find it (I thought I had rid myself of such errors when I stopped writing C).
None of that means that RSpec is bad, or that what it gains from monkey patching is bad. Monkey patching itself isn't inherently good or bad (like mucking with Perl's symbol tables...), but what it does do is expose (and usually fix) a flaw in the underlying language. If Ruby were a better language we wouldn't need to monkey patch. It's an admission that the language is imperfect (as all languages are). Imagine we had another operator (let's say the - operator) which changed a function from prefix notation to infix notation: we'd define should(a, b) and then call a -should b, and all we've done is swap the - for a . and get all the benefits of RSpec without monkey patching core types (well, not all of them; there is some fun metaprogramming that RSpec does, but that doesn't require monkey patching).
Still trying, and nothing seems to be "broken". ...snip... If I monkey patch something, I simply run the tests and check that nothing is broken. That's why things don't "break". "We'll use BDD and TDD in our Agile model and nothing will ever break." How many managers have I heard that from? It's a lie. All software breaks. I challenge you to find any major software company which hasn't had an outage over the last year. Testing isn't a magic bullet. With good unit tests and solid integration tests you can drive down bugs and reduce error rates, but they won't be 0 or even close.
|
On June 02 2012 00:09 tzenes wrote:
What do you expect? You gotta stop trying to explain things you don't even understand; that's what makes me angry. You are trying to find holes in my argument, but you sound silly trying to do it, and trying to sound smart by making contrasts that don't even make sense. It would make me a lot happier if you didn't keep putting words in my mouth, requiring me to say the same thing over and over just to make it clear.
Interface inheritance is not classical inheritance; it does not follow the C++ conventions. Instead, it's a description of a common set of methods or properties that different objects implement. Sound familiar? It should, because it's very close to the definition of duck typing (though it is not dynamic/late like duck typing is). They both serve the same purpose: to allow us to call the same method across multiple objects even if they produce differing results. The real difference between the two approaches is that in Java the programmer has to be explicit and in Ruby he does not (in C# the programmer has to say "this might happen later, I don't know, I'm going to go back to coding").
This paragraph makes no sense whatsoever. What do interfaces have to do with the definition of duck typing? The definition of duck typing is simply this: "if it quacks like a duck, it is a duck." Interfaces serve a completely different role; a much better name for them is Protocol, which is what Objective-C calls them, because that's what they are: a protocol. Duck typing and interfaces do not serve the same purpose. The paradigm is completely different.
In fact the one thing I miss in Ruby is "interfaces". I would love the ability to have both protocols and duck-typing.
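A minimal sketch of that definition (the class names are illustrative): neither class declares a shared type or interface, yet both can be passed anywhere something that quacks is expected.
[code]
# Hypothetical sketch: duck typing needs no declared relationship.
class Duck
  def quack
    "Quack!"
  end
end

class RobotParrot
  def quack
    "*synthesized quack*"
  end
end

# Works for anything that responds to #quack, resolved at call time.
def make_it_quack(thing)
  thing.quack
end

puts make_it_quack(Duck.new)         # => Quack!
puts make_it_quack(RobotParrot.new)  # => *synthesized quack*
[/code]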
C# does not have mixins or duck typing. It has traits, and some really uglified form of duck typing, but I don't know what to call it; I wouldn't call it duck typing, though, more like duck-type-casting. Just because someone tells you 2+2=5 doesn't mean it's true, even if it's Microsoft.
If Ruby were a better language we wouldn't need to monkey patch
None of that means that RSpec is bad, or that what it gains from monkey patching is bad. Monkey patching itself isn't inherently good or bad (like mucking with Perl's symbol tables...), but what it does do is expose (and usually fix) a flaw in the underlying language.
All along you have been implying that monkey patching is bad.
What's worse is that Ruby's foundation is so shaky that "monkey patching" has become commonplace.
If monkey patching is bad, then RSpec must be bad, but you love RSpec. Double standards: for some reason, when a library does it right it's great, but otherwise the feature is bad. Makes no sense. Go use only Ruby libraries that don't use open classes, so basically none of the good ones. Don't use RSpec, don't use Cucumber, don't use RR, don't use Haml, don't use ActiveRecord. What the hell do you have left?
Ruby IS open classes. That's what makes Ruby, Ruby. If you don't like open classes, you use a different language like Scala or Python. So far Ruby is pulling way ahead of both languages, if not all languages, for a reason. Open classes and method_missing have allowed for the creation of the most useful dev libraries, ones that devs in other languages just wish they could have, but can't. I wouldn't be using Ruby if not for the wide array of DSLs that make my life a hell of a lot easier.
"Well use BDD and TDD in our Agile mode and nothing will ever break." How many managers have I heard that from? It's a lie. All software breaks. I challenge you to find any major software company which hasn't had an outage over the last year. Testing isn't a magic bullet. With good unit tests and solid integration tests you can drive down bugs and reduce error rates, but they won't be 0 or even close
Then you aren't doing your TDD/BDD right. One time my pair and I spent a week refactoring a subset of a gigantic website; we kept going until all the tests passed. We only had 87% code coverage, and no bugs. If we had had bugs, we would have been notified immediately (we had millions of users per day).
Gary Bernhardt has another great talk (in a more Ruby-positive light) where he shows that Ruby can implement bare words via method_missing, and everyone has a good laugh... except I once had to work with code that did that. It's always fun when a typo in your code doesn't throw an error but rather breaks things 1000 function calls later, and you have to slowly walk through every line of code to find it (I thought I had rid myself of such errors when I stopped writing C).
Ah, so the moral of this story is: don't hire terrible developers? In Java I can put all my code in one class; that doesn't make classes bad. Bad code can be written in any language. You won't have any problems with open classes or method_missing with a properly tested library. Your app never suddenly crashed because you bundled RSpec, did it?
|
On June 02 2012 00:09 tzenes wrote: If you want to have a discussion of static vs dynamic languages, I'm all for it, but that is a very different discussion. If we were talking about that, I'd discuss the differences between static and strong typing, and talk about languages like Haskell (I don't find Java's typing to be strong enough). But that's a very different discussion than their relationships to Smalltalk.
tbf I just don't really wanna have a discussion at all anymore. This discussion has drifted so far off course from the original statements I wanted to make about Alan Kay, Stroustrup, and OO.
|
I'm trying to learn Java before beginning my studies at a university. I have written quite a lot of things in C, but my experience with classes is limited.
So does anyone have some tips or a good resource for using classes the right way?
|
@discussion above about OO You shouldn't get all worked up over this man....
@ZappaSC Java is bad, you can put all your code in one class so the language sux!
|
Hyrule wrote:
simmer down people
can't we all just agree that cobol is awful and move on?
|
On June 02 2012 01:21 alwinuz wrote: @ZappaSC Java is bad, you can put all your code in one class so the language sux!
Not really helpful, since I won't be choosing Java; it's just chosen for me. Any others?
|
On June 02 2012 02:31 ZappaSC wrote: ...snip... Any others? I think he was poking fun at the heated discussion above about the merits of Java as opposed to Smalltalk.
Honestly, you'd be hard-pressed to find a bad language. They all have their advantages and disadvantages.*
*Except VBA. VBA is bad, and Microsoft should feel bad for inventing it.
|