The Big Programming Thread - Page 997
Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
Deleted User 3420
24492 Posts
I think it has maybe been overemphasized a little, but it is also true that a solid understanding of asymptotic complexity and growth is very important if you're doing work with huge volumes of information.
Blitzkrieg0
United States, 13132 Posts
On February 16 2019 03:13 travis wrote:
well remember guys, a CS degree program is based around theory, it's not necessarily preparing you to be primarily a software engineer. (so, what solidbebe said i guess)

But why would that be the case when the majority of the people who get them end up going to work as software engineers? I feel like most entry-level programming jobs are looking for a bachelor's degree as well. I know ours has it listed there.

On February 16 2019 03:13 travis wrote:
I think it has maybe been overemphasized a little, but it is also true that a solid understanding of asymptotic complexity and growth is very important if you're doing work with huge volumes of information.

I'd say the wrong concepts were emphasized. Performance is more about benchmarking than about looking at code and saying "this is O(n)". You obviously need both, but I don't think we did any benchmarking when I was in school, let alone used tools to really analyze which methods you are spending the most time in.
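To give an idea of what I mean by tooling: on the JVM you'd reach for something like JMH (the Java Microbenchmark Harness). This is only a rough sketch, the class name and the sorting workload are made up, but the shape is real:

[code]
import java.util.Arrays;
import java.util.Random;
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

// Hypothetical benchmark class: JMH handles warmup, forking and averaging,
// so you get measured numbers instead of guessing from the code's big O.
@State(Scope.Thread)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MICROSECONDS)
public class SortBenchmark {

    private int[] data;

    @Setup
    public void setup() {
        data = new Random(42).ints(10_000).toArray(); // fixed seed so runs are comparable
    }

    @Benchmark
    public int[] jdkSort() {
        int[] copy = data.clone();
        Arrays.sort(copy);   // the code under test
        return copy;         // returning the result stops the JIT from optimizing the work away
    }
}
[/code]

You run it through the JMH runner and it reports average time per call; a profiler (VisualVM, async-profiler, etc.) is the next step for finding which methods actually eat the time.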
Deleted User 3420
24492 Posts
On February 16 2019 03:51 Blitzkrieg0 wrote:
But why would that be the case when the majority of the people who get them end up going to work as software engineers? I feel like most entry-level programming jobs are looking for a bachelor's degree as well. I know ours has it listed there.

So that if you do want to enter a field where it's expected knowledge, then you have it. That said, I don't think you are completely wrong. But I also don't think that big O is THAT over-emphasized. We study a lot of crap... a lot of it is unlikely to be useful.

On February 16 2019 03:51 Blitzkrieg0 wrote:
I'd say the wrong concepts were emphasized. Performance is more about benchmarking than about looking at code and saying "this is O(n)". You obviously need both, but I don't think we did any benchmarking when I was in school, let alone used tools to really analyze which methods you are spending the most time in.

Well, you just said that you may need both! One is particularly useful before you've even started writing the code. As for benchmarking, isn't it kind of language dependent (like, reliant on libraries and testing suites)? It is something interesting that you bring up, though; benchmarking is something that should be added to the curriculum in some form...
LightTemplar
Ireland, 481 Posts
Whether or not a university course overemphasizes complexity kind of depends on the course. Programming 101 should probably brush on big O to get the point across that doing something n^2 times isn't a great idea if you can reasonably do it in n, if only because humans are bad at thinking in scale. However, your algorithms course should probably start considering the issue in finer detail, because the point there is to start considering the application of different structures to different tasks. That should really be a different audience from the 101 crowd, though.

WRT Project Euler, I found it overly reliant on figuring out the mathematical problem it was trying to represent. While I see value in that, I feel it isn't quite a typical programming problem set. It's good for learning maths concepts and developing algorithms to mirror them, though.
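As a toy example of that 101 point (Java, made-up task): checking an array for duplicates with a nested loop versus a single pass with a set.

[code]
import java.util.HashSet;
import java.util.Set;

public class Duplicates {

    // O(n^2): compare every pair.
    static boolean hasDuplicateQuadratic(int[] a) {
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) {
                if (a[i] == a[j]) return true;
            }
        }
        return false;
    }

    // O(n) expected: one pass over the array with a hash set.
    static boolean hasDuplicateLinear(int[] a) {
        Set<Integer> seen = new HashSet<>();
        for (int x : a) {
            if (!seen.add(x)) return true; // add() returns false if x was already present
        }
        return false;
    }

    public static void main(String[] args) {
        int[] sample = {3, 1, 4, 1, 5};
        System.out.println(hasDuplicateQuadratic(sample)); // true
        System.out.println(hasDuplicateLinear(sample));    // true
    }
}
[/code]

Both are fine at n = 10; at n = 10 million only one of them comes back before lunch.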
solidbebe
Netherlands, 4921 Posts
On February 16 2019 03:51 Blitzkrieg0 wrote:
But why would that be the case when the majority of the people who get them end up going to work as software engineers? I feel like most entry-level programming jobs are looking for a bachelor's degree as well. I know ours has it listed there.

I guess that's an issue with your education system. In the Netherlands, after high school we have three different levels of tertiary education: universities (which offer bachelor's, master's and PhD degrees), universities of applied sciences (actually called 'high schools', but they are tertiary education; they offer programmes concerned mostly with practical knowledge and little academic theory), and regional education centers (these offer vocational training programmes). The corresponding degree levels are WO (scientific education, i.e. education to become a scientist), HBO (a higher-level degree, but mostly focused on practical knowledge for a job), and MBO (vocational training).

So if you want a degree that prepares you for a job, you can easily get something at MBO or HBO level. Universities should not primarily concern themselves with training people for the job market. Their purpose is to do research and to train people in how to do research.

If you want to know more: en.m.wikipedia.org/wiki/Education_in_the_Netherlands
SC-Shield
Bulgaria, 832 Posts
On February 16 2019 04:07 travis wrote:
So that if you do want to enter a field where it's expected knowledge, then you have it. That said, I don't think you are completely wrong. But I also don't think that big O is THAT over-emphasized. We study a lot of crap... a lot of it is unlikely to be useful.

Well, you just said that you may need both! One is particularly useful before you've even started writing the code. As for benchmarking, isn't it kind of language dependent (like, reliant on libraries and testing suites)? It is something interesting that you bring up, though; benchmarking is something that should be added to the curriculum in some form...

Big O notation isn't overemphasised, but it isn't taught properly, or at least I wasn't taught it properly at university. For example, when lecturers explain Big O notation, they should also take the processor's optimisations into account. If you work with a vector (ArrayList in Java, List in C#), it often outperforms a linked list even when the linked list is supposed to be the winner, and as far as I'm aware this happens because of the processor's cache: the elements are stored in sequence rather than scattered all over memory like a linked list's nodes. It depends on how many elements you have, but I think up to a million or so it's fine to use a vector even when a linked list is theoretically better.
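Here's a rough illustration of what I mean (not a proper benchmark, and the effect is weaker in Java than with a C++ vector<int> because the Integers are boxed, but the layout difference still shows):

[code]
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

public class TraversalDemo {

    // Both traversals are O(n); the difference is memory layout, not big O.
    static long sum(List<Integer> list) {
        long total = 0;
        for (int x : list) total += x;
        return total;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        List<Integer> array = new ArrayList<>();
        List<Integer> linked = new LinkedList<>();
        for (int i = 0; i < n; i++) { array.add(i); linked.add(i); }

        long t0 = System.nanoTime();
        long s1 = sum(array);
        long t1 = System.nanoTime();
        long s2 = sum(linked);
        long t2 = System.nanoTime();

        System.out.printf("ArrayList:  sum=%d in %.1f ms%n", s1, (t1 - t0) / 1e6);
        System.out.printf("LinkedList: sum=%d in %.1f ms%n", s2, (t2 - t1) / 1e6);
    }
}
[/code]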
tofucake
Hyrule, 19159 Posts
That's not what Big O is for. It's strictly for determining the speed of an algorithm implementation without regard to whether it's done with a slide rule and a pencil or on a supercomputer.
SC-Shield
Bulgaria, 832 Posts
On February 17 2019 01:45 tofucake wrote:
That's not what Big O is for. It's strictly for determining the speed of an algorithm implementation without regard to whether it's done with a slide rule and a pencil or on a supercomputer.

So you emphasise the algorithm's speed. My point was that the algorithm's theoretical speed isn't always the most reliable indicator once you take the CPU's optimisations into account. That should be noted in lectures so people aren't misled into thinking Big O is the only way to say whether something is faster.
Frolossus
United States, 4779 Posts
On February 17 2019 02:14 SC-Shield wrote:
So you emphasise the algorithm's speed. My point was that the algorithm's theoretical speed isn't always the most reliable indicator once you take the CPU's optimisations into account. That should be noted in lectures so people aren't misled into thinking Big O is the only way to say whether something is faster.

it is not taught wrong. the entire point is to compare relative speeds of algorithms to each other regardless of underlying hardware. the idea is that when operating on large enough data sets, hardware becomes less impactful than the algorithm. O(n) is always faster than O(n^2)
Simberto
Germany, 11642 Posts
On February 17 2019 03:14 Frolossus wrote:
O(n) is always faster than O(n^2)

For sufficiently large n. That is an important distinction. The difference in view here is whether you see it from a pure mathematical point of view or from a practical implementation point of view.
tofucake
Hyrule, 19159 Posts
Big O is not for practical application, it's for development and selection of algorithms. Once the math is done and an algorithm selected, implementation is done, which is where hardware comes in. Big O is entirely a theoretical tool, and that is why it ignores hardware.
solidbebe
Netherlands, 4921 Posts
On February 17 2019 03:14 Frolossus wrote:
O(n) is always faster than O(n^2)

Not necessarily. Let's say algorithm A is O(n) and algorithm B is O(n^2). The actual definition, in plain words, is: there is a value x such that for all input sizes larger than x, B has a longer runtime than A. However, it's perfectly possible that there is a range of inputs for which A takes longer. If algorithm A's runtime approximation is 200000n and that of algorithm B is 2n^2, then A will be slower than B for a big range of small inputs.
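To put numbers on that (same made-up cost functions), the crossover sits at n = 100000:

[code]
public class Crossover {
    public static void main(String[] args) {
        for (long n : new long[]{10, 1_000, 100_000, 1_000_000}) {
            long costA = 200_000 * n; // O(n) algorithm with a huge constant factor
            long costB = 2 * n * n;   // O(n^2) algorithm with a small constant factor
            String winner = costA < costB ? "A" : (costA > costB ? "B" : "tie");
            System.out.printf("n=%,d: A=%,d B=%,d -> %s%n", n, costA, costB, winner);
        }
    }
}
[/code]

Below 100000 elements the "worse" algorithm B wins every time; above it, A pulls away and never looks back.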
Acrofales
Spain, 18132 Posts
On February 17 2019 05:47 tofucake wrote:
Big O is not for practical application, it's for development and selection of algorithms. Once the math is done and an algorithm selected, implementation is done, which is where hardware comes in. Big O is entirely a theoretical tool, and that is why it ignores hardware.

I know the conversation has moved on, I just want to point out that the original question specifically mentioned the practicality of applying algorithms with different big O complexity. I don't think theoretical complexity analysis is useless, just that the original question was exceptionally badly phrased.
Manit0u
Poland, 17450 Posts
https://www.theregister.co.uk/2019/02/12/current_gps_epoch_ends/

I guess some businesses are really panicking now...

@Silvanel: is the Benz GPS ready for this?
spinesheath
Germany, 8679 Posts
That's... I can't really find an excuse for that. I can see how people thought that "we will never need more IP addresses than that", but not "humanity will be extinct by 2019".
waffelz
Germany, 711 Posts
On February 19 2019 01:45 spinesheath wrote:
That's... I can't really find an excuse for that. I can see how people thought that "we will never need more IP addresses than that", but not "humanity will be extinct by 2019".

Probably "this will surely get replaced by something else, and they will take care of larger dates before it matters". #someoneElseWillFixIt
Lmui
Canada, 6216 Posts
On February 19 2019 06:23 waffelz wrote:
Probably "this will surely get replaced by something else, and they will take care of larger dates before it matters". #someoneElseWillFixIt

Well, GPS apparently started in 1978, and my assumption is that back then bits were at a premium. The Intel 8086 had just been released at that time and was 16-bit, though, so I don't know what other justification there really is. To be fair, the only device older than 5 years that I use regularly and that has a GPS is my car. I'd hope everything newer is updated enough that it doesn't run into the problem.
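From what I've read, the issue is that the legacy GPS navigation message stores the week number in a 10-bit field, so it wraps every 1024 weeks (about 19.6 years), and the April 2019 rollover is the second one. The usual firmware workaround is to assume the real date can't be earlier than some known point, roughly like this sketch (Java, names made up):

[code]
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class GpsWeekRollover {

    // GPS week 0 started on this date.
    private static final LocalDate GPS_EPOCH = LocalDate.of(1980, 1, 6);

    // Resolve a broadcast 10-bit week number (0..1023) to a full week count,
    // assuming the true date is not earlier than 'notBefore' (e.g. a firmware build date).
    static int resolveWeek(int truncatedWeek, LocalDate notBefore) {
        int referenceWeek = (int) (ChronoUnit.DAYS.between(GPS_EPOCH, notBefore) / 7);
        int fullWeek = truncatedWeek;
        while (fullWeek < referenceWeek) {
            fullWeek += 1024; // jump past each rollover until we land in the right era
        }
        return fullWeek;
    }

    public static void main(String[] args) {
        // A week number of 1 received after the April 2019 rollover really means week 2049.
        int full = resolveWeek(1, LocalDate.of(2019, 1, 1));
        System.out.println("full week " + full + ", starting " + GPS_EPOCH.plusWeeks(full));
    }
}
[/code]

A unit whose "not before" date is old enough and never got updated pins the window to the wrong era and falls back 1024 weeks, which is why unpatched receivers can suddenly report dates from 1999.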
Silvanel
Poland, 4733 Posts
On February 18 2019 20:26 Manit0u wrote:
https://www.theregister.co.uk/2019/02/12/current_gps_epoch_ends/

I guess some businesses are really panicking now...

@Silvanel: is the Benz GPS ready for this?

It's third-party SW (not ours or Daimler's). I guess we will need to update it if it doesn't already have a fix in it. I work for the SWDL team, so we do that all the time anyway. The real question is what happens with cars that are already in clients' hands: did the supplier think about this in advance or not? That can be a problem for pre-NTG6 car generations, since only NTG6+ supports remote update.