|
Thread Rules 1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution. 2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20) 3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible. 4. Use [code] tags to format code blocks. |
On February 14 2019 04:29 enigmaticcam wrote: Anyone here a regular on Project Euler? Would like to add as friend so I can occasionally ask for advice.
Nope but it looks really fun! I don't have the most free time in the world but I am gonna try to squeeze in a few of the problems for fun.
May not be on your level but I'd be glad to see if you want to bounce some things off me.
Congratulations, the answer you gave to problem 1 is correct.
You are the 824679th person to have solved this problem.
This problem had a difficulty rating of 5%. The highest difficulty rating you had previously solved was 0%. This is a new record. Well done!
like a bawss
prroooobably gonna skip ahead a few difficulties now lol
edit2: oh shit i skipped to a 20% difficulty problem. thought I had it and then I realized that I was supposed to calculate from 1 to 4^i, not 4*i. not completely easy mode it seems
edit3: shiiiiiiit i don't actually know how to do this. well my computer can solve it but it's gonna take probably like a week of running, LOL. clearly some sort of math trick involved here... hmmm coming up with some ideas now if anyone wants to know which one i am doing:
+ Show Spoiler +
For every positive number n we define the function streak(n)=k as the smallest positive integer k such that n+k is not divisible by k+1. E.g.:
13 is divisible by 1
14 is divisible by 2
15 is divisible by 3
16 is divisible by 4
17 is NOT divisible by 5
So streak(13)=4.
Similarly:
120 is divisible by 1
121 is NOT divisible by 2
So streak(120)=1.
Define P(s,N) to be the number of integers n, 1<n<N, for which streak(n)=s. So P(3,14)=1 and P(6,10^6)=14286.
Find the sum, as i ranges from 1 to 31, of P(i,4^i).
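For reference, the definitions above translate directly into a brute-force sketch (Python here) — fine for sanity-checking the small cases given in the problem, hopeless for the full sum up to 4^31:

```python
def streak(n):
    """Smallest positive k such that n + k is not divisible by k + 1."""
    k = 1
    while (n + k) % (k + 1) == 0:
        k += 1
    return k

def P(s, N):
    """Count integers n with 1 < n < N and streak(n) == s."""
    return sum(1 for n in range(2, N) if streak(n) == s)

print(streak(13), streak(120))  # 4 1
print(P(3, 14))                 # 1
```

The given check values (streak(13)=4, streak(120)=1, P(3,14)=1) all come out right, but P(i, 4^i) at i=31 would mean iterating over roughly 4.6e18 values, which is why a math trick is needed.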
edit4: this is hard but for tonight im obsessed
edit5: ok i figured it out there was a math trick. this site for math people lol
|
Yeah, sometimes those difficulty percentages can be a bit misleading, depending on what you already know. One time I managed to solve a 70% problem in 15 minutes, and then there’s this 15% problem that takes me weeks to solve.
I think I remember seeing that one before. I haven’t solved it yet, but maybe I’ll give it a try tomorrow.
My friend code is 508144_4VIozmVBgq6rAzXryXF9r8hFh3xtkaaw, if anyone wants to add me. Pm or post yours and I’ll add you.
Edit: many times I've been able to solve problems by simply writing a brute-force algorithm with a very low threshold and just looking for patterns. Sometimes a pattern will be really obvious after just a few small-input answers.
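As an illustration of that workflow (using the classic problem 1, not the streak problem): brute-force the first few values, guess a closed form from the pattern, then check the two against each other. A hypothetical sketch:

```python
def brute(N):
    # Dumb O(N) loop: sum of multiples of 3 or 5 below N.
    return sum(k for k in range(N) if k % 3 == 0 or k % 5 == 0)

def closed(N):
    # Pattern guessed from small cases: inclusion-exclusion over
    # triangular-number sums, O(1).
    def tri(m):
        c = (N - 1) // m
        return m * c * (c + 1) // 2
    return tri(3) + tri(5) - tri(15)

# The brute force validates the guessed formula on small inputs...
assert all(brute(N) == closed(N) for N in range(1, 500))
# ...and then the formula handles inputs the loop never could.
print(closed(10**9))
```

The brute force is only there to generate trustworthy small answers; once the closed form matches all of them, you can run it at the real problem size.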
|
mine is 1457502_u7RGmPGr1WXWeLl5xRpcfZsRvgwckVns
|
Dang, nice job. I might pick your brain about that one.
Edit: Nevermind, I got it too! After playing around in Excel I was finally able to find an exploitable pattern.
|
missed a question on a "background knowledge quiz" for one of my classes
doesn't really seem fair... but I want a second opinion before complaining
the question is:
edit: I have to edit the question out so that it doesn't show up if another student does a google search for the question
|
Answer: Big O is a ridiculous concept and no one actually uses it.
|
okay but that's not helpful I need a second opinion on the actual question lol
I clearly think this is a stupid question, or if it's not clear then yes I think this is a dumb question
|
No, the only correct answer is "this question is retarded".
|
im not allowed to select that, i have to select a or b or c or d within the time limit or else i lose points
but since you guys aren't playing, I will just say, I selected C and got it wrong (oops, they must have wanted D)
my justification is that "very large problems" is clearly just an opinion, so I have no choice but to guess what it means. But clearly the greatest jump in complexity is between n^2 and 2^n, and furthermore there are numerous algorithms that require n^2 (or actually a little more) time to complete, such as matrix multiplication, shortest paths, graph edge betweenness, etc, and they have no faster alternatives
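To put rough numbers on that jump (just a quick sketch): at n = 100, n^2 is 10^4 operations, trivial on any hardware, while 2^n is already around 10^30:

```python
import math

# Compare growth of the usual complexity classes at a few sizes.
for n in (10, 20, 50, 100):
    print(f"n={n:>3}  n*log2(n)={n * math.log2(n):>8.0f}  "
          f"n^2={n**2:>6}  2^n={2**n:.3e}")
```

The gap between n log n and n^2 is a constant-factor-ish annoyance at these sizes; the gap between n^2 and 2^n is the difference between feasible and impossible.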
|
Yup. As I said, the correct answer is "this question is retarded".
Btw, if I asked that question in an exam and a student came up to me with your justification, I'd issue a rectification, scrap that question and acknowledge the question was retarded.
|
okay cool, you gave me what I needed to bother emailing the professor
and if they give me shit i'll tell them acrofales on teamliquid.net says im right
|
On February 15 2019 03:31 travis wrote: okay cool, you gave me what I needed to bother emailing the professor
and if they give me shit i'll tell them acrofales on teamliquid.net says im right Well, I wouldn't say O(n^2) is more right than O(n log n). I'd say they are both potential answers, but that the question is horribly underspecified. Moreover, when faced with "very large" data and real-world problems, nobody worries about the big-O complexity of the algorithm other than in a very abstract way. People don't look at it and say: "Oh, that algorithm is O(n^2), it'll never work." They'll just benchmark it. And "practical" is such a stupid metric. In some problems it's completely fine to have your supercomputer crunch numbers for weeks on end, while others need a solution in real time. They obviously require completely different approaches.
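The "just benchmark it" workflow is roughly this (a hypothetical sketch using only the standard library, with made-up example functions):

```python
import timeit

data = list(range(2000))  # worst case for duplicate search: no dupes

def has_dupes_nested(xs):
    # O(n^2): for each element, rescan the rest of the list.
    return any(x in xs[i + 1:] for i, x in enumerate(xs))

def has_dupes_sorted(xs):
    # O(n log n): sort, then compare adjacent elements.
    s = sorted(xs)
    return any(a == b for a, b in zip(s, s[1:]))

# Instead of reasoning about big O, just time both on realistic input.
for fn in (has_dupes_nested, has_dupes_sorted):
    t = timeit.timeit(lambda: fn(data), number=3)
    print(f"{fn.__name__}: {t:.3f}s")
```

On real data the benchmark settles the argument directly, including constant factors and cache effects that big-O analysis deliberately ignores.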
|
On February 15 2019 03:13 Excludos wrote: Answer: Big O is a ridiculous concept and no one actually uses it. I mean... academics use it. It's a really useful concept, if you understand it and know how/when to apply it.
|
On February 15 2019 04:43 solidbebe wrote:Show nested quote +On February 15 2019 03:13 Excludos wrote: Answer: Big O is a ridiculous concept and no one actually uses it. I mean... academics use it. It's a really useful concept, if you understand it and know how/when to apply it.
Acrofales described it above way better than I ever could. It's a theoretical concept which doesn't translate well into real-life use. It's really only useful on paper
|
On February 15 2019 05:46 Excludos wrote:Show nested quote +On February 15 2019 04:43 solidbebe wrote:On February 15 2019 03:13 Excludos wrote: Answer: Big O is a ridiculous concept and no one actually uses it. I mean... academics use it. It's a really useful concept, if you understand it and know how/when to apply it. Acrofales described above way better than I ever could. It's a theoretical concept which doesn't translate well into real life use. It's really useful on paper only Im not really sure what it means if something is useful 'on paper only'. What Acrofales describes looks like a perspective from industry. I can imagine big O being of very limited use in industry as you will rarely be developing new algorithms for anything, rather than just using something established. So it's probably correct to state that nobody actually uses it... in industry. To state that big O is a ridiculous concept however is a ridiculous statement. It is quite an important concept within the field of algorithmics, where people actually use it.
*Edit before someone hits me with a 'academics also benchmark their algorithms', yes indeed. Big O is not the only thing you should use to describe an algorithm's performance. That doesn't mean it isn't useful.
|
On February 15 2019 16:56 solidbebe wrote:Show nested quote +On February 15 2019 05:46 Excludos wrote:On February 15 2019 04:43 solidbebe wrote:On February 15 2019 03:13 Excludos wrote: Answer: Big O is a ridiculous concept and no one actually uses it. I mean... academics use it. It's a really useful concept, if you understand it and know how/when to apply it. Acrofales described above way better than I ever could. It's a theoretical concept which doesn't translate well into real life use. It's really useful on paper only Im not really sure what it means if something is useful 'on paper only'. What Acrofales describes looks like a perspective from industry. I can imagine big O being of very limited use in industry as you will rarely be developing new algorithms for anything, rather than just using something established. So it's probably correct to state that nobody actually uses it... in industry. To state that big O is a ridiculous concept however is a ridiculous statement. It is quite an important concept within the field of algorithmics, where people actually use it. *Edit before someone hits me with a 'academics also benchmark their algorithms', yes indeed. Big O is not the only thing you should use to describe an algorithm's performance. That doesn't mean it isn't useful.
I'd say it is overemphasized as a concept that will be useful. You spend a ton of time learning and talking about it, but I've never seen it outside of school except in interviews. We know that academics teach it and it is used in academia. The question I would pose is: is academia out of touch with industry, emphasizing a concept that most students will never use, knowingly or unknowingly?
|
On February 15 2019 16:56 solidbebe wrote: I can imagine big O being of very limited use in industry as you will rarely be developing new algorithms for anything, rather than just using something established. I've developed three customer handicapping algorithms. The algorithm assigns a value to each consumer in marketing lists that an org wishes to "rent" for one-time usage. This helps the org decide whether or not to "rent" a marketing list. I've never used Big O. Big O is theoretical... for theorists. It's a layer of abstraction above a bunch of other layers of abstraction. I will say that Big O makes possible some entertaining mental gymnastics.
On February 15 2019 16:56 solidbebe wrote:rather than just using something established. So it's probably correct to state that nobody actually uses it... in industry. To state that big O is a ridiculous concept however is a ridiculous statement. It is quite an important concept within the field of algorithmics, where people actually use it.
how is it used within algorithmics?
|
On February 16 2019 00:30 Blitzkrieg0 wrote:Show nested quote +On February 15 2019 16:56 solidbebe wrote:On February 15 2019 05:46 Excludos wrote:On February 15 2019 04:43 solidbebe wrote:On February 15 2019 03:13 Excludos wrote: Answer: Big O is a ridiculous concept and no one actually uses it. I mean... academics use it. It's a really useful concept, if you understand it and know how/when to apply it. Acrofales described above way better than I ever could. It's a theoretical concept which doesn't translate well into real life use. It's really useful on paper only Im not really sure what it means if something is useful 'on paper only'. What Acrofales describes looks like a perspective from industry. I can imagine big O being of very limited use in industry as you will rarely be developing new algorithms for anything, rather than just using something established. So it's probably correct to state that nobody actually uses it... in industry. To state that big O is a ridiculous concept however is a ridiculous statement. It is quite an important concept within the field of algorithmics, where people actually use it. *Edit before someone hits me with a 'academics also benchmark their algorithms', yes indeed. Big O is not the only thing you should use to describe an algorithm's performance. That doesn't mean it isn't useful. I'd say it is overemphasized as a concept that will be useful. You spend a ton of time learning and talking about it, but I've never seen it outside of school except in interviews. We know that academics teach it and it is used in academia. The question I would pose is: Is academia out of touch with industry and emphasizing a concept that most students will never use knowingly or unknowingly?
I think the notion that a university degree should prepare you for industry is out of touch. A bachelor + masters in computer science teaches you how to be an academic/scientist/researcher in the field of computer science. It does not teach you how to be a developer or work for a company; at least, that is not the main goal. There are other degrees or programmes that do that (at least in the Netherlands). I'm not sure how you were taught about Big O; maybe it was emphasized too much, I don't know. However, Big O is a useful concept that is widely used in the field of computer science, particularly algorithmics, cryptography, distributed systems... at least, that's where I've seen it used personally. My experience with industry is very limited, but I agree it is probably not a very useful concept for a developer most of the time. The ability to recognize when you are trying to solve something like the traveling salesman problem exactly in your business solution, and the knowledge that this is not practically doable for large enough inputs, can come in very handy though.
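Recognizing that moment is mostly about knowing the growth rate: exact TSP by brute force checks (n-1)! tours, so a sketch like this (hypothetical distance matrix) is instant at 4 cities and hopeless at 30:

```python
from itertools import permutations

def tsp_brute(dist):
    """Exact TSP over a symmetric distance matrix, fixing city 0 as start."""
    n = len(dist)
    def tour_cost(perm):
        route = (0,) + perm + (0,)
        return sum(dist[a][b] for a, b in zip(route, route[1:]))
    return min(tour_cost(p) for p in permutations(range(1, n)))

# 4 cities: (4-1)! = 6 tours to check; at 30 cities it's ~8.8e30 tours.
dist = [[0, 1, 4, 2],
        [1, 0, 3, 5],
        [4, 3, 0, 1],
        [2, 5, 1, 0]]
print(tsp_brute(dist))  # 7
```

Knowing the factorial blow-up in advance is exactly the kind of thing that tells you to reach for a heuristic instead of waiting on the exact answer.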
On February 16 2019 00:43 JimmyJRaynor wrote:Show nested quote +On February 15 2019 16:56 solidbebe wrote: I can imagine big O being of very limited use in industry as you will rarely be developing new algorithms for anything, rather than just using something established. I`ve developed three customer handicapping algorithms. The algorithm assigns a value to each consumer in marketing lists that an org wishes to "rent" for 1 time usage. This helps the org decide whether or not to "rent" a marketing list. I've never used Big O. Big O is theoretical... for theorists. Its a layer of abstraction above a bunch of other layers of abstraction. I will say that Big O makes possible some entertaining mental gymnastics. Show nested quote +On February 15 2019 16:56 solidbebe wrote:rather than just using something established. So it's probably correct to state that nobody actually uses it... in industry. To state that big O is a ridiculous concept however is a ridiculous statement. It is quite an important concept within the field of algorithmics, where people actually use it.
how is it used within Algorithmics ?
I'm not really sure what your algorithm actually does from your description, but yeah: it is perfectly possible to implement useful solutions without explicitly using the concept of Big O. Like I said, most of the time a developer probably doesn't even really need to consider the efficiency of their solution anyway.
I'm not sure what you mean by asking how it's used within algorithmics. If you've come up with a new algorithm and are publishing a paper about it, it is common to provide analysis of its runtime complexity through Big O, Big Theta, Big Omega, benchmarks, and what have you. These are all properties of algorithms that are very useful to know about. Yes, Big O is a theoretical concept; what's wrong with that?
To use a personal example: a while back I was parsing a dataset of about 300k records. For each unique IP address I needed to store relevant information, but multiple records could correspond to the same IP address. After implementing a dumb solution in 5 minutes, where I first made a pass to collect each unique IP, and then for each unique IP made another pass to collect its corresponding information, it ran for more than a minute before I stopped it. I realized I had just implemented a solution whose runtime scaled with the square of the input, when I could very simply do it in a single pass and get a linear solution. I didn't explicitly think of Big O notation, but I did basically use it.
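The two shapes of that solution look roughly like this (a sketch with hypothetical record tuples, since the real dataset isn't shown):

```python
from collections import defaultdict

records = [("10.0.0.1", "a"), ("10.0.0.2", "b"), ("10.0.0.1", "c")]

# Quadratic shape: one pass to find unique IPs, then a full rescan per IP.
unique_ips = {ip for ip, _ in records}
by_ip_slow = {ip: [v for r_ip, v in records if r_ip == ip]
              for ip in unique_ips}

# Linear shape: a single pass, accumulating into a hash map keyed by IP.
by_ip_fast = defaultdict(list)
for ip, v in records:
    by_ip_fast[ip].append(v)

assert by_ip_slow == dict(by_ip_fast)  # same result, very different scaling
```

At 300k records the quadratic version does on the order of 10^10 comparisons while the single-pass version does 300k dictionary inserts, which is the whole difference between "minutes" and "instant".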
|
I use big O when I see someone implement something that should clearly be O(n) in O(n^2), without any better constant factors or anything to justify it, which happens way too often. Just today I came across 2 more such cases...
I have seen a dictionary lookup implemented in O(n). Really. And that was basically just a wrapper for a proper .Net dictionary.
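That anti-pattern, sketched in Python rather than .NET (hypothetical names, same idea: a linear scan wrapped around a perfectly good hash table):

```python
d = {f"key{i}": i for i in range(1000)}

def lookup_slow(d, key):
    # O(n) "lookup": linearly scanning the keys of a hash table,
    # throwing away the whole point of the dictionary.
    for k in d:
        if k == key:
            return d[k]
    raise KeyError(key)

def lookup_fast(d, key):
    # O(1) average: what the dictionary is for.
    return d[key]

assert lookup_slow(d, "key999") == lookup_fast(d, "key999") == 999
```

Both return the same value, but the slow version turns every miss into a full scan, and it's exactly the kind of thing a one-second big-O glance catches in review.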
|
I meant more in the context of the question Travis posted, which has now been deleted, so it's probably harder to discuss. Most of the BigX questions I did in school were based around calculating an exact value, whereas in industry you only really care about the simplest questions: is this faster, and does this scale if things get big?
It also neglects the business side of things, where delivering something is often more important than it being efficient. Writing something inefficient that runs in the background exactly once isn't going to matter, and spending a week writing it efficiently isn't the best solution.
|