Thread Rules

1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace Facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6-compatible.
4. Use [code] tags to format code blocks.
On September 03 2015 22:12 Manit0u wrote: Yeah, but I'm afraid of complicated maths affecting database performance when requests (and thus queries) start to ramp up.
So do it in your PHP instead of SQL. I have no idea which is going to be faster. As a general principle, I would expect PHP to do trigonometry faster than SQL, although you never know until you test.
I'd have to grab all rows from the db (at least the lat and long values for each row) and then do the calculations for each one of them. Sounds simple enough, but then you have to do the whole circus of having a collection, performing calculations for each element, removing the elements that didn't match the criteria and returning the collection.
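Just to make the shape of that concrete, a minimal sketch: the locations table, the lat/lng columns, and the distanceKm() helper are all assumptions ($pdo, $lat, $lng and $radiusKm are assumed to exist; one possible distanceKm() is sketched a few posts down).

[code]
<?php
// Pull everything, then filter in PHP. Table and column names are made up.
$rows = $pdo->query('SELECT id, lat, lng FROM locations')
            ->fetchAll(PDO::FETCH_ASSOC);

$nearby = array_filter($rows, function (array $row) use ($lat, $lng, $radiusKm) {
    // distanceKm() is a hypothetical haversine helper (see below).
    return distanceKm($lat, $lng, (float) $row['lat'], (float) $row['lng']) <= $radiusKm;
});
[/code]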
I feel like I would expect SQL to do it faster - I can't see SQL being *slow* at trig, and I feel like the overhead of selecting the ones you need in PHP would more than outweigh the cost of using SQL trig. I have literally no idea though.
Yeah, but the excerpt had no WHERE clause, so I (mistakenly) assumed he simply wanted all distances. If there's more to the query, then the only way to see speedups is to exhaustively test it.
Other than that, the only way I can think of to make it faster is declaring a MySQL procedure/function that does the calculation. You should then be able to compute the haversine term once and store it in a local variable; that way you don't have to compute it twice in the query. However, I don't know what the overhead of doing that kind of stuff in MySQL is.
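Something along these lines, for instance. This is an untested sketch: the function name haversine_km and the hard-coded Earth radius are my own, and it's wrapped in PHP/PDO since that's the stack here.

[code]
<?php
// One-time setup: a MySQL stored function that computes the haversine
// term once, keeps it in a local variable and reuses it.
$sql = <<<'SQL'
CREATE FUNCTION haversine_km(lat1 DOUBLE, lng1 DOUBLE, lat2 DOUBLE, lng2 DOUBLE)
RETURNS DOUBLE DETERMINISTIC
BEGIN
    DECLARE a DOUBLE;
    SET a = POW(SIN(RADIANS(lat2 - lat1) / 2), 2)
          + COS(RADIANS(lat1)) * COS(RADIANS(lat2))
          * POW(SIN(RADIANS(lng2 - lng1) / 2), 2);
    RETURN 2 * 6371 * ATAN2(SQRT(a), SQRT(1 - a)); -- 6371 km: mean Earth radius
END
SQL;

$pdo->exec($sql);
[/code]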
EDIT: I was going to suggest using the spherical law of cosines instead of haversine if your distances are large enough, but I checked, and float precision in MySQL is actually shockingly low. If precision is important, stick with haversine. If precision ISN'T important and a rough estimate is okay, then you can use pythagoras for most stuff, unless you are computing distances at very high latitudes. Here's an overview of your options (and the formulas): http://www.movable-type.co.uk/scripts/latlong.html
By taking the arcsin rather than the arctan you run into precision problems for small distances (somewhere around a metre, according to Stack Exchange). If that's not an issue, carry on. If it is, use the arctan instead:
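In PHP terms (since that's the other candidate here), the difference is only the last line. A sketch:

[code]
<?php
// Haversine distance in km between two points given in degrees.
function distanceKm($lat1, $lon1, $lat2, $lon2)
{
    $r = 6371.0; // mean Earth radius in km
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = pow(sin($dLat / 2), 2)
       + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * pow(sin($dLon / 2), 2);
    // asin(sqrt($a)) loses precision when $a is very close to 0 or 1;
    // atan2 does not.
    return 2 * $r * atan2(sqrt($a), sqrt(1 - $a));
}
[/code]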
On September 03 2015 23:56 Acrofales wrote: Yeah, but the excerpt had no WHERE clause, so I (mistakenly) assumed he simply wanted all distances. If there's more to the query, then the only way to see speedups is to exhaustively test it.
I kind of forgot to explain the query and blindly posted the DQL.
What it's supposed to do is return all rows that match the criteria (all locations whose distance from the target is equal to or less than a given radius in km).
Precision isn't super important (it's used to display all shops/offices/whatever within a certain radius).
I did some testing, and PHP won't be good for that. Even though you can split it into chunks that are more understandable and easier to debug, you run into real trouble when you try to process an array containing thousands of objects.
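For what it's worth, pushing it all into SQL can look something like this: a sketch assuming the haversine_km() function from above and made-up table/column names. MySQL lets HAVING reference a SELECT alias, so the distance expression is only written once.

[code]
<?php
// Radius search done entirely in MySQL; only matching rows come back.
$stmt = $pdo->prepare(
    'SELECT id, name, lat, lng,
            haversine_km(:lat, :lng, lat, lng) AS distance_km
     FROM locations
     HAVING distance_km <= :radius
     ORDER BY distance_km'
);
$stmt->execute(['lat' => $lat, 'lng' => $lng, 'radius' => $radiusKm]);
$nearby = $stmt->fetchAll(PDO::FETCH_ASSOC);
[/code]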
Any chance you can use Google Services and just geofence it? :D
This kind of thing is better done with a dedicated GIS system than an SQL backend.
Anyway, if approximate results are enough, use pythagoras. Way simpler and faster.
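The "pythagoras" version from the movable-type page (the equirectangular approximation) is genuinely tiny. A sketch, same hypothetical helper style as before:

[code]
<?php
// Equirectangular approximation: treat the patch of the globe as a flat
// plane, scaling longitude by cos(latitude). Fine for small radii away
// from the poles; degrades near them.
function approxDistanceKm($lat1, $lon1, $lat2, $lon2)
{
    $x = deg2rad($lon2 - $lon1) * cos(deg2rad(($lat1 + $lat2) / 2));
    $y = deg2rad($lat2 - $lat1);
    return 6371 * sqrt($x * $x + $y * $y); // 6371 km: mean Earth radius
}
[/code]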
Ehh... It's just a client who wants to add stores/whatever to the db. Put in the address, grab lat/long from Google, and people can then search for nearby stores on his website.
Using a WITH clause (or something like that) to assign RADIANS($latitude) and RADIANS($longitude) to variables would probably help a bit. Also, can't you pass these variables in a safer way (that is, without string concatenation)?
Yeah, I'm going to parametrize that and probably use custom DQL functions for it anyway. The code here is what I took from an old system where they built all their queries like that (concatenate strings and send them as plain SQL). Quite a bit of work ahead of me.
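For the parametrized Doctrine route, the usage side could look roughly like this. Heavily assumed: the Location entity, and a custom DQL function HAVERSINE_KM registered via Configuration::addCustomNumericFunction(). Note that DQL allows result aliases in HAVING and ORDER BY, but not in WHERE.

[code]
<?php
// Parametrized DQL instead of string concatenation. Entity name and the
// custom HAVERSINE_KM DQL function are assumptions.
$query = $em->createQuery(
    'SELECT l, HAVERSINE_KM(:lat, :lng, l.lat, l.lng) AS distance
     FROM AppBundle\Entity\Location l
     HAVING distance <= :radius
     ORDER BY distance ASC'
);
$query->setParameter('lat', $lat)
      ->setParameter('lng', $lng)
      ->setParameter('radius', $radiusKm);

$nearby = $query->getResult();
[/code]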
On September 04 2015 09:43 Crying wrote: Anyone here interested in the P vs NP problem? What are your thoughts on it?
Haha, sounds like you just had your first class in complexity theory; we all get excited about that at first. After a while, "P vs NP" loses a bit of its charm, unfortunately. The most likely answer is that P != NP, and there are literally hundreds of other "likely" results that have been proven "assuming P != NP". And people are not really researching things that assume "P = NP", let's be honest here :D. The effort required to research P vs NP isn't actually worth the rewards. Tentative proofs (all for the != case, of course) are always super long (big fat complicated books) and basically uncheckable. The only exciting thing that could happen in research is if someone actually produced an algorithm for the reduction, and it could be checked for correctness. The "almost impossible" scenario is someone actually having a program that does it in reasonable time; then you can think about all the sci-fi/thriller/cataclysm shit again!
"P vs NP" being unsolvable is likely due to this classes theory being flawed in the first place. Think about it: it's not really relevant to the real world, where anything longer than cubic is already unmanageable ;D.
If you're still interested, watch the movie Traveling Salesman. My friends and I had a lot of fun watching it a few years back (we were all computer science graduates, obviously; don't watch it with people who don't know CS at all, it's just a bad movie in that case :D).