|
Thread Rules 1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution. 2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20) 3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible. 4. Use [code] tags to format code blocks. |
On May 21 2015 05:49 sabas123 wrote: Just finished my first week of my internship and holy fuck, I've never experienced something so mentally draining in my life.
I feel like just going to sleep after coming back home and doing nothing else. Has anybody else experienced this?
I did a two-month internship in the summer after my first year of college, in a completely different field than my major, and I experienced the exact same thing. It wasn't programming related, though. The stuff I needed to do and comprehend wasn't exactly hard (certainly not in hindsight), but the sheer amount of new information, the environment I'd never been in before, and some nervousness made it extremely exhausting for the first few weeks. It got better quickly (after about 3 weeks, if I recall correctly), and there hasn't been a comparable experience since; everything seems pretty light compared to it. So yeah, I can relate.
|
After over a year I'm still experiencing that. I guess it has to do with the high mental strain that's simply wearing you down from the inside. Even when it seems there isn't much to do, you're thinking about the code 99% of the time, and that's hard work for your brain.
And just you wait until you get to work on optimizing stuff... Constantly probing to find the next bottleneck, then trying to find ways of upgrading something that seemed perfectly fine. I was tasked with such a thing last week and this one. It's awesome to see app execution time go from 22.26s to 1.71s, but it's nerve-wracking.
|
On May 21 2015 08:17 Manit0u wrote: After over a year I'm still experiencing that. I guess it has to do with the high mental strain that's simply wearing you down from the inside. Even when it seems there isn't much to do, you're thinking about the code 99% of the time, and that's hard work for your brain.
Do you find you can still do other activities after work? But holy shit, even after a year? T_T
|
On May 21 2015 17:00 sabas123 wrote:
On May 21 2015 08:17 Manit0u wrote: After over a year I'm still experiencing that. I guess it has to do with the high mental strain that's simply wearing you down from the inside. Even when it seems there isn't much to do, you're thinking about the code 99% of the time, and that's hard work for your brain.
Do you find you can still do other activities after work? But holy shit, even after a year? T_T
I guess it depends on your work. I'm doing some charity programming after work for a friend, but it's hard (after a whole day of doing stuff, you really don't want to come home and do the same thing), and I've ceased all after-hours freelance work. If your work is relatively easy (deadlines still far off, etc.) then it's fine, but as soon as it gets intense and you have some extracurricular activities you absolutely must engage in, you're fucked.
The fact that I have 2 little kids doesn't help much, since after coming home the thing I'd be most interested in doing is just hitting the couch and watching a movie, or some other activity that isn't very engaging. Another thing that doesn't help is that I'm lucky to catch 5-6 hours of sleep per day: only after I'm done with all the chores and taking care of the kids can I finally do something for myself, and it gets late really fast.
|
That's normal; you should be tired. Programming is mentally draining.
It gets better over time as you get more comfortable with the tools and environment and people, and as you develop more hobbies on the side to keep your energy and expectations up. It should feel different after a month. It might be different for your first internship ever though, I remember sleeping on the bus commute a lot for my first and second. Nowadays I usually have enough energy to do a lot of different stuff after work as long as it's different from my day's work (too much C++ in the day means I'd rather do HTML or something).
|
On May 21 2015 05:49 sabas123 wrote: Just finished my first week of my internship and holy fuck, I've never experienced something so mentally draining in my life.
I feel like just going to sleep after coming back home and doing nothing else. Has anybody else experienced this?
Yes, when I started my internship.
You get used to the routine. Fortunately (unfortunately?) I had a 1.5-hour public transport journey each way to sleep on.
|
Some weeks I can't even look at code after work. Other weeks I have random bursts of motivation to work on side projects.
|
On May 22 2015 00:26 Rollin wrote:
On May 21 2015 05:49 sabas123 wrote: Just finished my first week of my internship and holy fuck, I've never experienced something so mentally draining in my life. I feel like just going to sleep after coming back home and doing nothing else. Has anybody else experienced this?
Yes, when I started my internship. You get used to the routine. Fortunately (unfortunately?) I had a 1.5-hour public transport journey each way to sleep on.
I have a 1.5-hour public transport journey as well, but I have to stand up and the buses are really crowded. For around 15 months I did not have the energy to go to the gym after work or do any side projects. I still miss the 3 months of summer holiday I had every year as a student.
|
On May 22 2015 02:42 Isualin wrote:
On May 22 2015 00:26 Rollin wrote:
On May 21 2015 05:49 sabas123 wrote: Just finished my first week of my internship and holy fuck, I've never experienced something so mentally draining in my life. I feel like just going to sleep after coming back home and doing nothing else. Has anybody else experienced this?
Yes, when I started my internship. You get used to the routine. Fortunately (unfortunately?) I had a 1.5-hour public transport journey each way to sleep on.
I have a 1.5-hour public transport journey as well, but I have to stand up and the buses are really crowded. For around 15 months I did not have the energy to go to the gym after work or do any side projects. I still miss the 3 months of summer holiday I had every year as a student.
Just think of the immense contributions to open source if programmers were given summer vacations.
|
*****ing Apache Thrift, WHO INVENTED THIS - it generates thousand-line Java files that are just an unreadable black box. Thank god online tutorials exist.
edit: context switching between MATLAB, Java, and C++ is hard
|
Hi guys, I have a general (and slightly vague) question.
Basically, I want to be able to convert JSON strings into SQL insert statements in a way that is flexible and reusable.
For example, say I have something like this:
{
  "authors": [
    {
      "author_id": "13",
      "first_name": "John",
      "last_name": "Doe",
      "books": [
        { "title": "Capture the flag", "ISBN": "123-456789-12345" },
        { "title": "Deathmatch", "ISBN": "123-456789-12346" }
      ]
    }
  ]
}
Then it's pretty easy to parse this JSON into SQL inserts in something like Python, with code like this (pseudo code):

for author in json['authors']:
    for book in author['books']:
        sql.execute(
            "INSERT INTO books (author_id, book_title, book_isbn) VALUES ("
            + author['author_id'] + ", '" + book['title'] + "', '" + book['ISBN'] + "');"
        )
so I can store the JSON data in a database for easy retrieval later on. The problem I see with this, however, is that each time I move on to a different problem and encounter a different JSON, I need to rewrite this kind of code all over again, and it's becoming really annoying. I want a more flexible solution.
Obviously I'm not stupid; I know the computer can't read my mind, so I can't just stuff a JSON in its mouth and expect SQL statements to come out. But I'm hoping for a solution where I can describe the data structure of the JSON (i.e. which key goes to which table) in some way, and then have a function that converts the JSON to SQL.
So my question is whether or not something like this already exists. I am currently working with Python.
If the answer to the question above is "no", then I want to make it happen.
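I'm not aware of an off-the-shelf Python library for exactly this, but as a rough illustration of what "describe which key goes to which table" could look like: a minimal sketch, where the mapping format and the helpers `iter_rows` / `insert_rows` are all made up for illustration. It walks nested lists, merges each enclosing object's scalar fields into its child rows, and emits parameterized inserts:

```python
import json
import sqlite3

def iter_rows(doc, list_keys):
    """Walk the nested lists named in list_keys, merging scalar fields
    from each enclosing object (e.g. author_id) into every leaf row."""
    def walk(obj, keys, inherited):
        if not keys:
            yield {**inherited, **obj}
            return
        scalars = {k: v for k, v in obj.items() if not isinstance(v, (list, dict))}
        for child in obj.get(keys[0], []):
            yield from walk(child, keys[1:], {**inherited, **scalars})
    yield from walk(doc, list_keys, {})

def insert_rows(conn, table, columns, rows):
    """columns maps SQL column name -> JSON key; values go in as '?' parameters."""
    sql = "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(columns), ", ".join("?" for _ in columns))
    conn.executemany(sql, [tuple(r.get(k) for k in columns.values()) for r in rows])

# The whole "which key goes to which table" description is just this dict:
BOOKS = {"table": "books",
         "rows": ["authors", "books"],
         "columns": {"author_id": "author_id",
                     "book_title": "title",
                     "book_isbn": "ISBN"}}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (author_id TEXT, book_title TEXT, book_isbn TEXT)")
doc = json.loads('''{"authors": [{"author_id": "13", "first_name": "John",
    "last_name": "Doe", "books": [
      {"title": "Capture the flag", "ISBN": "123-456789-12345"},
      {"title": "Deathmatch", "ISBN": "123-456789-12346"}]}]}''')
insert_rows(conn, BOOKS["table"], BOOKS["columns"], iter_rows(doc, BOOKS["rows"]))
```

With this shape, a new feed would only need a new mapping dict rather than new loop code; whether that pays off depends on how regular your feeds actually are.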
|
Hopefully you're doing this for academic reasons, because it doesn't sound like a great idea :p The closest thing I've heard of is prepared statements, if your DBMS supports them, which may or may not be what you're looking for.
Take a look at https://github.com/goodybag/mongo-sql
Take a look at https://github.com/goodybag/mongo-sql
|
On May 23 2015 11:18 Blisse wrote: Hopefully you're doing this for academic reasons, because it doesn't sound like a great idea :p All I've heard of are Prepared Statements if your DBMS supports it I think, which may be what you're looking for, or not... Take a look at https://github.com/goodybag/mongo-sql
How is it a bad idea? I am just curious.
I want to be able to quickly develop API miners where I can retrieve JSON data and store it in a database. It's really handy to be able to avoid repetitive work.
Anyway, what you linked is interesting! It's not in Python, but that's fine; I'll see what this library does and try to make my own.
|
Because you'd be writing a complicated JSON-to-SQL mapping mechanism to do something that isn't really that complicated.
Essentially, from what I gather from your requirements, you can just write an application-level wrapper around your DBMS with functions like insertBook(author_id, name, title, number), and then call that function from another class that parses your JSON data. When you get new JSON data, just make a new class that calls the same insertBook API but with different parsing logic.
This sounds like what you mean when you said
"But I am hoping for a solution in which I can describe the data structure of the JSON (i.e. which key goes to which table) in some sort of way and then have a function that can convert the JSON to SQL."
But I feel like you had something more complicated in mind?
What you describe as a "JSON to SQL" converter is really just another class/wrapper around your DBMS. Turn the JSON into input to functions, and then in the function turn the arguments into a SQL statement. Because the arguments are fixed, you don't have to change your SQL every time. And because JSON is innately very readable, it should be very quick and mostly a lot of copy & pasting.
What you seem to be proposing instead is a JSON parser you have to customize for every new JSON schema you encounter, which sounds like an unnecessarily difficult design. By the time you've built that custom description language for your possible inputs, it would have been much easier to write a quick for-loop for each new JSON schema that parses the JSON and calls your insertBook API.
So it's a bad idea unless you're doing it just to learn and see how it goes. It just doesn't seem practical. Or unless you think you'll be doing this hundreds of times.
|
On May 23 2015 13:19 Blisse wrote: Because you'd be writing a complicated JSON to SQL mapping mechanism to do nothing really that complicated. Essentially from what I gather from your requirements, you can just write an application level wrapper around your DBMS with functions like insertBook(author_id, name, title, number), and then just call that function from another class that parses your JSON data. When you get new JSON data, just make a new class that calls the same insertBook API but with different parsing logic. This sounds like what you mean when you said "But I am hoping for a solution in which I can describe the data structure of the JSON (i.e. which key goes to which table) in some sort of way and then have a function that can convert the JSON to SQL." But I feel like you had something more complicated in mind? What you describe as a "JSON to SQL" converter seems exactly just like another class/wrapper around your DBMS. Turn the JSON into input to functions, and then in the function turn the arguments into a SQL statement. Because the arguments are fixed, you don't have to change your SQL every time. And because JSON is innately very readable, it should be very quick and just a lot of copy & pasting. What it seems like you're proposing instead, is some JSON parser you have to customize for every new JSON schema you encounter, which sounds like an unnecessarily difficult and unnecessary design, because the time you spend building that customize language model to describe your possible inputs, it should be much easier to write a quick for-loop on the new JSON schema that parses the JSON and calls your insertBook API. So it's a bad idea unless you're doing it just to learn and see how it goes or how you would do it. Just doesn't seem practical. Or unless you think you'll be doing this >100s of times.
To be perfectly honest, I really may do this often. Some of the JSONs I am working with are really complex. In one particular use case, for example, I need to transform one JSON into inserts into around 12 tables, because there are SO MANY different layers in that JSON (I actually skipped some of the data and discarded it because I was too lazy to write everything out). It's absolutely crazy. It also discourages me from starting other projects that involve this kind of highly tedious code writing. For a side project I'm doing in my spare time, it feels absolutely terrible.
|
I don't know exactly what you're doing, but I don't see how you could ever reasonably write a perfectly generic JSON reader. And if it's not perfect, then you'd have to parse the JSON yourself anyway, and parsing the JSON manually is EXTREMELY straightforward and easy. All you need to do is feed your parsed data into another function that inserts into the table. I REALLY don't see where the improvements will come from at the moment because I have no clue what you're parsing or inserting, but judging from the code you've provided, I don't think you're doing things optimally right now anyway.
def insertBooks(author_id, title, isbn):
    statement = "INSERT INTO BOOKS (author_id, book_title, book_isbn) values ({0}, '{1}', '{2}');".format(int(author_id), title, isbn)
    sql.execute(statement)

def parse():
    for author in json['authors']:
        for book in author['books']:
            insertBooks(author['author_id'], book['title'], book['ISBN'])
If all parse steps look like that, it should be really fast to do, but again, I don't know what you're dealing with. Writing the above took like 2 minutes.
|
On May 23 2015 14:05 Blisse wrote:I don't know exactly what you're doing, but I don't see how you could ever reasonably write a perfectly generic JSON reader. And if it's not perfect then you'd have to parse the JSON yourself anyways, and parsing the JSON manually is EXTREMELY straightforward and easy. All you need to do is feed your parsed data into another function that inserts into the table. I REALLY don't see where the improvements will come from atm because I have no clue what you're parsing or inserting, but in my view from the code you've provided, I don't think you're doing things optimally yourself right now anyways? def insertBooks(author_id, name, title, isbn): statement = "INSERT INTO BOOKS (author_id, book_title, book_isbn) values ({0}, '{1}', '{2}');".format(int(author_id), title, isbn) sql.execute(statement)
def parse(): for book in json['authors']['books']: insertBooks(json['authors']['author_id'], book['title'], book['isbn'])
if all parse steps look like that it should be really fast to do, but again I don't know what you're dealing with. But again, writing that above took like 2 minutes.
For just 3 columns, I agree. But note that your code has two fields with quotes and one without. Also, your code will throw a SQL error if the JSON strings contain a single quote. And it doesn't handle missing fields well: if a field is missing, which ones can we ignore and which ones are severe enough that we need to stop everything? Now imagine there are more than 50 columns, each of which may have its own quirks. Do I need to construct all of these cases by hand?
This is why I want a more systematic approach. It makes my work faster and saves time because I will spend less time testing. It does seem like over-design; I don't disagree with you at all. But I still see the appeal of this.
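For what it's worth, the single-quote and missing-field problems are exactly what parameter placeholders are for. A minimal sketch, using sqlite3 and the thread's books table (the `REQUIRED` set and `insert_book` are assumptions for illustration): the driver does the quoting, and you decide explicitly which absent fields are fatal:

```python
import sqlite3

REQUIRED = {"author_id", "title"}  # assumption: fields we refuse to insert without

def insert_book(conn, record):
    missing = REQUIRED - {k for k, v in record.items() if v is not None}
    if missing:
        raise ValueError("missing required fields: {}".format(sorted(missing)))
    # '?' placeholders: the driver escapes values (apostrophes included)
    # and turns None into SQL NULL, so no hand-built quoting per column.
    conn.execute(
        "INSERT INTO books (author_id, book_title, book_isbn) VALUES (?, ?, ?)",
        (record["author_id"], record["title"], record.get("ISBN")))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (author_id TEXT, book_title TEXT, book_isbn TEXT)")
insert_book(conn, {"author_id": "13", "title": "O'Brien's Guide"})  # apostrophe + no ISBN: fine
```

With 50 columns this stays one statement with 50 placeholders plus one set of required fields, rather than 50 hand-quoted cases.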
|
On May 23 2015 17:10 Sufficiency wrote: On May 23 2015 14:05 Blisse wrote: I don't know exactly what you're doing, but I don't see how you could ever reasonably write a perfectly generic JSON reader. And if it's not perfect then you'd have to parse the JSON yourself anyways, and parsing the JSON manually is EXTREMELY straightforward and easy. All you need to do is feed your parsed data into another function that inserts into the table. I REALLY don't see where the improvements will come from atm because I have no clue what you're parsing or inserting, but in my view from the code you've provided, I don't think you're doing things optimally yourself right now anyways? def insertBooks(author_id, name, title, isbn): statement = "INSERT INTO BOOKS (author_id, book_title, book_isbn) values ({0}, '{1}', '{2}');".format(int(author_id), title, isbn) sql.execute(statement)
def parse(): for book in json['authors']['books']: insertBooks(json['authors']['author_id'], book['title'], book['isbn'])
if all parse steps look like that it should be really fast to do, but again I don't know what you're dealing with. But again, writing that above took like 2 minutes. For just 3 columns, I agree. But note that your code has 2 fields with quotes and one without quote. Also your code will throw a SQL error if the JSON strings have single quote. Also it doesn't do well when one of the fields is missing. If a field is missing, which ones can we ignore and which ones are severe enough that we need to stop everything? Now imagine there are more than 50 columns, all of which may or may not have their own perks. Do I need to construct all of these cases by hand? This is why I want a more systematic approach. It makes my work faster and saves time because I will require less time testing. It does seem like an over-design; I don't disagree with you at all. But I kind of see the perspect of this.
sqlTypes = {
    int: "int",
    str: "varchar(255)",
    # ...
}

def insert(table, obj):
    statement = "INSERT INTO " + table + " ( "
    statement += ", ".join(obj.keys())
    statement += " ) VALUES ( "
    statement += ", ".join(repr(val) for val in obj.values())
    statement += " )"
    return statement

def create(table, obj):
    statement = "CREATE TABLE IF NOT EXISTS " + table + " ( "
    statement += ", ".join(key + " " + sqlTypes[type(obj[key])] for key in obj)
    statement += " )"
    return statement

db = MySQLdb.connect("host", "user", "password", "database")
cursor = db.cursor()

with open("books.json") as file:
    objects = json.loads(file.read())
    cursor.execute(create("books", objects[0]))
    for obj in objects:
        cursor.execute(insert("books", obj))

Disclaimer: I have no idea what I'm doing with Python SQL, I just googled that shit. But isn't this pretty much what you're going for? Create a table with columns for each element of a dictionary, then insert the values for a bunch of those dictionaries into said table. Obviously it needs significantly more error checking and escaping (you can always pay me if you want real code), but it really doesn't seem like a problem that's that difficult; if I actually wanted to do this, I would spend a few hours making it work decently.
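The same generic-insert idea also goes through with parameter placeholders, so the values need no quoting logic at all. A sketch of that variant, using sqlite3 instead of MySQLdb (only the table and column names still have to be spliced into the string, since placeholders can't stand in for identifiers):

```python
import sqlite3

def insert(cursor, table, obj):
    # Column names come from the dict keys, one '?' per value;
    # the driver handles quoting/escaping of the values themselves.
    statement = "INSERT INTO {} ({}) VALUES ({})".format(
        table, ", ".join(obj.keys()), ", ".join("?" for _ in obj))
    cursor.execute(statement, tuple(obj.values()))

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE books (title TEXT, isbn TEXT)")
insert(cur, "books", {"title": "Capture the flag", "isbn": "123-456789-12345"})
```

Since Python dicts preserve insertion order, `obj.keys()` and `obj.values()` line up, so the columns and values can't drift apart.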
|
I'd just recommend using a different type of database, like a document database. That way you can store the JSON as... JSON. Easy. No mapping required.
|
On May 23 2015 19:27 supereddie wrote: I'd just recommend using a different type of database, like a document database. That way you can store the JSON as... JSON. Easy. No mapping required.
Yeah. You can do that in SQL by simply storing your JSON as either VARCHAR (if you don't need special characters), NVARCHAR (if you need special characters, but this limits the string to 4000 characters unless you use NVARCHAR(MAX), which can store strings up to 2GB) or a BLOB. You can simply save/retrieve the JSON and parse it in your app, which is a much better approach since you can change the data being stored without having to update your database at all (as in, not having to worry about adding/removing fields or changing their types).
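A sketch of that store-the-document-whole approach, using sqlite3 (where TEXT plays the VARCHAR role; the `documents` table name is made up):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT)")

doc = {"authors": [{"author_id": "13", "books": [{"title": "Deathmatch"}]}]}
# Serialize on the way in...
conn.execute("INSERT INTO documents (body) VALUES (?)", (json.dumps(doc),))
# ...and parse on the way out; the table schema never changes
# no matter how the JSON's structure evolves.
body = conn.execute("SELECT body FROM documents WHERE id = 1").fetchone()[0]
restored = json.loads(body)
```

The trade-off is that you lose the ability to query individual fields with plain SQL; it works best when you always read and write whole documents.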
|