Thread Rules

1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace Facebook with 3 dudes you found on the internet and $20).
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
Man... I've started working for a pretty big company recently and even though I'm excited about what I'll be doing I'm also super disappointed in some stuff.
Company: What kind of computer and OS do you want to work with?
Me: Linux.
Company: Psyche! We'll send you a Mac anyway.
Not only that, but it's been almost 2 weeks and I can't even use it. First they forgot to provide me with credentials for their internal systems, and for a week now the IT department hasn't gotten around to resetting my password so I could actually use the laptop and start working... I guess the downside of working for really big companies is that there's a ton of red tape, and apparently simple things take forever.
Yep! Big companies have a lot of friction in the wheels. There's also so much bureaucracy, with decisions going up many levels of admin for no apparent reason.
You'll find that when you submit your first major change it will go to the steering committee or similar. You'll have to wait a week or two to put your change into production. Why, are they going to request a change or something? Are they going to smooth out conflicts with other deployments? Nope! They just want to know what's going into production. Are they going to let people know what's going in? Again, nope! They just want to know for knowledge's sake, and they're going to kill your productivity just for the appearance of being useful.
Yeah, that probably depends a bit on the structure of the company though.
I work for a very large company, but the management structure is wide instead of deep. As a developer there's only a couple hops between me and the head of my business unit / product group.
I find larger changes can still be hard to push through, but mostly because the organization is somewhat risk averse and has to spend development time carefully. So you end up doing a little more work upfront to pitch the value of whatever you want to change to the architecture group and then might have to wait a bit before it can be prioritised for a team to work on. This is a symptom of 'too much to do, too few people to do it' more than anything.
That's on the development side. Now try interacting with centralized global IT, HR, payroll, etc. Good luck! That's always been a huge pain point for me.
Yeah, that's probably a bit more of a tech-heavy company issue, where people understand how to work with tech, and the company has good established processes to release value at high velocity with low defects.
I'm a consultant and I've worked at some of those sorts of companies before. They can be hectic and demanding, because they have high standards, but they're also enjoyable to work for because you're not spending your time on stupid stuff that could be automated or that simply wouldn't be an issue with better processes.
I'm talking more about the legacy large companies that haven't modernized, where you might submit your change, and there's no automated testing, so you need to wait 2 weeks for the QA team in India to approve it, then you need to submit your changelist to the staging environment, which takes another week, then you need to deploy to production, which takes another week. All of these steps are manual, none of them add value, and they all require sign-off.
So you end up waiting 4 weeks after your changes are done before they're in production. It's the exact opposite of the "too much to do, too few people to do it" issue. It was nice when I started my career, because I could read about programming and practice in all the free time. When you're a bit further along and would like to actually accomplish things, a 4-week cycle can be very painful.
On November 13 2020 20:00 Manit0u wrote: Not only that but also it's been almost 2 weeks and I can't even use it since first they forgot to provide me with credentials...
Sounds like a good topic for a Dilbert cartoon.
If you have nothing to do, start making your own product. Whatever need you can see, fill it. I have an icon management tool and a report builder on the go. It's a great way to make money on the side and possibly replace the income from your full-time employer.
On November 12 2020 03:29 Manit0u wrote: POODR is actually a really good book. In any case, most of the problems with OOD/OOP stem from the fact that it was implemented wrongly along the way (basically, people used classes when they wanted modules). It doesn't help that most languages that are self-proclaimed OOP languages are not (Java, I'm looking at you).
Now, if you want to really grasp OOP you should take a true OOP language like Ruby, where everything is an object and there are no primitives (nil is also an object in Ruby, as are true and false, which are instances of TrueClass and FalseClass rather than a shared Boolean class). Ruby also uses modules heavily and to great effect, allowing it to perform some true "magic" that you simply cannot achieve with other languages (at least not the popular ones; I think you can do anything in Lisp, since from what I could find it's the most feature-rich language in existence).
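A tiny sketch of what that looks like in practice. The class and module names here are made up for illustration, but the "everything is an object, compose with modules" idea is the point:

```ruby
# Every value is an object; there's no primitive/object split
# and no shared Boolean class.
p nil.class    # NilClass
p true.class   # TrueClass
p false.class  # FalseClass
p 42.class     # Integer

# Modules mix behaviour into classes without inheritance:
module Greetable
  def greet
    "Hello, #{name}!"
  end
end

class Person
  include Greetable
  attr_reader :name

  def initialize(name)
    @name = name
  end
end

puts Person.new("Ada").greet   # Hello, Ada!
```

Because `greet` lives in a module, any class with a `name` method can include it, which is the "use modules where others reach for class hierarchies" style the book pushes.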
good points.
This is a decent diagnosis, and those are real contributing factors. However, I think there is another major one: something I call "a lack of mathematical maturity" with the technologies the designers use. I'll provide an extreme case to make my point. I have 17 years of experience, more than half my life, in one programming language and database platform, with more than 12,000 hours of cumulative work time in them. It's no surprise that my biggest leaps of improvement in OOD and OOP came from the technologies I've used the most. I'd say other developers follow the same trend I experience: their best work is done in the dev environment where they are most comfortable and familiar.
Many developers never attain 5+ years and 5,000 hours of focused work time in a specific language on a specific platform. This makes it harder for them to get solid with OOD and OOP.
On November 13 2020 10:06 tofucake wrote: you may not need the coding interview book at all. I've never had an interview that required coding for an actually good job (one I did have was for a job crafting email campaigns, yuck). The "coding" in interviews is never anything practical that you'll do on the job; it's just a second line of defense against people who bullshitted through the phone screen. If you can program, you generally don't need to worry about it.
Things are different for SV but SV sucks and all their "coding interviews" just throw leetcode problems at you.
Even though I agree with you that coding interviews are often dumb, you will run into them a lot.
Cracking the Coding Interview does help you prepare for them by giving you more of a street-smart honing of your knowledge, with lots of focus on things like the different types of lists, polymorphism, maps, and a bit more understanding of how to apply them. I think the most useful thing it taught me was that if you get asked to implement just about anything efficiently, the answer will almost always be a hash map.
On November 13 2020 08:44 WombaT wrote: Hi folks, quite early in my programming journey (yay undergrad number 2). Things are still rudimentary level wise currently, looking forward to clearing this semester and pushing on, really quite into it thus far and have some personal projects I’d like to do.
One thing I’ve found thus far is that I learn way, way better from textbooks than from online tutorials and the like. I hadn’t quite expected this, but whether it’s how the information is organised, the medium, or keeping me away from distractions, it’s definitely proving more effective for me so far.
Was wondering if you guys had any weighty textbooks you’d recommend. I think I’ll only be using Python, Java and C in the foreseeable future, but I'm open to other languages if there’s a good reason (I've heard Ruby is quite rigid in terms of OOP, so it's good for refining your OOP chops, for example), or just more general books that you all swear by.
Thanks in advance!
Here are some of the textbooks I found most useful from undergrad to where I am now. I work in systems programming, doing a fair bit of low-level driver development and network programming, in an application domain where you also need strong knowledge of distributed systems.
A second vote for CLRS. For learning algorithms it's the go-to.
Operating System Concepts by Silberschatz. I've found having a strong knowledge of systems and how the OS interacts with programs to be very useful. I ended up reading this cover-to-cover as part of my operating systems course, and I consider it one of the most useful courses I took. I've also heard good things about the Tanenbaum OS book.
Computer Architecture: A Quantitative Approach by Hennessy and Patterson. I find the way this book is presented very good, in that design decisions are contrasted and compared empirically. So in addition to giving strong fundamentals in architecture, it also shows a good approach for evaluating and designing systems.
If you have an interest in reinforcement learning, I'd recommend Reinforcement Learning: An Introduction by Sutton and Barto. This is a very good introduction, and is complementary to other statistical machine learning books if you're interested in the area.
If you're interested in learning more about theory, I'd recommend Introduction to the Theory of Computation by Sipser. This is a pretty standard textbook in the area and should be complementary to the CLRS treatment of this topic.
For interview-style questions, or just for fun, I really like Competitive Programming by Steven and Felix Halim. It covers a range of algorithm types and also presents full code examples designed to be time-efficient to implement. I did 'competitive programming' for fun during university, basically programming solutions to algorithms questions under time constraints. I found the skills I developed doing that had a fair amount of crossover with what I consider to be a common (but poor) interview style of throwing algorithms questions at candidates.
Otherwise for books that I use more as a reference now and then but haven't read in full I like:
Advanced Programming in the Unix Environment by Stevens
TCP/IP Illustrated, also by Stevens
The Linux Programming Interface by Kerrisk
Is the Computer Architecture one about the foundational model of computing hardware architecture? I.e. what you would find in 3rd year university? I had a course on architecture in which we learned MIPS assembly, the stack, the heap, etc., and it helped me a ton in understanding how computers work, why we see these different efficiency issues, how to debug a stack properly, etc. To anyone interested in becoming a really good programmer I would recommend learning architecture and doing a small amount of assembly, though it probably wouldn't be needed if you're just doing frontend.
In general I stayed away from recommending more niche books like the ones you recommend on OSes, reinforcement learning, etc., since those tend to be deeper dives on something you can put in your toolbox, whereas I tried to focus on books that cover a wider swath of the field, i.e. data structures and algorithms. It's hard as a 2nd-semester student to be thinking about something like OSes when you don't even know how a processor works, for example. Generally I would expect a student to start choosing a niche towards the end of 2nd year or in 3rd year, at which point learning OSes is great, if that's what you chose to go towards.
On November 13 2020 06:59 WarSame wrote: Hmm, I'm honestly having a hard time picturing the scenarios you're talking about, since I've never worked in any super response-time-heavy industries. I imagine you mean they cannot afford any abstractions at all and are written in a compiled language, very close to the metal. So I would imagine you're talking about some C++ or Rust or C system where that low level of control is required.
I actually did work in banking for a bit using C but the response time requirements were not there. I don't even think they tested for it, as long as manual testing wasn't ridiculously slow.
Well, I was working at such a place last year and they were not using C that much. There was some C++, but not much (mostly for some server-side calculations), since it all ran over the internet with plenty of microservices and third-party integrations (everything was there: C++, Python, Ruby, Scala, Java, Swift, what have you). Sure, you could write it all in assembly if you really wanted, but some pieces of this software have to be worked on by different people over the course of many years, constantly being upgraded to newer standards, language versions, etc. (technical debt is a company killer, after all), so there's still a pretty high level of abstraction. You just strip it away in some bottleneck parts (you either strip away the abstractions there, or you strip out that entire functionality and move it to a different tech stack altogether).
I agree with you overall, that you can use any languages, especially when using microservices.
However, I would argue that anything which uses microservices, third-party integrations, different languages where some are Garbage Collected, etc. is not a super response time dependent system.
Microservices mean you suffer the TCP/IP overhead for each bounce, which I think is a few ms for Time To First Byte.
Third party integrations mean TCP/IP, security and networking layers, and uncontrollable and often unreliable dependencies.
Different languages, especially with GC, increase variability and decrease speed.
So I would say if you can use those things then you don't really fit into what I picture as super response time dependency.
On a related note, I did some work where a company had split their 3 servers and their DB, previously all on the same machine, onto separate machines, and saw a HUGE hit to their response times along with a large amount of data being transferred.
I investigated and found their huge latency came from doing ~70 round trips to the DB to build the page. A single TCP TTFB isn't much, but over 70 round trips it really adds up. They were also doing SELECT * queries on some fairly large objects (~90 fields) for about half of those trips.
They hadn't noticed any of this when all were on the same server because the round trip time is ~0ms and there is no data transfer over the wire.
My suggestion was to use a Single Page Application or something similar to reduce the ~70 items being fetched down to something manageable like 5. I also, obviously, suggested not using SELECT * queries. They've started working on it, but I haven't heard back how it's gone.
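The arithmetic behind that anecdote is worth making explicit. The numbers below are assumptions for illustration (a ~3 ms same-datacenter round trip plus ~1 ms of query time), not measurements from that system, but they show why the move to separate machines hurt so much:

```ruby
# Back-of-the-envelope cost of chatty DB access.
# ROUND_TRIP_MS is an assumed same-datacenter network round trip;
# per_query_ms is assumed query execution time.
ROUND_TRIP_MS = 3.0

def page_latency_ms(round_trips, per_query_ms: 1.0)
  round_trips * (ROUND_TRIP_MS + per_query_ms)
end

puts page_latency_ms(70)  # 280.0 ms -- the chatty version
puts page_latency_ms(5)   # 20.0 ms  -- after batching
```

On a single machine the round-trip term is effectively zero, so the 70-trip design looked free; add a real network hop and the same code pays for every trip.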
WarSame wrote: So I would say if you can use those things then you don't really fit into what I picture as super response time dependency.
You would be surprised! Plenty of time-dependent systems use microservice architecture (some of those systems are simply too large and too complex to live in a single place), and a lot of them use sub-optimal languages when it comes to speed. I've worked for 2 very big companies now that use Ruby at the core of their app; it's pretty slow and non-concurrent, but the ease of development and the ability to apply changes quickly is too good a benefit to pass up.
But that's also where the real hardcore stuff starts, since then you design your systems to run almost entirely in the cloud, with most of the microservices being lambdas, and you use many layers of heavy caching (where around 99% of your traffic hits the cache).
Imagine that:
1. Ruby app
2. ~80 million requests/day
3. over 4.5 billion users in the db
4. delivering ads in an average time of ~150ms

where every request has to go through several services and load balancers, fetch ads from the marketplace, filter ads (age, country, already seen, other rules), check for fraud, validate the user, check events with Kafka, etc.
It is doable, but it's not easy, and usually your bottleneck turns out to be the databases (Redis was too slow for simple things in the app mentioned above, so they used Aerospike instead). In general, for 99% of use cases you don't need highly performant, concurrent languages unless your code has to do some serious computations (the system I mentioned had 2 or 3 microservices using C++ for that), because it's a very long road to the point where code execution speed is your primary concern.
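The "99% of traffic hits the cache" point is what makes a slow language viable at that scale. A quick sketch of the math, with assumed latencies (2 ms for a cache hit, 150 ms for the full ad-serving path on a miss):

```ruby
# Average response time as a function of cache hit rate.
# cache_ms and origin_ms are illustrative assumptions, not
# measurements from the system described above.
def avg_latency_ms(hit_rate, cache_ms: 2.0, origin_ms: 150.0)
  hit_rate * cache_ms + (1.0 - hit_rate) * origin_ms
end

avg_latency_ms(0.99)  # ~3.5 ms average
avg_latency_ms(0.50)  # 76.0 ms average
```

At a 99% hit rate the origin's language barely matters for the average case; the cache layer, not the Ruby code, dominates the latency budget.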
WarSame wrote: Is the Computer Architecture one about the foundational model of computing hardware architecture? I.e. what you would find in 3rd year university? ...
Yes, the architecture textbook was used in a fourth-year split undergraduate/graduate level course. There wasn't much assembly involved other than for understanding how instruction decoding, dispatch, etc. worked. The course and textbook were much more focused on CPU architecture and the design decisions there, along with how they would affect the running of programs. So an assignment might be something like implementing several different branch prediction strategies in a CPU simulator and seeing how that affects prediction rate and overall performance on a few different workloads/programs.
We had two earlier architecture courses available as well. One focused more on understanding the basic pieces of computer architecture and assembly programming, and another that focused more on hardware implementation and digital circuits using a hardware description language.
The OS books are actually quite general and don't focus on any particular operating system or kernel. This was a third year course where subjects like process management, threads, memory subsystems, system call interfaces, kernel/userspace split, and such were taught more in depth than the surface knowledge needed for something like an introductory course using C or C++.
I think this sort of OS knowledge is pretty fundamental. Being able to understand the trade-offs between multiprocessing vs multithreading, process scheduling, memory management concerns such as working set size, page misses, cache friendliness, and fragmentation, trade-offs between different sorts of interprocess communication, and similar subjects is something I find generically useful across a wide range of applications.
This is mostly coloured by working in systems programming. I believe this is probably still useful to know otherwise, but I can't personally vouch for how applicable it is in other sorts of roles or application domains. I'm putting it out there as what I personally found most useful coming out my program for what I do now.
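A minimal sketch of one of those trade-offs, the multiprocessing vs. multithreading one, in Ruby (assumes a Unix-like OS, since it uses fork): threads share the parent's memory directly, while a forked process gets its own copy and can only hand data back through explicit IPC, here a pipe.

```ruby
# Threads share the parent's address space, so a thread can
# mutate objects the main thread sees (with the usual caveats
# about synchronising concurrent writes):
shared = []
Thread.new { shared << :from_thread }.join
p shared  # [:from_thread]

# A forked child gets its own copy of memory; mutations there
# don't propagate back. Data crosses only via explicit IPC:
reader, writer = IO.pipe
pid = fork do
  reader.close
  writer.write("from_child")
  writer.close
end
writer.close
Process.wait(pid)
child_msg = reader.read
reader.close
p child_msg  # "from_child"
```

Which side of that trade-off you want (cheap shared state vs. isolation and crash containment) is exactly the kind of question the OS coursework prepares you to answer.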
@Warsame and Mr Wiggles thanks very much, that’s exactly the kind of stuff I’m looking for. I actually have Clean Code and it seems very good, the kind of thing that will reap dividends and value over time, like getting your 1/1 upgrades. At present I’m very much in the weird cheese phase of a SC game and just trying desperately to survive in the short term.
I’m finding my current juggling act rather tricky: I still have to work to finance studying, give up a minimum of a day a week to hang with the kiddo (often more), and keep my grades up too. Plus my mental health isn’t great in lockdown number 2 either.
On the plus side, I do really enjoy the subject; it’s not a chore at all and I like the problem solving. So really I’m just trying to keep things together until the end of the semester, then really push forward independently when I have a few weeks’ break.
On November 13 2020 10:06 tofucake wrote: you may not need the coding interview book at all. I've never had an interview where I had to code that was for an actually good job (one I did have was for a job crafting email campaigns, yuck). The "coding" in interviews is never anything practical that you'll do on the job, they are just the second line of defense against people who bullshitted through the phone screen. If you can program, you generally don't need to worry about it.
Things are different for SV but SV sucks and all their "coding interviews" just throw leetcode problems at you.
Even though I agree with you that the coding interviews are often dumb you will run into them often.
Cracking the Coding Interview does help you prepare for them by giving you more of a "streets-smart" honing of your knowledge, with lots of focus on things like different types of Lists, polymorphism, Maps, and a bit more of an understanding on how to apply these different things. I think the most useful thing that it taught me was that if you get asked to implement just about anything efficiently the answer will almost always be to use a Hash Map.
On November 13 2020 13:45 Mr. Wiggles wrote:
On November 13 2020 08:44 WombaT wrote: Hi folks, quite early in my programming journey (yay undergrad number 2). Things are still rudimentary level wise currently, looking forward to clearing this semester and pushing on, really quite into it thus far and have some personal projects I’d like to do.
One thing I’ve found thus far is I learn way, way better from textbooks than online tutorials and the likes. I hadn’t quite expected this but whether it’s how the information is organised, the medium, or keeping me away from distractions it’s definitely providing more effective for me thus far.
Was wondering if you guys had any weighty textbooks you’d recommend, I think I’ll only be using Python, Java and C in the foreseeable future but open to other languages if there’s a good reason (heard Ruby is quite rigid in terms of OOP so is quite good to refine your OOP chops for example), or just more general books that you all swear by.
Thanks in advance!
Here's some of the textbooks I found most useful through undergrad to where I am now. I work in systems programming doing a fair bit of low level driver development and network programming, in an application domain where you also need strong knowledge of distributed systems.
A second vote for CLRS. For learning algorithms it's the go-to.
Operating Systems Concepts by Silberschatz. I've found having a strong knowledge of systems and how the OS interacts with programs to be very useful. I ended up reading this cover-to-cover as part of my operating systems course, and I consider it one of the most useful courses I took. I've also heard good things about the Tanenbaum OS book.
Computer Architecture: A Quantitative Approach by Hennessy and Patterson. I find the way this book is presented is very good in that design decisions are contrasted and compared empirically. So in addition to giving more strong fundamentals in architecture, it also shows a good approach for how to evaluate and design systems.
If you have an interest in reinforcement learning, I'd recommend Reinforcement Learning: An Introduction by Sutton and Barto. This is a very good introduction, and is complementary to other statistical machine learning books if you're interested in the area.
If you're interested in learning more about theory, I'd recommend Introduction to the Theory of Computation by Sipser. This is a pretty standard textbook in the area and should be complementary to the CLRS treatment of this topic.
For interview style questions or just for fun, I really like Competitive Programming by Steven and Felix Halim. It covers a range of algorithms types and also presents full code examples to solve them that are designed to be time-efficient to implement. I did 'competitive programming' for fun during university, so basically programming solutions to algorithms questions under time constraints. I found the skills I developed doing that had a fair amount of crossover with what I consider to be a common (but poor) interview style of throwing algorithms questions at candidates.
Otherwise for books that I use more as a reference now and then but haven't read in full I like:
Advanced Programming in the Unix Environment by Stevens TCP/IP Illustrated, also by Stevens The Linux Programming Interface by Kerrisk
Is the Computer Architecture one about the foundational model of computing hardware architecture? I.e. what you would find in 3rd year university? I had a course on Architecture in which we learned MIPS assembly, the stack, the heap, etc. and it helped me a ton in my understanding of how computers work and why we see these different efficiency issues, how to debug a stack properly, etc. To anyone interested in becoming a really programmer I would recommend learning architecture and doing a small amount of assembly, though it probably wouldn't be needed if you're just doing frontend.
In general I stayed away from recommending more niche books like the ones you recommend on OSes, Reinforcement Learning, etc., since those tend to be deeper dives on something you can put in your toolbox, whereas I tried to focus on books that cover a wider swath of the field, like Data Structures and Algorithms. It's hard as a 2nd-semester student to be thinking about something like OSes when you don't even know how a processor works, for example. Generally I would expect a student to start choosing a niche towards the end of 2nd year or in 3rd year, at which point learning OSes is great, if that's what you choose to go towards.
I think this sort of OS knowledge is pretty fundamental. Being able to understand the trade-offs between multiprocessing and multithreading, process scheduling, memory management concerns such as working set size, page faults, cache friendliness, and fragmentation, and the trade-offs between different sorts of interprocess communication is something I find generically useful across a wide range of applications.
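The multiprocessing vs multithreading trade-off is easy to see in a toy benchmark. A minimal sketch in Python, assuming CPython (the names `busy` and `timed` are mine, not from any of the books mentioned):

```python
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def busy(n):
    # CPU-bound work: a pure-Python loop that holds the GIL while it runs
    total = 0
    for i in range(n):
        total += i
    return total

def timed(executor_cls, n=500_000, workers=4):
    # Run the same CPU-bound task on several workers and time the whole batch
    start = time.perf_counter()
    with executor_cls(max_workers=workers) as ex:
        results = list(ex.map(busy, [n] * workers))
    return results, time.perf_counter() - start

if __name__ == "__main__":
    # Threads share one interpreter (and, in CPython, one GIL), so
    # CPU-bound work barely overlaps; processes pay a higher startup
    # and IPC cost but can actually run in parallel.
    _, t_threads = timed(ThreadPoolExecutor)
    _, t_procs = timed(ProcessPoolExecutor)
    print(f"threads:   {t_threads:.3f}s")
    print(f"processes: {t_procs:.3f}s")
```

For I/O-bound work the trade-off flips: threads (or async) tend to win because workers spend most of their time blocked, and process startup and IPC overhead dominate.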
Ah I see why you were recommending it then. I had these topics covered across my hardware architecture and distributed processing courses, but you're right that these are very important topics for anyone working on the backend.
On November 15 2020 10:07 WombaT wrote: @Warsame and Mr Wiggles thanks very much, that’s exactly the kind of stuff I’m looking for. I actually have Clean Code and it seems very good, the kind of thing that will reap dividends and value over time, like getting your 1/1 upgrades. At present I’m very much in the weird cheese phase of a SC game and just trying desperately to survive in the short term.
I’m finding my current juggling act rather tricky: I still have to work to finance studying, give up a minimum of a day a week to hang with the kiddo (often more), and keep my grades up too. Plus my mental health isn’t great in lockdown number 2 either.
On the plus side I do really enjoy the subject, so it’s not a chore at all. I like the problem solving, so really I’m trying to keep things together until the end of the semester and really push forward independently when I have a few weeks’ break.
Thanks again folks!
It's a pleasure to help!
The situation definitely doesn't sound easy, especially with a kid in the picture. I was privileged to be able to attend university full time without worrying about the finances, living with my parents (who are great to get along with), and getting to hang out with my friends and family and play sports. At the times in my life when that hasn't been the case, my grades have dropped, sometimes by 20%.
The lockdowns are frankly going to be huge mental health killers. I would not blame you in the slightest if you start struggling to manage your course load. Generally profs should be fairly understanding about giving you allowances for late submissions, though they may also be overwhelmed now. It could be worth considering delaying some of your courses until after the pandemic ends for your own sanity. There's no point in rushing it if you can't enjoy it at the other end.
The subject is amazing for anyone with a more problem solving or system oriented brain, which is why there's so damn many nerds in it
When is the end of your semester? There may be resources to pause some of your courses if you're finding it a bit much, especially if they're online now. I found when I couldn't focus fully on my courses my grades would suffer and my complete understanding of the subject would suffer even more.
On November 14 2020 06:45 Manit0u wrote: If you like hearing about some high-level concepts and would like to know more about the history of OOP there's a pretty neat presentation:
I like how the entire presentation went from talking about FP and turned into a deep dive into the history of OOP practically right off the bat.
thanks for posting. as much as i love the craft of building software solutions.... i love the history of software almost as much.
i find the historical accounts of the competition between the Apple 2 and the Commodore 64 to be fascinating. The Commodore 64 became a video game system with a keyboard while the Apple 2 became the first step towards the modern PC.
This reminds me that I should have some parts of my first PC ever - 256KB RAM, 4MB HDD. Need to check if I still have the mobo for it, and maybe I could get it to run (no longer have a CRT monitor but maybe I can somehow connect the newer ones to it). Maybe I'll even find some 5.25'' floppies! Wonder what could be on them... Back in the day I was too young and too noob to actually use them.
I'll wait a year before I start to promise to anyone that anything I make will work on a Mac.
On November 15 2020 12:44 WarSame wrote: The situation definitely doesn't sound easy, especially with a kid in the picture. I was privileged to be able to attend university full time without worrying about the finances, living with my parents (who are great to get along with), and getting to hang out with my friends and family and play sports. At the times in my life when that hasn't been the case, my grades have dropped, sometimes by 20%.
The lockdowns are frankly going to be huge mental health killers.
Generally speaking, i don't think this kind of comfortable, cozy environment you describe is the best for growing into a well-rounded professional software maker. I notice you're in Canada. The University of Waterloo produces top-notch software engineers by the truckload, and the environment is the opposite of the conditions you describe. Most students live in three or four cities over four years. They do not live with their parents, and they move a dozen times in four years.
On November 14 2020 06:45 Manit0u wrote: If you like hearing about some high-level concepts and would like to know more about the history of OOP there's a pretty neat presentation:
I like how the entire presentation went from talking about FP and turned into a deep dive into the history of OOP practically right off the bat.
thanks for posting. as much as i love the craft of building software solutions.... i love the history of software almost as much.
i find the historical accounts of the competition between the Apple 2 and the Commodore 64 to be fascinating. The Commodore 64 became a video game system with a keyboard while the Apple 2 became the first step towards the modern PC.
You might also enjoy this talk (delivered by an industry legend):
Black hole computers, git-torrent and other cool ideas.
I'll wait a year before I start to promise to anyone that anything I make will work on a Mac.
.Net programming has been an option for Linux and Mac developers for years via the open-source Mono platform. Mono is a community-supported implementation of the .Net Framework based on the ECMA standards for C# and the Common Language Runtime, and recently Microsoft has been collaborating directly with the Mono team on improvements. Furthermore, Microsoft just rolled out .Net Core 1.0, a streamlined, cross-platform version of the .Net Framework. This release also includes ASP.Net Core, which is Microsoft’s .Net web application platform, and Entity Framework Core, a .Net object-relational mapper that makes it easy to connect your .Net application to the database of your choice. These platforms are fully backed by Microsoft and receive regular updates, making it easy to build on and deploy to the operating system of your choice.
I mean, sure, WPF and such won't work on every system, but those things are platform specific after all. Didn't work that much with .Net but I remember it working with GTK and other stuff like that for cross-platform desktop GUI (first money I've ever made in programming was a C# desktop app that I wrote on Linux and it was running on Windows).
On November 18 2020 08:43 Manit0u wrote: I mean, sure, WPF and such won't work on every system, but those things are platform specific after all. Didn't work that much with .Net but I remember it working with GTK and other stuff like that for cross-platform desktop GUI (first money I've ever made in programming was a C# desktop app that I wrote on Linux and it was running on Windows).
i'd like my .NET desktop apps to work on a Mac. The customers i have using Linux use it as a server OS. Everyone uses Win10 or whatever the current macOS is for their desktop OS.
On November 15 2020 12:44 WarSame wrote: The situation definitely doesn't sound easy, especially with a kid in the picture. I was privileged to be able to attend university full time without worrying about the finances, living with my parents (who are great to get along with), and getting to hang out with my friends and family and play sports. At the times in my life when that hasn't been the case, my grades have dropped, sometimes by 20%.
The lockdowns are frankly going to be huge mental health killers.
Generally speaking, i don't think this kind of comfortable, cozy environment you describe is the best for growing into a well-rounded professional software maker. I notice you're in Canada. The University of Waterloo produces top-notch software engineers by the truckload, and the environment is the opposite of the conditions you describe. Most students live in three or four cities over four years. They do not live with their parents, and they move a dozen times in four years.
Why not? I don't see how moving many times and spending a lot of your time and effort doing so will make you into a better developer.