The Big Programming Thread - Page 1003

Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20)
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-13 22:34:48
March 13 2019 22:34 GMT
#20041
Christopher Bishop - Pattern Recognition and Machine Learning

I believe this is regarded as one of the best. It's also hard (at least for me).
Manit0u
Profile Blog Joined August 2004
Poland17614 Posts
March 14 2019 15:15 GMT
#20042
Be Ruby developer.
Get assigned to a Scala project.
Remove some code.
Project now works fine on 2 machines instead of 8 with the same load.
Feel good.
Time is precious. Waste it wisely.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 14 2019 16:40 GMT
#20043
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before actually (not that specifically, that one is for looking at cryptocurrency stuff? I don't know about any of that stuff so if what I just said was dumb then forgive me). Anyways it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, the really hard graphs were difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.
Manit0u
Profile Blog Joined August 2004
Poland17614 Posts
March 14 2019 18:57 GMT
#20044
On March 15 2019 01:40 travis wrote:
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before actually (not that specifically, that one is for looking at cryptocurrency stuff? I don't know about any of that stuff so if what I just said was dumb then forgive me). Anyways it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, the really hard graphs were difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.


Well, those visualizations are actually for function calls and network systems. They could potentially work for anything, since functions within an application and network systems are basically graphs of a sort.
Time is precious. Waste it wisely.
WarSame
Profile Blog Joined February 2010
Canada1950 Posts
March 15 2019 03:19 GMT
#20045
I recently got myself pulled off an old, crappy system program and am now doing some Google Cloud Functions work. I am loving this!
Can it be I stayed away too long? Did you miss these rhymes while I was gone?
Mr. Wiggles
Profile Blog Joined August 2010
Canada5894 Posts
March 15 2019 03:52 GMT
#20046
On March 14 2019 03:34 SC-Shield wrote:
Could you please recommend a few nice books about Machine Learning? If they're about Reinforcement Learning, then it will be even better.

Here's the book I used when I took a course in reinforcement learning from Rich Sutton:

http://incompleteideas.net/book/the-book.html

It's available for free as a PDF. When we did the course he was still working on the second edition, so some stuff was missing, but it looks complete now. It was pretty good from what I remember, and was useful when I was refreshing myself on some concepts recently.

Rich is one of the 'fathers' of reinforcement learning and is currently leading the DeepMind office in Edmonton, so you can consider him a pretty authoritative source on RL.

https://deepmind.com/blog/deepmind-office-canada-edmonton/
you gotta dance
Manit0u
Profile Blog Joined August 2004
Poland17614 Posts
March 15 2019 11:45 GMT
#20047
[image loading]
Time is precious. Waste it wisely.
Manit0u
Profile Blog Joined August 2004
Poland17614 Posts
March 16 2019 11:54 GMT
#20048
Guys, I need your help. My friend wants to switch jobs and I hooked him up with an entry-level front-end position. For this he'll need to be able to build a basic web app in Angular or React.

Unfortunately I haven't touched front-end work in 3 years, and I don't know what the best online resources for learning those technologies would be. Could you hook the brother up?

Any tips or hints on what to pay attention to, and what extra skills (besides Less or Sass) might be required, would be greatly appreciated. Should I also teach him a bit about NoSQL stuff like Mongo?
Time is precious. Waste it wisely.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 15:16:51
March 17 2019 15:16 GMT
#20049
Neural network question for people with deep learning expertise

So, my intuition is that what a neural network really does is basically find and weight correlations between (n choose k) inputs within the units of the network.

Which makes me wonder: for many networks, is ReLU really a good choice of activation function? Because, if the above is correct, doesn't ReLU only find POSITIVE correlations between inputs? For example, if we have inputs A, B, C, D and are trying to classify a cat, a network using ReLU can find that A = .5 AND B = .5 together may be more significant evidence of a cat than the individual contributions of A = .5 and B = .5 taken separately. However, ReLU should *not* be able to find a negative correlation. That is to say, maybe A = .5 increases the likelihood of our data being a cat, and C = .5 increases the likelihood of our data being a cat, but A = .5 AND C = .5 together makes it LESS likely that our data is a cat.

Am I right that ReLU cannot find that last type of correlation, and that you would need something like leaky ReLU to find it?
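To make the question concrete, here is a minimal sketch (all names and weights are illustrative, not from any library) of the property being asked about: a single ReLU unit clips its pre-activation at zero, so on its own it can only ever output values >= 0 and cannot emit a negative "penalty" for a feature pair.

```python
def relu(x):
    return max(0.0, x)

def unit(inputs, weights, bias):
    """One neuron: weighted sum of inputs, then ReLU."""
    pre_activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return relu(pre_activation)

# A unit that "detects" A and C both being high:
# A + C - 1 is positive only when both are roughly >= 0.5.
both_high = unit([0.9, 0.9], [1.0, 1.0], -1.0)  # fires (about 0.8)
one_high  = unit([0.9, 0.1], [1.0, 1.0], -1.0)  # clipped to 0.0

print(both_high, one_high)
```

Note the unit's output is never negative; as the replies below discuss, the negative contribution has to come from the weight on the *next* layer's edge.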
Equalizer
Profile Joined April 2010
Canada115 Posts
March 17 2019 17:19 GMT
#20050
It requires more than one layer: passing the output of a ReLU through a negative-weight edge allows the second layer to not activate when both A and C are active, and to activate when either A or C alone is active.
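Equalizer's point can be sketched in a couple of lines (the specific weights below are made up for the example): a hidden ReLU unit detects "A and C both high", and the negative weight on its outgoing edge lets the next layer subtract for that pair, even though every individual activation is nonnegative.

```python
def relu(x):
    return max(0.0, x)

def score(a, c):
    h_a    = relu(a)             # passes A through
    h_c    = relu(c)             # passes C through
    h_both = relu(a + c - 1.0)   # ~0 unless both inputs are high
    # Negative weight on h_both: the pair is punished even though
    # each hidden activation is >= 0.
    return 1.0 * h_a + 1.0 * h_c - 3.0 * h_both

print(score(0.9, 0.0))  # one feature alone supports "cat"
print(score(0.9, 0.9))  # both together are punished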
The person who says it cannot be done, should not interrupt the person doing it.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 17:30:13
March 17 2019 17:29 GMT
#20051
But in the end, it can never value the correlation as negative, at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way - your network can't truly punish this negative correlation - it can only "not reward it".


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean... I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and I'm flat-out wrong in my conceptualization?
Simberto
Profile Blog Joined July 2010
Germany11722 Posts
March 17 2019 17:39 GMT
#20052
Correlations mathematically range between -1 and 1.

A correlation of -1 between A and B means that if A, then never B (and if B, then never A).
A correlation of 0 means there is no linear relationship between A and B (which is weaker than saying they are independent).
A correlation of 1 means that if A, then B (and if B, then A).

All other values are in between.

I don't know too much about programming though.
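Those ranges are easy to check numerically with a hand-rolled Pearson correlation (stdlib only; the data vectors are arbitrary examples):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

a = [0, 1, 2, 3]
print(pearson(a, [3, 2, 1, 0]))  # -1.0: perfectly anti-correlated
print(pearson(a, [0, 1, 2, 3]))  #  1.0: perfectly correlated
print(pearson(a, [1, 0, 0, 1]))  #  0.0: no linear relationship
```

(Python 3.10+ also ships `statistics.correlation`, which computes the same quantity.)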
Acrofales
Profile Joined August 2010
Spain18194 Posts
March 17 2019 18:29 GMT
#20053
On March 18 2019 02:29 travis wrote:
But in the end, it can never value the correlation as negative, at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way - your network can't truly punish this negative correlation - it can only "not reward it".


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean... I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and I'm flat-out wrong in my conceptualization?

You just need multiple layers, as Equalizer stated. I haven't actually read this blog, but the solution appears to be right, so I assume it's on point for solving XOR with perceptrons:

https://towardsdatascience.com/perceptrons-logical-functions-and-the-xor-problem-37ca5025790a
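Without vouching for the article, here is one standard hardcoded-weight construction of XOR from threshold units (these particular weights are just one of many valid choices, not necessarily the article's): a hidden unit computes AND, and a negative weight on it lets the output unit reject the (1, 1) case.

```python
def step(x):
    """Threshold activation: 1 if the pre-activation is positive, else 0."""
    return 1 if x > 0 else 0

def xor(a, b):
    h_and = step(a + b - 1.5)             # hidden unit: fires only on (1, 1)
    return step(a + b - 2 * h_and - 0.5)  # negative weight on the AND unit

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))
```

This is exactly the two-layer escape hatch discussed above: the single-layer sum rewards each input alone, and the hidden AND detector's negative outgoing weight punishes the pair.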
zatic
Profile Blog Joined September 2007
Zurich15359 Posts
March 17 2019 23:22 GMT
#20054
I have tried to put this in words, but my attempts at explaining have all been crap, haha. I am still learning ML myself.

travis, in the end it comes down to what Equalizer said: a negative weight on the edge to the next layer does the job. And it doesn't need a lot of layers and units. Maybe just try it out? You can build your problem (essentially XOR) with 2 layers (an input and an output layer) and 2 nodes, and fit it to 1.0 accuracy. If you print out the weights and biases, you will see that one edge is positive and one is negative.

Conceptually, it's not the activation function that does the regression. You can try the above with any activation function, including one that can return negative values, and it will turn out the same.
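The "just try it out" suggestion can be sketched in plain Python with hand-rolled gradient descent. This is a rough illustration, not a recipe: the architecture (2 sigmoid hidden units, 1 sigmoid output), initial weights, learning rate, and iteration count are all arbitrary choices, and a tiny XOR net is not guaranteed to reach perfect accuracy from every initialization, so the sketch only demonstrates the loss falling and lets you inspect the fitted weights.

```python
import math

def sig(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 2 hidden sigmoid units -> 1 sigmoid output
w1 = [[0.5, -0.5], [-0.5, 0.5]]   # asymmetric init to break symmetry
b1 = [0.0, 0.0]
w2 = [0.5, -0.5]
b2 = 0.0

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(x):
    h = [sig(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(2)]
    return h, sig(w2[0] * h[0] + w2[1] * h[1] + b2)

def loss():
    return sum((forward(x)[1] - y) ** 2 for x, y in data)

start = loss()
lr = 0.3
for _ in range(20000):
    # accumulate full-batch gradients of the squared error
    g_w1 = [[0.0, 0.0], [0.0, 0.0]]; g_b1 = [0.0, 0.0]
    g_w2 = [0.0, 0.0]; g_b2 = 0.0
    for x, y in data:
        h, out = forward(x)
        d_out = 2 * (out - y) * out * (1 - out)
        g_b2 += d_out
        for j in range(2):
            g_w2[j] += d_out * h[j]
            d_h = d_out * w2[j] * h[j] * (1 - h[j])
            g_b1[j] += d_h
            for i in range(2):
                g_w1[j][i] += d_h * x[i]
    for j in range(2):
        w2[j] -= lr * g_w2[j]
        b1[j] -= lr * g_b1[j]
        for i in range(2):
            w1[j][i] -= lr * g_w1[j][i]
    b2 -= lr * g_b2

print(start, loss(), w2)  # inspect the signs of the two output-layer weights
```

If the fit succeeds, printing `w2` typically shows the pattern zatic describes: one positive and one negative output-layer weight.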
Moderator | I know Teamliquid is known as a massive building
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 23:48:44
March 17 2019 23:42 GMT
#20055
Well, the problem isn't really equivalent to an XOR, right? (I know that we can make an XOR network.) In the example I gave it is, but that's because I gave a lazy explanation. What I really meant was: A has a positive effect on an outcome of cat, B has a positive effect on an outcome of cat, but (A and B) has a negative effect on an outcome of cat. XOR simplifies this, but as the relationships become more and more complicated, it would seem that the simplification would require a further and further expanded network, because ReLU will just kill every unit that investigates (A and B) heavily, rather than ever actually providing a negative value. If we simply provided a negative output for (A and B) from one of our units, then we could capture that information in the very first hidden layer, rather than having to rely on the summations of previous layers which had a 0 output because they investigated (A and B) heavily.

Anyway, I'll take everyone's word for it that ReLU works efficiently for this. I do know that most papers say the evidence points to there not being much advantage to leaky ReLU, other than addressing the issue of dead units that no longer learn.
zatic
Profile Blog Joined September 2007
Zurich15359 Posts
March 17 2019 23:49 GMT
#20056
It took like 10 years for a consensus to build that ReLU works better than other activations. To quote a lecture I heard on deep learning some years ago: "We use ReLU because it works better. Why does it work better? We don't know. If someone tells you that they know, they are lying."
Moderator | I know Teamliquid is known as a massive building
Equalizer
Profile Joined April 2010
Canada115 Posts
March 18 2019 03:17 GMT
#20057
@travis I do not see why what I mentioned doesn't accomplish what you describe. Having a negative weight is equivalent to having a unit that produces a negative output.

Consider the following,
output = w_1 * A + w_2 * B + w_3 * max(A + B - 1,0) + bias
where A,B in [0,1]

By setting w_3 as an appropriate negative weight you can apply any negative contribution from the event of A and B to the output that is needed.

Note: max(A + B - 1, 0) is just ReLU with weights of 1 and a bias of -1.
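Equalizer's expression can be evaluated directly. The weight values below are illustrative choices for the sketch (the post only requires w_3 to be "an appropriate negative weight"):

```python
def relu(x):
    return max(0.0, x)

def output(a, b, w1=1.0, w2=1.0, w3=-4.0, bias=0.0):
    # w3 * max(A + B - 1, 0): the ReLU term fires only when both A and B
    # are high, and the negative w3 turns that into a penalty.
    return w1 * a + w2 * b + w3 * relu(a + b - 1.0) + bias

print(output(1.0, 0.0))  # 1.0: one feature alone contributes positively
print(output(0.0, 1.0))  # 1.0
print(output(1.0, 1.0))  # -2.0: together they are punished
```

So the network never needs a unit that outputs a negative activation; the negative contribution lives entirely in the weight on the ReLU detector's output.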
The person who says it cannot be done, should not interrupt the person doing it.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 18 2019 23:07 GMT
#20058
You're right, I'm dumb, you can. Thank you!
Manit0u
Profile Blog Joined August 2004
Poland17614 Posts
March 19 2019 21:03 GMT
#20059
Time is precious. Waste it wisely.
JimmyJRaynor
Profile Blog Joined April 2010
Canada17204 Posts
Last Edited: 2019-03-20 19:24:53
March 20 2019 19:18 GMT
#20060
On February 21 2019 12:21 Manit0u wrote:
Screw math and ML.

to your point...
In general, "Machine Learning" is being oversold by its proponents.

"I apparently have a bit of a reputation as someone who is anti-machine learning or anti-AI when it comes to human research. This is a bit of misrepresentation of my views, and (I'd argue) a misrepresentation of the issues "statistics people" take with AI/ML as a whole."

"I personally think that AI/ML has a lot to bring to the table to enhance science, health and human performance. The problem is that the AI/ML crowd are over-selling their wares and often being disingenuous about what is current state-of-the art"

"Issue 1: CLAIMING EVERYTHING IS MACHINE LEARNING. Just because AI/ML may use algebra or linear regression, doesn't mean it is AI/ML. Same goes for Nonlinear regression, correlation, logistic regression, or everything else that IS STATISTICS (or information theory, etc.) "

"It's cool if you use statistics and statistical concepts properly. Really, we're a big-tent kind of people. Just don't claim you invented something you clearly did not. And no, stringing together multiple correlations in an automated way doesn't make it extra special. "

"Issue 2: OMG THE HYPE MACHINE, MAKE IT STOP. Again, a lot of really good stuff is being trialed with AI/ML. You don't need to oversell the genuinely good work and advances being pioneered. Here's the thing, most "statistics people" are allergic to hype."

"Many human research statisticians work in areas of health where people can die or receive inappropriate treatments if we do our job wrong. It isn't to say we're perfect, but we work hard to be conservative and criticize our models so we're confident in the results."

"This is, I think, the main reason statisticians have issues with the AI/ML crowd: we can smell snake oil. The really good and avant garde AI/ML work gets lumped in with the utter nonsense directed at VC's and the pop media."

"Issue 3: THE SNAKE OIL IS SPREADING (tweet #7) Again, there is good AI/ML work being done, but most of it is just re-branded statistics or 'stuff' hiding behind the term "proprietary". This snake oil is leaking into government, academia, etc in an attempt to be 'cool'"

"We're seeing this salesmanship more-and-more outside of traditional AI/ML technology circles. There are conference presentations or academic papers that call things like principal component analysis AI/ML... it was invented in 1901! https://en.wikipedia.org/wiki/Principal_component_analysis … "

"Finally, many "stats people" are interested in applying AI/ML techniques and seeing where they can complement our backgrounds and current work. We just get turned off by the bravado, hype and salesmanship that accompanies AI/ML. So yeah, we'll keep giving it a hard time."

Ray Kassar To David Crane : "you're no more important to Atari than the factory workers assembling the cartridges"