The Big Programming Thread - Page 1003

Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20)
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-13 22:34:48
March 13 2019 22:34 GMT
#20041
Christopher Bishop - Pattern Recognition and Machine Learning

I believe that this is regarded as one of the best. It's also hard (at least for me).
Manit0u
Profile Blog Joined August 2004
Poland17743 Posts
March 14 2019 15:15 GMT
#20042
Be Ruby developer.
Get assigned to a Scala project.
Remove some code.
Project now works fine on 2 machines instead of 8 with the same load.
Feel good.
Time is precious. Waste it wisely.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 14 2019 16:40 GMT
#20043
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before actually (not that specifically, that one is for looking at cryptocurrency stuff? I don't know about any of that stuff so if what I just said was dumb then forgive me). Anyways it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, the really hard graphs were difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.
Manit0u
Profile Blog Joined August 2004
Poland17743 Posts
March 14 2019 18:57 GMT
#20044
On March 15 2019 01:40 travis wrote:
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before actually (not that specifically, that one is for looking at cryptocurrency stuff? I don't know about any of that stuff so if what I just said was dumb then forgive me). Anyways it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, the really hard graphs were difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.


Well, those visualizations are actually for function calls and network systems. They could potentially work for anything, since functions within an application and network systems are basically graphs of a sort.
Time is precious. Waste it wisely.
WarSame
Profile Blog Joined February 2010
Canada1950 Posts
March 15 2019 03:19 GMT
#20045
I recently got myself pulled off an old, crappy system program and am now doing some Google Cloud Functions work. I am loving this!
Can it be I stayed away too long? Did you miss these rhymes while I was gone?
Mr. Wiggles
Profile Blog Joined August 2010
Canada5894 Posts
March 15 2019 03:52 GMT
#20046
On March 14 2019 03:34 SC-Shield wrote:
Could you please recommend a few nice books about Machine Learning? If they're about Reinforcement Learning, then it will be even better.

Here's the book I used when I took a course in reinforcement learning from Rich Sutton:

http://incompleteideas.net/book/the-book.html

It's available for free as a PDF. When we did the course he was still working on the second edition, so some stuff was missing, but it looks complete now. It was pretty good from what I remember, and was useful when I was refreshing myself on some concepts recently.

Rich is one of the 'fathers' of reinforcement learning and is currently leading the DeepMind office in Edmonton, so you can consider him a pretty authoritative source on RL.

https://deepmind.com/blog/deepmind-office-canada-edmonton/
you gotta dance
Manit0u
Profile Blog Joined August 2004
Poland17743 Posts
March 15 2019 11:45 GMT
#20047
[image loading]
Time is precious. Waste it wisely.
Manit0u
Profile Blog Joined August 2004
Poland17743 Posts
March 16 2019 11:54 GMT
#20048
Guys, I need your help. My friend wants to switch jobs and I hooked him up with an entry-level front-end position. For this he'll need to be able to do some basic web app in Angular or React.

Unfortunately I haven't touched front-end for 3 years and I wouldn't know what would be some of the best online resources to learn those technologies. Could you hook the brother up?

Any tips or hints on what to pay attention to, and what extra skills (besides Less or Sass) might be required, would be greatly appreciated. Should I also teach him something about NoSQL stuff like Mongo?
Time is precious. Waste it wisely.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 15:16:51
March 17 2019 15:16 GMT
#20049
neural network question for people with deep learning expertise

So, it's my intuition that what a neural network really does is basically find and weight correlations between (n choose k) inputs within the units of the neural network.

Which makes me wonder: for many networks, is relu really a good choice of activation function? Because, if the above is correct, doesn't relu only find POSITIVE correlations between inputs? For example, if we have inputs A, B, C, D and are trying to classify a cat, a network using relu can find that A = .5 AND B = .5 together may be more significant towards our data being a cat than the individual weightings of A = .5 and B = .5 separately. However, relu should *not* be able to find a negative correlation, that is to say: maybe A = .5 increases the likelihood of our data being a cat, and C = .5 increases the likelihood of our data being a cat, but A = .5 AND C = .5 together makes it LESS likely that our data is a cat.

Am I right that relu cannot find that last type of correlation, and you would need something like leaky relu to find that kind of correlation in data?
Equalizer
Profile Joined April 2010
Canada115 Posts
March 17 2019 17:19 GMT
#20050
This requires more than one layer: passing the output of ReLu through a negative-weight edge allows the second layer to not activate when both A and C are active, and to activate when either A or C is active.
The person who says it cannot be done, should not interrupt the person doing it.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 17:30:13
March 17 2019 17:29 GMT
#20051
But in the end, it can never value the correlation as negative; at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way: your network can't truly punish this negative correlation, it can only "not reward" it.


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean... I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and I am flat-out wrong in my conceptualization?
Simberto
Profile Blog Joined July 2010
Germany11826 Posts
March 17 2019 17:39 GMT
#20052
Correlations mathematically range from -1 to 1.

a correlation of -1 between A and B means that if A, then never B (and if B, then never A)
a correlation of 0 means that A doesn't influence whether B or not B
a correlation of 1 means that if A, then B (and if B, then A)

all other values are in between.

I don't know too much about programming though.
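As an aside, the range Simberto describes is exactly that of the Pearson correlation coefficient. A minimal pure-Python sketch (the helper name `pearson` and the sample data are illustrative, not from the thread):

```python
# Pearson correlation: covariance of x and y divided by the product of
# their standard deviations. Result is always in [-1, 1].
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

a = [1, 2, 3, 4, 5]
print(pearson(a, [2, 4, 6, 8, 10]))   # perfectly aligned: ~1.0
print(pearson(a, [10, 8, 6, 4, 2]))   # perfectly opposed: ~-1.0
```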
Acrofales
Profile Joined August 2010
Spain18286 Posts
March 17 2019 18:29 GMT
#20053
On March 18 2019 02:29 travis wrote:
But in the end, it can never value the correlation as negative; at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way: your network can't truly punish this negative correlation, it can only "not reward" it.


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean... I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and I am flat-out wrong in my conceptualization?

You just need multiple layers as Equalizer stated. I haven't actually read this blog, but the solution appears to be right, so I assume it's on-point for solving XOR with perceptrons:

https://towardsdatascience.com/perceptrons-logical-functions-and-the-xor-problem-37ca5025790a
zatic
Profile Blog Joined September 2007
Zurich15365 Posts
March 17 2019 23:22 GMT
#20054
I have tried to put this in words, but my attempts at explaining have all been crap haha. But I am still learning ML myself.

travis, in the end it comes down to what Equalizer said: a negative weight on the edge to the next layer does the job. And it doesn't need a lot of layers and units. Maybe just try it out? You can build your problem (essentially XOR) with 2 layers (input and output layer), 2 nodes, and fit it to 1.0 accuracy. If you print out the weights and biases, you will see that one edge is positive and one is negative.

Conceptually, it's not the activation function that does the regression. You can try above with any activation function, including one that can return negative values, and it will turn out the same.
Moderator | I know Teamliquid is known as a massive building
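zatic's suggested experiment can be sketched by hand. Rather than actually fitting a model (XOR training runs are seed-sensitive), here is a tiny ReLU network with hand-set weights that computes XOR exactly; note that, as zatic predicts, one second-layer weight is positive and one is negative. The weights are illustrative, not fitted values:

```python
# A 2-input, 2-hidden-unit ReLU network that computes XOR.
def relu(x):
    return max(x, 0.0)

def forward(a, b):
    # hidden layer
    h1 = relu(a + b)          # fires when either input is on
    h2 = relu(a + b - 1.0)    # fires only when both inputs are on
    # output layer: positive weight on h1, negative weight on h2
    return 1.0 * h1 - 2.0 * h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", forward(a, b))  # -> 0.0, 1.0, 1.0, 0.0
```

The negative weight on h2 is what lets the network "punish" the case where both inputs are active, even though every hidden activation itself is non-negative.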
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 23:48:44
March 17 2019 23:42 GMT
#20055
Well, the problem isn't really equivalent to an XOR, right? (I know that we can make an XOR network.) In the example I gave it is, but that's because I gave a lazy explanation. What I really just meant was: A has a positive effect on an outcome of cat, B has a positive effect on an outcome of cat, but (A and B) has a negative effect on an outcome of cat. XOR simplifies this, but as the relationships become more and more complicated, it would seem that that simplification would require a further and further expanded network, because relu will just kill every unit that investigates (A and B) heavily, rather than ever actually providing a negative value. If we simply provided a negative output for (A and B) from one of our units, then we could capture that information in the very first hidden layer, rather than having to rely on the summations of previous layers which had a 0 output because they investigated (A and B) heavily.

Anyways, I will just take everyone's word for it that relu works efficiently for this. I do know that most papers say the evidence points to there not being much advantage to leaky relu, other than for addressing the issue of units that no longer learn.
zatic
Profile Blog Joined September 2007
Zurich15365 Posts
March 17 2019 23:49 GMT
#20056
It took like 10 years for a consensus to build that relu works better than other activations. To quote a lecture I heard on deep learning some years ago: "We use relu because it works better. Why does it work better? We don't know. If someone tells you that they know, they are lying."
Moderator | I know Teamliquid is known as a massive building
Equalizer
Profile Joined April 2010
Canada115 Posts
March 18 2019 03:17 GMT
#20057
@travis I do not see why what I mentioned doesn't accomplish what you describe. Having a negative weight is equivalent to having a unit that produces a negative output.

Consider the following,
output = w_1 * A + w_2 * B + w_3 * max(A + B - 1,0) + bias
where A,B in [0,1]

By setting w_3 to an appropriate negative weight, you can apply whatever negative contribution from the event of A and B to the output is needed.

Note: max(A + B - 1, 0) is just ReLu with weights of 1 and a bias of -1.
The person who says it cannot be done, should not interrupt the person doing it.
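Equalizer's expression can be checked numerically. In this sketch the weights are illustrative choices (w_1 = w_2 = 1, w_3 = -2, bias = 0), not values from the thread:

```python
# output = w1*A + w2*B + w3*max(A + B - 1, 0) + bias, with A, B in [0, 1].
def output(A, B, w1=1.0, w2=1.0, w3=-2.0, bias=0.0):
    both_high = max(A + B - 1.0, 0.0)   # ReLU detector for "A and B together"
    return w1 * A + w2 * B + w3 * both_high + bias

print(output(0.75, 0.0))   # A alone: 0.75, the detector stays at 0
print(output(0.75, 0.75))  # both high: 1.5 - 2*0.5 = 0.5, penalty applies
```

So the penalty for "A and B together" only activates once A + B exceeds 1, and w_3 controls how harshly it is punished.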
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 18 2019 23:07 GMT
#20058
You're right, I'm dumb, you can. Thank you!
Manit0u
Profile Blog Joined August 2004
Poland17743 Posts
March 19 2019 21:03 GMT
#20059
Time is precious. Waste it wisely.
JimmyJRaynor
Profile Blog Joined April 2010
Canada17491 Posts
Last Edited: 2019-03-20 19:24:53
March 20 2019 19:18 GMT
#20060
On February 21 2019 12:21 Manit0u wrote:
Screw math and ML.

to your point...
In general, "Machine Learning" is being oversold by its proponents.

"I apparently have a bit of a reputation as someone who is anti-machine learning or anti-AI when it comes to human research. This is a bit of misrepresentation of my views, and (I'd argue) a misrepresentation of the issues "statistics people" take with AI/ML as a whole."

"I personally think that AI/ML has a lot to bring to the table to enhance science, health and human performance. The problem is that the AI/ML crowd are over-selling their wares and often being disingenuous about what is current state-of-the art"

"Issue 1: CLAIMING EVERYTHING IS MACHINE LEARNING. Just because AI/ML may use algebra or linear regression, doesn't mean it is AI/ML. Same goes for Nonlinear regression, correlation, logistic regression, or everything else that IS STATISTICS (or information theory, etc.) "

"It's cool if you use statistics and statistical concepts properly. Really, we're a big-tent kind of people. Just don't claim you invented something you clearly did not. And no, stringing together multiple correlations in an automated way doesn't make it extra special. "

"Issue 2: OMG THE HYPE MACHINE, MAKE IT STOP. Again, a lot of really good stuff is being trialed with AI/ML. You don't need to oversell the genuinely good work and advances being pioneered. Here's the thing, most "statistics people" are allergic to hype."

"Many human research statisticians work in areas of health where people can die or receive in appropriate treatments if we do our job wrong. It isn't to say we're perfect, but we work hard to be conservative and criticize our models so we're confident in the results."

"This is, I think, the main reason statisticians have issues with the AI/ML crowd: we can smell snake oil. The really good and avant garde AI/ML work gets lumped in with the utter nonsense directed at VC's and the pop media."

"Issue 3: THE SNAKE OIL IS SPREADING (tweet #7) Again, there is good AI/ML work being done, but most of it is just re-branded statistics or 'stuff' hiding behind the term "proprietary". This snake oil is leaking into government, academia, etc in an attempt to be 'cool'"

"We're seeing this salesmanship more-and-more outside of traditional AI/ML technology circles. There are conference presentations or academic papers that call things like principle component analysis AI/ML... it was invented in 1901! https://en.wikipedia.org/wiki/Principal_component_analysis … "

"Finally, many "stats people" are interested in applying AI/ML techniques and seeing where it can compliment our backgrounds and current work. We just get turned off by the bravado, hype and salesmanship that accompanies AI/ML. So yeah, we'll keep giving it a hard time. "

Ray Kassar To David Crane : "you're no more important to Atari than the factory workers assembling the cartridges"