The Big Programming Thread - Page 1003

Forum Index > General Forum
Thread Rules
1. This is not a "do my homework for me" thread. If you have specific questions, ask, but don't post an assignment or homework problem and expect an exact solution.
2. No recruiting for your cockamamie projects (you won't replace facebook with 3 dudes you found on the internet and $20)
3. If you can't articulate why a language is bad, don't start slinging shit about it. Just remember that nothing is worse than making CSS IE6 compatible.
4. Use [code] tags to format code blocks.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-13 22:34:48
March 13 2019 22:34 GMT
#20041
Christopher Bishop - Pattern Recognition and Machine Learning

I believe this is regarded as one of the best. It's also hard (at least for me).
Manit0u
Profile Blog Joined August 2004
Poland17693 Posts
March 14 2019 15:15 GMT
#20042
Be Ruby developer.
Get assigned to a Scala project.
Remove some code.
Project now works fine on 2 machines instead of 8 with the same load.
Feel good.
Time is precious. Waste it wisely.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 14 2019 16:40 GMT
#20043
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before, actually (not that specifically; that one is for looking at cryptocurrency stuff? I don't know about any of that, so if what I just said was dumb then forgive me). Anyways, it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, for the really hard graphs it was difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.
Manit0u
Profile Blog Joined August 2004
Poland17693 Posts
March 14 2019 18:57 GMT
#20044
On March 15 2019 01:40 travis wrote:
Manit0u I just realized I never replied to your post about that graph visualizer.

I have used something like that before, actually (not that specifically; that one is for looking at cryptocurrency stuff? I don't know about any of that, so if what I just said was dumb then forgive me). Anyways, it was very, very interesting to see some of the intuition I had about some of the graphs I was working with come to life in a visual way. However, for the really hard graphs it was difficult to even tell what was going on when their relationships were inspected visually. They mostly come out looking like repeated intertwined rings of varying lengths.


Well, those visualizations are actually for function calls and network systems. They could potentially work for anything, since functions within an application and network systems are basically graphs of sorts.
WarSame
Profile Blog Joined February 2010
Canada1950 Posts
March 15 2019 03:19 GMT
#20045
I recently got myself pulled off an old, crappy system program and am now doing some Google Cloud Functions work. I am loving this!
Can it be I stayed away too long? Did you miss these rhymes while I was gone?
Mr. Wiggles
Profile Blog Joined August 2010
Canada5894 Posts
March 15 2019 03:52 GMT
#20046
On March 14 2019 03:34 SC-Shield wrote:
Could you please recommend a few nice books about Machine Learning? If they're about Reinforcement Learning, then it will be even better.

Here's the book I used when I took a course in reinforcement learning from Rich Sutton:

http://incompleteideas.net/book/the-book.html

It's available for free as a PDF. When we did the course he was still working on the second edition, so some stuff was missing, but it looks complete now. It was pretty good from what I remember, and was useful when I was refreshing myself on some concepts recently.

Rich is one of the 'fathers' of reinforcement learning and is currently leading the DeepMind office in Edmonton, so you can consider him a pretty authoritative source on RL.

https://deepmind.com/blog/deepmind-office-canada-edmonton/
you gotta dance
Manit0u
Profile Blog Joined August 2004
Poland17693 Posts
March 16 2019 11:54 GMT
#20048
Guys, I need your help. My friend wants to switch jobs and I hooked him up with an entry-level front-end position. For this he'll need to be able to build a basic web app in Angular or React.

Unfortunately I haven't touched front-end for 3 years and I wouldn't know what would be some of the best online resources to learn those technologies. Could you hook the brother up?

Any tips or hints on what to pay attention to, and what extra skills (besides Less or Sass) might be required, would be greatly appreciated. Should I also teach him a bit about NoSQL stuff like Mongo?
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 15:16:51
March 17 2019 15:16 GMT
#20049
neural network question for people with deep learning expertise

So, it's my intuition that what a neural network really does is basically find and weight correlations between (n choose k) inputs within the units of the neural network.

Which makes me wonder: for many networks, is ReLU really a good choice of activation function? Because, if the above is correct, then doesn't ReLU only find POSITIVE correlations between inputs? For example, if we have inputs A, B, C, D, and are trying to classify a cat, a network using ReLU can find that (A = .5 AND B = .5) may be more significant towards our data being a cat than the individual weightings of A = .5 and B = .5 on their own. However, ReLU should *not* be able to find a negative correlation, that is to say: maybe A = .5 increases the likelihood of our data being a cat, and C = .5 increases the likelihood of our data being a cat, but (A = .5 AND C = .5) makes it LESS likely that our data is a cat.

Am I right that relu cannot find that last type of correlation, and you would need something like leaky relu to find that kind of correlation in data?
Equalizer
Profile Joined April 2010
Canada115 Posts
March 17 2019 17:19 GMT
#20050
It requires more than 1 layer; passing the output of ReLU through a negative-weight edge will allow the second layer to not activate when both A and C are active, and to activate when either A or C is active.
The person who says it cannot be done, should not interrupt the person doing it.
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 17:30:13
March 17 2019 17:29 GMT
#20051
But in the end, it can never value the correlation as negative, at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way - your network can't truly punish this negative correlation - it can only "not reward it".


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean.. I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and am I flat out wrong in my conceptualization?
Simberto
Profile Blog Joined July 2010
Germany11774 Posts
March 17 2019 17:39 GMT
#20052
Correlations mathematically range between -1 and 1.

a correlation of -1 between A and B means that if A, then never B (and if B, then never A)
a correlation of 0 means that A doesn't influence whether B or not B
a correlation of 1 means that if A, then B (and if B, then A)

all other values are in between.

I don't know too much about programming though.
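Simberto's point can be checked numerically. The `pearson` helper below is my own illustration (not from any post above): two binary signals that are never on at the same time come out at exactly -1, while a signal compared with itself comes out at exactly 1.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Covariance numerator and the two standard-deviation-like norms.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    nx = sum((x - mx) ** 2 for x in xs) ** 0.5
    ny = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (nx * ny)

a = [1, 0, 1, 0]
b = [0, 1, 0, 1]   # b is on exactly when a is off

print(pearson(a, a))   # 1.0  (perfect positive correlation)
print(pearson(a, b))   # -1.0 (perfect negative correlation)
```

This is the "if A, then never B" case from the post above: the -1 says the two events exclude each other perfectly.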
Acrofales
Profile Joined August 2010
Spain18234 Posts
March 17 2019 18:29 GMT
#20053
On March 18 2019 02:29 travis wrote:
But in the end, it can never value the correlation as negative, at best it can value it at zero, right? So it can't fully capture the relationship in the most accurate way - your network can't truly punish this negative correlation - it can only "not reward it".


Let's take a network with 26 inputs: A....Z
If 100% of the time that A >= .5, and C >=.5: it is not a cat.
But 100% of the rest of the time A >= .5, or C >=.5: it is a cat

It's not realistically ever going to be able to learn this rule exactly, right? Even if some later unit doesn't fire because it solved {A >= .5, C>=.5} ---> not a cat, this won't prevent it from thinking it is a cat based on the other inputs.


I mean.. I suppose I could see it eventually solving that relationship. But then the number of layers and units would need to be incredibly huge.

Or am I overcomplicating this, and am I flat out wrong in my conceptualization?

You just need multiple layers as Equalizer stated. I haven't actually read this blog, but the solution appears to be right, so I assume it's on-point for solving XOR with perceptrons:

https://towardsdatascience.com/perceptrons-logical-functions-and-the-xor-problem-37ca5025790a
zatic
Profile Blog Joined September 2007
Zurich15363 Posts
March 17 2019 23:22 GMT
#20054
I have tried to put this in words but my attempts at explaining have all been crap, haha. But I am still learning ML myself.

travis, in the end it comes down to what Equalizer said: a negative weight on the edge to the next layer does the job. And it doesn't need a lot of layers and units. Maybe just try it out? You can build your problem (essentially XOR) with 2 layers (input and output layer), 2 nodes, and fit it to 1.0 accuracy. If you print out the weights and biases you will see that one edge is positive and one is negative.

Conceptually, it's not the activation function that does the regression. You can try above with any activation function, including one that can return negative values, and it will turn out the same.
Moderator | I know Teamliquid is known as a massive building
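zatic's "maybe just try it out" can be sketched without any ML library at all. The network below is my own hand-built illustration (the weights are set by hand rather than learned, and I use two hidden ReLU units plus a linear output rather than the exact 2-node setup zatic describes), but it shows the mechanism Equalizer named: a negative weight on the edge out of the "both inputs active" unit lets a ReLU network represent XOR.

```python
def relu(x):
    return max(x, 0.0)

def xor_net(a, b):
    # Hidden layer: two ReLU units over the same inputs.
    h1 = relu(a + b)        # fires when either input is on
    h2 = relu(a + b - 1.0)  # fires only when BOTH inputs are on
    # Output layer: the negative weight on h2 "punishes" the A-and-B case.
    return h1 - 2.0 * h2

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, "->", xor_net(a, b))  # 0, 1, 1, 0
```

The hidden unit h2 itself can only output zero or a positive number, exactly as travis says; it is the negative edge weight (-2.0) into the next layer that turns its activation into negative evidence.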
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
Last Edited: 2019-03-17 23:48:44
March 17 2019 23:42 GMT
#20055
Well, the problem isn't really equivalent to an XOR, right? (I know that we can make an XOR network.) In the example I gave it is, but that's because I gave a lazy explanation. What I really meant was: A has a positive effect on an outcome of cat, B has a positive effect on an outcome of cat, but (A and B) has a negative effect on an outcome of cat. XOR simplifies this, but as the relationships become more and more complicated, it would seem that that simplification would require a further and further expanded network, because ReLU will just kill every unit that investigates (A and B) heavily, rather than ever actually providing a negative value. If we simply provided a negative output for (A and B) from one of our units, then we could capture that information in the very first hidden layer, rather than having to rely on the summations of previous layers which had a 0 output because they investigated (A and B) heavily.

Anyways, I will just take everyone's word for it that ReLU works efficiently for this. I do know that most papers say the evidence points to there not being much advantage to leaky ReLU other than addressing the issue of units that stop learning.
zatic
Profile Blog Joined September 2007
Zurich15363 Posts
March 17 2019 23:49 GMT
#20056
It took like 10 years for a consensus to build that relu works better than other activations. To quote a lecture I heard on deep learning some years ago: "We use relu because it works better. Why does it work better? We don't know. If someone tells you that they know, they are lying."
Equalizer
Profile Joined April 2010
Canada115 Posts
March 18 2019 03:17 GMT
#20057
@travis I do not see why what I mentioned doesn't accomplish what you describe. Having a negative weight is equivalent to having a unit produce a negative output.

Consider the following,
output = w_1 * A + w_2 * B + w_3 * max(A + B - 1,0) + bias
where A,B in [0,1]

By setting w_3 as an appropriate negative weight you can apply any negative contribution from the event of A and B to the output that is needed.

Note: max(A + B - 1,0) is just ReLu with weights of 1 and bias of -1.
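Plugging a few values into Equalizer's expression makes the effect concrete. The specific weights below (w1 = w2 = 1, w3 = -3, bias = 0) are arbitrary choices for illustration, not anything fitted:

```python
def relu(x):
    return max(x, 0.0)

def output(a, b, w1=1.0, w2=1.0, w3=-3.0, bias=0.0):
    # w3 < 0 turns the "A and B both active" feature into a penalty,
    # even though the ReLU term itself is never negative.
    return w1 * a + w2 * b + w3 * relu(a + b - 1.0) + bias

print(output(1, 0))  # 1.0  -> A alone: positive evidence
print(output(0, 1))  # 1.0  -> B alone: positive evidence
print(output(1, 1))  # -1.0 -> A and B together: net NEGATIVE evidence
```

So the negative correlation travis asked about is captured in one step past the ReLU unit: the unit detects the co-occurrence, and the negative edge weight decides what that co-occurrence is worth.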
Deleted User 3420
Profile Blog Joined May 2003
24492 Posts
March 18 2019 23:07 GMT
#20058
You're right, I'm dumb, you can. Thank you!
JimmyJRaynor
Profile Blog Joined April 2010
Canada17329 Posts
Last Edited: 2019-03-20 19:24:53
March 20 2019 19:18 GMT
#20060
On February 21 2019 12:21 Manit0u wrote:
Screw math and ML.

to your point...
In general, "Machine Learning" is being oversold by its proponents.

"I apparently have a bit of a reputation as someone who is anti-machine learning or anti-AI when it comes to human research. This is a bit of misrepresentation of my views, and (I'd argue) a misrepresentation of the issues "statistics people" take with AI/ML as a whole."

"I personally think that AI/ML has a lot to bring to the table to enhance science, health and human performance. The problem is that the AI/ML crowd are over-selling their wares and often being disingenuous about what is current state-of-the art"

"Issue 1: CLAIMING EVERYTHING IS MACHINE LEARNING. Just because AI/ML may use algebra or linear regression, doesn't mean it is AI/ML. Same goes for Nonlinear regression, correlation, logistic regression, or everything else that IS STATISTICS (or information theory, etc.) "

"It's cool if you use statistics and statistical concepts properly. Really, we're a big-tent kind of people. Just don't claim you invented something you clearly did not. And no, stringing together multiple correlations in an automated way doesn't make it extra special. "

"Issue 2: OMG THE HYPE MACHINE, MAKE IT STOP. Again, a lot of really good stuff is being trialed with AI/ML. You don't need to oversell the genuinely good work and advances being pioneered. Here's the thing, most "statistics people" are allergic to hype."

"Many human research statisticians work in areas of health where people can die or receive inappropriate treatments if we do our job wrong. It isn't to say we're perfect, but we work hard to be conservative and criticize our models so we're confident in the results."

"This is, I think, the main reason statisticians have issues with the AI/ML crowd: we can smell snake oil. The really good and avant garde AI/ML work gets lumped in with the utter nonsense directed at VC's and the pop media."

"Issue 3: THE SNAKE OIL IS SPREADING (tweet #7) Again, there is good AI/ML work being done, but most of it is just re-branded statistics or 'stuff' hiding behind the term "proprietary". This snake oil is leaking into government, academia, etc in an attempt to be 'cool'"

"We're seeing this salesmanship more-and-more outside of traditional AI/ML technology circles. There are conference presentations or academic papers that call things like principal component analysis AI/ML... it was invented in 1901! https://en.wikipedia.org/wiki/Principal_component_analysis … "

"Finally, many "stats people" are interested in applying AI/ML techniques and seeing where they can complement our backgrounds and current work. We just get turned off by the bravado, hype and salesmanship that accompanies AI/ML. So yeah, we'll keep giving it a hard time."

Ray Kassar To David Crane : "you're no more important to Atari than the factory workers assembling the cartridges"