a fun math puzzle

evanthebouncy! | June 25 2015 05:02 GMT | #1
a curious puzzle for those who want to have some fun.
https://gist.github.com/evanthebouncy/bde3e510e562da5db6a2
Take a look at this code and try to run it. It creates some random functions f1, f2, ..., f100 and forms a chain by composing them together, and the result is a (random) constant function! How can that be?
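For anyone who doesn't want to click through: below is a minimal self-contained sketch of the setup. This is a hypothetical reconstruction, not the gist's code; it assumes each f_i is a logistic function 1/(1+e^(-(a*x+b))) with random coefficients a, b, and the gist's exact distributions may differ.

```python
import math
import random

def sigmoid(z):
    # numerically stable logistic: 1 / (1 + e^(-z))
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def make_random_function(rng):
    # hypothetical form: f_i(x) = sigmoid(a*x + b) with random a, b
    a = rng.uniform(-2.0, 2.0)
    b = rng.uniform(-2.0, 2.0)
    return lambda x: sigmoid(a * x + b)

def compose_chain(n=100, seed=0):
    rng = random.Random(seed)
    fs = [make_random_function(rng) for _ in range(n)]
    def chain(x):
        for f in fs:
            x = f(x)
        return x
    return chain

chain = compose_chain()
# wildly different inputs come out (numerically) identical:
print(chain(-1000.0), chain(0.0), chain(1000.0))
```

After the first step every input lands in (0, 1), and each subsequent step shrinks distances, so by n = 100 the dependence on x is far below double precision.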
Life is run, it is dance, it is fast, passionate and BAM!, you dance and sing and booze while you can for now is the time and time is mine. Smile and laugh when still can for now is the time and soon you die!
Xxazn4lyfe51xX | June 25 2015 07:42 GMT | #2
As far as I can tell, it's not really a constant function. As n grows larger, the function approaches a random constant value. For the purposes of the interpreter they give us, the point at which the output looks constant to the number of digits displayed is around n = 16.

I'm not good enough, or at least not patient enough with math to figure out why exactly it is that putting numbers through increasing numbers of random functions tends towards constancy though.
fixed_point | June 25 2015 09:27 GMT (last edited 09:52) | #3
By order arguments, one can show that at each value of n, the result of the composition depends more on the random numbers you generated than on the initial x.

This is because at the first iteration f_1 you've already mapped any x to a number between 0 and 1. Then look at the series expansion of 1/(1+e^(-x)) (respectively 1/(1+e^(x))) for n = 2 onwards and you'll see that the terms depending on x become negligible, since the powers of 0 < f_1(x) < 1 vanish quickly, unless f_1(x) = 1, i.e. f_1 is already a constant function.

So yeah, the result is (almost) never a constant function, but damned close to one. Caveat: if your random number at any point is zero the function becomes constant.
evanthebouncy! | June 25 2015 11:35 GMT | #4
On June 25 2015 18:27 fixed_point wrote:
By order arguments, one can show that at each value of n, the result of the composition depends more on the random numbers you generated than on the initial x.

This is because at the first iteration f_1 you've already mapped any x to a number between 0 and 1. Then look at the series expansion of 1/(1+e^(-x)) (respectively 1/(1+e^(x))) for n = 2 onwards and you'll see that the terms depending on x become negligible, since the powers of 0 < f_1(x) < 1 vanish quickly, unless f_1(x) = 1, i.e. f_1 is already a constant function.

So yeah, the result is (almost) never a constant function, but damned close to one. Caveat: if your random number at any point is zero the function becomes constant.


it's fitting that someone with a name like "fixed point" should answer this
lol
but yeah, I know it's not a constant function; it's just faster to say it that way.

can you elaborate a bit on the derivative argument (series expansion)? Derivatives and function composition have a strong relationship via the chain rule, so we can take the derivative of the huge chain and get a big product of derivatives (which approaches 0 in our problem). Is that what you were trying to say?
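The chain-rule version of this can be checked directly: the derivative of the composition is the product of the per-step derivatives, and each logistic factor a * s * (1 - s) satisfies s(1-s) <= 1/4, so the product collapses toward zero. A sketch, assuming (hypothetically, as before) that each f_i(x) = 1/(1+e^(-(a_i*x + b_i))) with random a_i, b_i:

```python
import math
import random

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

rng = random.Random(42)
x = 0.3
log_deriv = 0.0  # log of |d/dx| of the composition so far, via the chain rule
for _ in range(100):
    a = rng.uniform(-2.0, 2.0)
    b = rng.uniform(-2.0, 2.0)
    s = sigmoid(a * x + b)
    # this step's derivative is f'(x) = a * s * (1 - s), and s*(1-s) <= 1/4
    log_deriv += math.log(abs(a) * s * (1.0 - s))
    x = s
print(log_deriv)  # hugely negative: the chain's slope is e^log_deriv, essentially 0
```

Since |a| <= 2 here, every factor is at most 1/2, so the slope of the whole chain is at most 2^(-100): the composition is flat to far below machine precision.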
evanthebouncy! | June 25 2015 11:37 GMT | #5
On June 25 2015 16:42 Xxazn4lyfe51xX wrote:
As far as I can tell, it's not really a constant function. As n grows larger, the function approaches a random constant value. For the purposes of the interpreter they give us, the point at which the output looks constant to the number of digits displayed is around n = 16.

I'm not good enough, or at least not patient enough with math to figure out why exactly it is that putting numbers through increasing numbers of random functions tends towards constancy though.


the way I reasoned about it: consider a pair of numbers x, y
and push both through the chain, i.e.
x, y
f1(x), f1(y)
f2(f1(x)), f2(f1(y))
...

and try to prove that at each step the distance between the two values shrinks

once you've done that, you can show that any distinct values (x, y) converge together after a sufficiently long chain, thus the chain is close to a constant function
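That pairwise argument is easy to watch numerically. A sketch, again assuming logistic steps with random coefficients (not the gist's exact code); note that the same f_i is applied to both values at each step:

```python
import math
import random

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

rng = random.Random(1)
x, y = -5.0, 7.0
gaps = []
for _ in range(30):
    a = rng.uniform(-2.0, 2.0)
    b = rng.uniform(-2.0, 2.0)
    x = sigmoid(a * x + b)  # same random f applied to both values
    y = sigmoid(a * y + b)
    gaps.append(abs(x - y))
# the gap contracts at every step: each f here is Lipschitz with constant |a|/4 <= 1/2
print(gaps[0], gaps[-1])
```

After 30 steps the two trajectories agree to roughly 2^(-29) of the first gap, which is exactly the "any two points converge" claim made above.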
fixed_point | June 25 2015 13:16 GMT (last edited 13:18) | #6
On June 25 2015 20:35 evanthebouncy! wrote:
it's fitting that someone with a name like "fixed point" should answer this
lol
but yeah, I know it's not a constant function; it's just faster to say it that way.

can you elaborate a bit on the derivative argument (series expansion)? Derivatives and function composition have a strong relationship via the chain rule, so we can take the derivative of the huge chain and get a big product of derivatives (which approaches 0 in our problem). Is that what you were trying to say?

Derivatives are unnecessary. I can write up the details with greater clarity in LaTeX when I get home.
DucK- | June 25 2015 15:01 GMT | #7
I thought this was pure math. Seems like I need some programming knowledge to understand :\
TanGeng | June 25 2015 19:40 GMT | #8
I don't really know how fun that was. If you jam a number through 100 iterations of

shrink_fun(x) = 1 / (1 + e^(-x))

you don't leave a lot of information about the original x at the very end.
Moderator | We are a down-to-earth, sponsorship-model club.
fixed_point | June 25 2015 21:26 GMT | #9
Heuristically this is why the function converges to a constant one.

PDF file on google drive
evanthebouncy! | June 25 2015 23:34 GMT | #10
On June 26 2015 06:26 fixed_point wrote:
Heuristically this is why the function converges to a constant one.

PDF file on google drive


wow, that's quite a bit more involved than I expected.
but I'm sure it's second nature to you so good!!
Kleinmuuhg | June 26 2015 00:12 GMT | #11
On June 25 2015 20:37 evanthebouncy! wrote:
the way I reasoned about it: consider a pair of numbers x, y
and push both through the chain, i.e.
x, y
f1(x), f1(y)
f2(f1(x)), f2(f1(y))
...

and try to prove that at each step the distance between the two values shrinks

once you've done that, you can show that any distinct values (x, y) converge together after a sufficiently long chain, thus the chain is close to a constant function

sounds like a typical contraction to me
This is our town, scrub
fixed_point | June 26 2015 06:00 GMT (last edited 06:05) | #12
On June 26 2015 09:12 Kleinmuuhg wrote:
sounds like a typical contraction to me

If the domain is two fixed numbers, the functions are contractions. In general you'd need a fixed constant 0 \leq \lambda < 1 for which the distance between f_i(x) and f_i(y) is less than \lambda |x-y| for all x,y \in R. I'm not sure we have that here, but then again my brain doesn't work at this hour (need to take a look at the behaviour near x=0)
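One reassuring fact for the plain logistic sigma(x) = 1/(1+e^(-x)): its derivative sigma'(x) = sigma(x)(1 - sigma(x)) peaks at x = 0 with value 1/4, so sigma on its own is a global contraction on all of R with lambda = 1/4. A quick numerical sanity check (a sketch, not code from the thread):

```python
import math

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def sigmoid_prime(z):
    # sigma'(z) = sigma(z) * (1 - sigma(z))
    s = sigmoid(z)
    return s * (1.0 - s)

# scan the derivative on a grid over [-10, 10]; the peak is 1/4, attained at z = 0
max_deriv = max(sigmoid_prime(z / 100.0) for z in range(-1000, 1001))
print(max_deriv)  # -> 0.25
```

With a random inner coefficient a, though, the step's Lipschitz constant becomes |a|/4, so a single step need not contract when |a| > 4; that is presumably the behaviour near x = 0 worth checking.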
manicmessiah | June 26 2015 07:33 GMT | #13
I have a feeling I should know what is going on, but I have no idea
fixed_point | June 26 2015 10:59 GMT (last edited 11:21) | #14
On June 26 2015 15:00 fixed_point wrote:
If the domain is two fixed numbers, the functions are contractions. In general you'd need a fixed constant 0 \leq \lambda < 1 for which the distance between f_i(x) and f_i(y) is less than \lambda |x-y| for all x,y \in R. I'm not sure we have that here, but then again my brain doesn't work at this hour (need to take a look at the behaviour near x=0)

Never mind, I'm an idiot. y = f_1(x) is between 0 and 1, and so f_100(f_99(...f_2(y)...)) is a contraction. Therefore you can definitely use the contraction mapping/Banach fixed point theorem to prove that the composition/chain converges to a fixed point (i.e. a constant number) for sufficiently large compositions. Pretty ironic, given my name...

However, I still think the series expansion is perhaps more intuitive for someone unfamiliar with this theorem, since the original question asks about a finite sequence of compositions. (Of course, from an abstract point of view, this is just a specific case of the fixed point theorem.)
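For the special case where every step is the same plain logistic sigma(x) = 1/(1+e^(-x)), Banach's theorem is easy to watch in action: iterating sigma from any starting point converges to the unique solution of x = sigma(x), roughly 0.659. A minimal sketch:

```python
import math

def sigmoid(z):
    # numerically stable logistic function
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def iterate(x, n=200):
    # repeatedly apply sigma; since |sigma'| <= 1/4 < 1, Banach's fixed point
    # theorem guarantees convergence to the unique x* with x* = sigmoid(x*)
    for _ in range(n):
        x = sigmoid(x)
    return x

a, b = iterate(-50.0), iterate(50.0)
print(a, b)  # both land on the unique fixed point of x = sigmoid(x)
```

The thread's chain uses a different random function at each step, so there is no single fixed point being approached; but the same contraction estimate is what forces all inputs together.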
Dagobert | June 26 2015 12:11 GMT | #15
Having trouble with Berkeley homework?
fixed_point | June 26 2015 12:42 GMT | #16
On June 26 2015 21:11 Dagobert wrote:
Having trouble with Berkeley homework?

I wish
evanthebouncy! | June 26 2015 23:19 GMT | #17
On June 26 2015 21:11 Dagobert wrote:
Having trouble with Berkeley homework?


nah, i graduated 3 yrs ago lol, i'm doing a PhD now.
This problem actually came up when I ran a randomly initialized 10-layer neural network without training and noticed it always computes a (nearly) constant function. I thought that was really cool, so I abstracted away the non-essentials in the hope that people would find the simplified problem interesting to look at.
evanthebouncy! | June 26 2015 23:20 GMT (last edited 23:22) | #18
On June 26 2015 19:59 fixed_point wrote:
....
Pretty ironic, given my name...
....

that was my thought the entire time lawl
fffffffffffffffffffffffffffffffff(uck)
Dagobert | June 27 2015 07:15 GMT | #19
Oh, that's more interesting than the problem here. What are you trying to get the neural network to do?
evanthebouncy! | June 27 2015 23:24 GMT | #20
On June 27 2015 16:15 Dagobert wrote:
Oh, that's more interesting than the problem here. What are you trying to get the neural network to do?

nothing, just wanted to play with one for fun (cuz people kept talking about their fantastic learning properties) and run it on the MNIST dataset. Couldn't do backpropagation on the deep one unfortunately, so i trained a shallow one to 94% accuracy, which is not bad i think.