Chapter 2: Time Loops

In the first chapter, we discussed time travel in mostly informal terms, to introduce the subject and give a feel for it. This chapter continues in that vein, with additional informal techniques for analyzing time loops. In the next chapter we will introduce a more formal approach, but for the time being it is more important that the reader first gain a practical understanding.

The Bootstrap Paradox

In order to understand how time loops work, it is necessary to introduce a notion of the 'connectedness' between world lines. While presently every known world line is, one way or another, reachable from every other known world line, it is theorized that in the primordial universe there may have been additional world lines leading into the current set, but without any way back. These extra theoretical world lines, while unobservable, are very important for explaining the bootstrap paradox, and in predicting what types of time loops are likely to have formed.

As an example, consider this loop:


The upper line here is the spurious line that bootstraps the time loop. Although it does not receive a message, it spontaneously decides to transmit message $M$ to the past. The past receives this message, and then also transmits the same message $M$ to the past, forming a time loop.

In some cases there may be more than one potential spurious world line that could have led to a given loop. For example, take this second-order loop:


(Note that reaction displacements have been omitted from this diagram in order to simplify it.)

In this case, the main world lines $B$ and $B'$ each transmit mutually exclusive messages $M$ and $M'$, each prompting the other to send its message. The situation could therefore have been bootstrapped by either $A$ or $A'$.

This can easily be generalized to an arbitrary number of cases:

Each world line $B_k$ transmits a distinct message $M_{k+1}$ that then prompts $B_{k+1}$ to send its own $M_{k+2}$, up to $B_n$, which transmits $M_1$ again, causing the cycle to repeat. In this case, the entry point could have been any of these, depending on what message $A$ transmitted initially.
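As a quick sketch of this cyclic structure (the indexing convention and function names here are our own, purely for illustration), the message cycle can be traced in a few lines of code:

```python
# Sketch of an n-th order periodic loop: a world line that receives M_k
# transmits M_{k+1}, and M_n wraps around to M_1, closing the cycle.

def next_message(k, n):
    """Index of the message transmitted after receiving M_k."""
    return k % n + 1  # M_n prompts a re-transmission of M_1

n = 4          # order of the loop (assumed for illustration)
m = 1          # message injected by the spurious line A
trace = [m]
for _ in range(n):
    m = next_message(m, n)
    trace.append(m)

print(trace)  # [1, 2, 3, 4, 1] -- the cycle closes after n steps
```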

Exercises

  1. Draw a complete timeline diagram for a basic fourth-order periodic time loop.
  2. Advanced: In some cases, a given world line may be part of a time loop more than once. Draw a timeline diagram for a loop in which world line $C$ sends messages to and receives messages from both $B_1$ and $B_2$, without any direct communication between $B_1$ and $B_2$.

Estimating Loop Structure

In many cases, it may only be possible to observe some portion of a time loop. But even then, it may still be possible to infer part or all of the loop's structure from the portion you can observe, by treating the potential structure as a Markov chain and solving the corresponding stochastic matrix for its stationary distribution.

For example, take a case where each world line is sending and receiving a message that is either $M_1$, $M_2$, or $M_3$. Before receiving their message, they flip a coin. If it comes up heads, they will add 1 to the message and send it, unless it is $M_3$, in which case they will just re-send $M_3$. If it comes up tails, they throw the message away and just send $M_1$. What is the probability of receiving each of these messages?

If we diagram out all possible transitions on our timeline, we get the diagram above. This can then be written out as a stochastic matrix:

(1)
\begin{align} B = \begin{bmatrix}B_1&B_2&B_3\end{bmatrix} = B\;\begin{bmatrix}0.5 & 0.5 & 0 \\ 0.5 & 0 & 0.5 \\ 0.5 & 0 & 0.5\end{bmatrix} \end{align}

This could be solved algebraically, but in this case it is easier to just simulate it; here it converges after only two iterations. The resulting asymptotic probabilities are 0.5 for $M_1$, and 0.25 each for $M_2$ and $M_3$.
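The simulation itself is easy to sketch in code. The following is a minimal illustration (plain Python, all names our own), multiplying a probability row vector by the matrix in (1) until it stops changing:

```python
# Transition matrix from the example: row i gives the probabilities of the
# message sent by a world line that received message M_{i+1}.
P = [
    [0.5, 0.5, 0.0],  # received M1: tails -> send M1, heads -> send M2
    [0.5, 0.0, 0.5],  # received M2: tails -> send M1, heads -> send M3
    [0.5, 0.0, 0.5],  # received M3: tails -> send M1, heads -> re-send M3
]

def step(b):
    """One iteration of the row vector: b <- b P."""
    return [sum(b[i] * P[i][j] for i in range(3)) for j in range(3)]

b = [1 / 3, 1 / 3, 1 / 3]  # start from a uniform guess
for _ in range(100):
    nxt = step(b)
    if max(abs(nxt[j] - b[j]) for j in range(3)) < 1e-12:
        break  # converged to the stationary distribution
    b = nxt

print([round(x, 4) for x in b])  # approaches [0.5, 0.25, 0.25]
```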

Exercises

  1. Generalize the example problem to five messages. Draw out the complete timeline diagram and compute the probability of receiving each message.
  2. Advanced: Solve the time loop from exercise 2 of 2.1. Because world line $C$ participates in the loop multiple times, its probability will be affected differently than in a simple loop.

The Lottery Problem

We are now equipped to understand the lottery problem presented at the beginning of Chapter 1, and to see why, as we asserted, you are only slightly more likely to win than by random guessing.

You receive a winning lottery number from the future. What is the probability of winning the lottery using that number?

First off, some basic information about lotteries:
The selection methods used by modern lotteries are extremely sensitive to even very small changes, and the act of transmitting the message to the past will very likely destroy any correlation between the numbers drawn in the transmitting and receiving world lines. However, the Intermediate Value Theorem from calculus guarantees that there is at least one message that, if transmitted, would end up being correct. The message might need to include some extra random data along with the number, but for the sake of argument we will assume that this is not necessary in our case.

Because it's impossible to determine beforehand the exact message that needs to be sent, the best possible strategy the spurious lines can take is brute force, where each possible message is sent in turn until one works. Then, once we have the working message, each iteration after that can just re-transmit that same message.

However, at each step there is also a nonzero probability that you might fail to transmit the next message correctly, be it through a transcription error, a random software glitch, or anything else. Because of the number of steps you will likely have to take in order to arrive at the correct lottery number, this probability of failure builds up and compounds on itself, since you only need to fail once to prevent ever reaching the correct number. And since the chance of making no mistakes over the billions or trillions of iterations required to converge on the right number is vanishingly small, using the number received from the future gives you only a very slight edge over randomly guessing.
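The compounding can be made concrete with a short sketch. The step count and failure rate below are illustrative assumptions, not values from the text; the restart formula is the standard result for a process that restarts from step 1 on any failure:

```python
# Each transmission step succeeds with probability (1 - p); a single
# failure restarts the brute-force search from step 1.

def p_reach(n, p):
    """Probability of completing all n steps without a single failure."""
    return (1 - p) ** n

def expected_transmissions(n, p):
    """Expected total transmissions before n consecutive successes occur,
    restarting on any failure (standard geometric-restart result)."""
    return ((1 - p) ** -n - 1) / p

n, p = 10, 0.05  # assumed values for illustration
print(p_reach(n, p))                 # ~0.60: even a short chain fails often
print(expected_transmissions(n, p))  # ~13.4 transmissions on average
```

Note how quickly `p_reach` collapses as $n$ grows: at billions of steps, even a minuscule per-step failure rate makes an unbroken run essentially impossible.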

Exercises

  1. Assuming 100 steps, and a 1% chance of making an error that resets to step 1, compute the expected value for the length of the loop, and find the probability of reaching step 100.
  2. Advanced: Derive the general equation describing the probability of reaching step $n$, given a uniform failure probability $p$ at each step.

Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License