Simulation theory, in short, is the hypothesis that we are all living inside a program so sophisticated that we cannot tell it apart from reality. The idea is difficult to disprove, in much the same way that it is difficult to disprove that the Flying Spaghetti Monster is doing its work right now. I've discussed this theory many times, and it always makes for a compelling conversation. There are times I even turn it over quietly to myself, especially at night. During those moments, I consider the many possibilities that would accompany the simulation theory if it were true. What follows is my own take on it, and why.
The other day, after a very vocal chat with a friend about charter schools, I picked up my phone to find news articles about charter schools on my Yahoo home page, and I immediately thought about how invasive targeted content is. That sent me back toward the simulation theory: if the simulation were real, wouldn't that mean it too could serve targeted content for whatever reason, sales data and the like? It seems plausible that the simulation itself could be corrupted, or bent toward selling to and influencing its inhabitants. With that in mind, we might all just be wallets to build up and drain, cycling through a simulated system.
On a less alarming note, if the simulation is essentially one giant targeted ad, it could also let its inhabitants live the life they want most. The things we want may be what we focus on most often and eventually get, much like the saying "The grass is greener where you water it." A person who spends their life obsessed with cars will likely end up with a life that revolves around cars. Applied to the simulation-as-targeted-ad idea, this would mean the artificial intelligence behind it adapts to whatever the individual values and alters its content accordingly, making an inhabitant's life feel purposeful and their actions or ideals feel valued.
This is my lullaby: the thought that my life may be one long simulation in which I influence what I get, within the parameters of this reality, by what I work toward the most. Sure, I cannot suddenly fly like a deity, but I could take an interest in a goal like putting French doors in my living room or learning another language, and it could come true someday. Maybe in this lullaby, the things that go wrong are moments when my preference inputs fall out of bounds and will reset as soon as possible, like when the targeted ads on your phone do not quite align with your wants but will sort themselves out soon. With this kind of thinking, I can temporarily pretend that everything is happening for a reason and that the inputs are aligning to lead me to my next preferred experience.