

Facebook is a public forum, and that means it is prone to exploitation in all forms by that very same public. To avoid such scenarios, the company is always looking for ways to make the platform more secure and to uncover bugs that might have gone unnoticed. However, this might be its biggest bet to date. According to a paper released by Facebook researchers (via The Verge), Facebook is looking to build a Westworld-esque fake platform populated only by bots. This platform will be used to run simulations and uncover hidden bugs in the “real” Facebook.

Company researchers are using a technique called Web-Enabled Simulation (WES) to achieve this. A WES is a simulation of the behaviour of a community of users on a software platform: it uses a (typically web-enabled) software platform to simulate real-user interactions and social behaviour on the real platform infrastructure, isolated from production users. This world will be filled with bots that interact with each other the same way real users do on the real app. That means the bots can like, comment, share and send friend requests, and, to take a darker turn, harass, abuse, and scam other bots.

Different bots will exhibit different natures. For example, a scammer bot may try to exhibit behaviors similar to real-life scammers and try to connect with bots that mimic the behavior of real-life victims.
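To make the idea concrete, here is a minimal sketch of what persona-driven bots could look like. It is purely illustrative: the class, persona names, and actions are invented for this example and are not taken from Facebook's paper.

```python
# Hypothetical sketch of persona-driven bots; names and behaviours are
# illustrative only, not Facebook's actual WES implementation.
import random


class Bot:
    """A simulated user with a behavioural persona."""

    def __init__(self, bot_id, persona):
        self.bot_id = bot_id
        self.persona = persona
        self.inbox = []

    def act(self, others):
        if self.persona == "scammer":
            # A scammer bot seeks out likely "victim" bots and messages them.
            targets = [b for b in others if b.persona == "victim"]
            if targets:
                target = random.choice(targets)
                target.inbox.append((self.bot_id, "suspicious offer"))
                return f"bot {self.bot_id} messaged bot {target.bot_id}"
        # Other personas might simply browse, like, or share content.
        return f"bot {self.bot_id} browsed the feed"


if __name__ == "__main__":
    bots = [Bot(0, "scammer"), Bot(1, "victim"), Bot(2, "regular")]
    for bot in bots:
        print(bot.act([b for b in bots if b is not bot]))
```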

Facebook has decided to call this system WW, which stands for WES World.

The WES approach is new to the world of software simulation. Instead of testing on a mockup version of the system under test, as has been the usual practice, it allows simulations on “something very close to an actual social media platform.”

The bots will not perform actions the way real users do, such as pressing the like button in the interface. Instead, they will trigger the same underlying code that runs every time a real person presses like.
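The following sketch shows what that distinction could look like in code, assuming a hypothetical backend where a like is recorded by a `record_like` function. The names are invented for illustration; the point is only that the bot calls the real backend path rather than simulating a button press.

```python
# Illustrative only: a bot exercising the platform's real "like" code path
# directly instead of clicking a UI button. All names are hypothetical.


class LikeStore:
    """Stands in for the real backend storage that a like would hit."""

    def __init__(self):
        self.likes = set()

    def record_like(self, user_id, post_id):
        # Same code path a real user's like would exercise in production.
        self.likes.add((user_id, post_id))
        return len(self.likes)


def bot_like(store, bot_id, post_id):
    # The bot skips the UI entirely and invokes the backend directly,
    # so any bug in record_like is exercised exactly as it would be live.
    return store.record_like(bot_id, post_id)


store = LikeStore()
print(bot_like(store, bot_id="bot-42", post_id="post-7"))  # -> 1
```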

Researchers have cautioned that “bots must be suitably isolated from real users to ensure that the simulation, although executed on real platform code, does not lead to unexpected interactions between bots and real users.”

Therefore, the company states that there will be no interaction between the two versions of Facebook, and that bots will not get in touch with real people. Some of the bots could end up receiving read-only data from the real app, provided the data being accessed does not violate privacy rules. This access would help the bots better mimic real-life behavior. However, as the name suggests, the data will be read-only: the bots will not be able to change it.
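A minimal sketch of that read-only guarantee, under the assumption that bot access goes through a wrapper layer, might look like this. The class and method names are invented for illustration and are not part of Facebook's system.

```python
# Hypothetical read-only wrapper: bots can see (privacy-safe) production
# data but cannot modify it. Names are invented for this sketch.
from types import MappingProxyType


class ReadOnlyDataView:
    def __init__(self, records):
        # MappingProxyType exposes a dict without allowing writes through it.
        self._records = MappingProxyType(dict(records))

    def get(self, key):
        return self._records.get(key)

    def put(self, key, value):
        raise PermissionError("bots may not modify production data")


view = ReadOnlyDataView({"post-1": {"likes": 12}})
print(view.get("post-1"))           # reads succeed
try:
    view.put("post-1", {"likes": 0})
except PermissionError as err:
    print(err)                      # writes are rejected
```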

In the future, the company could use this technology to build bots that perform one selected type of function and gather intelligence from that. For example, if Facebook wanted to uncover bugs that hackers could exploit to obtain sensitive information about users, it could fill the WES World with bots that attempt to steal that very data.