Cyber Threats in Self-Regulating Digital Platforms

Ohad Barzilay; Gal Oestreicher-Singer; Hilah Geva

Alongside the benefits of allowing computers to regulate systems, some risks arise. Computer algorithms may be susceptible to errors and manipulation. They may overlook corner cases and serendipities that they are not wired to detect, and they lack the “common sense” to “do the right thing” in situations not covered by their rulebooks. Given these pros and cons, information technology stakeholders face a dilemma regarding how intelligent and autonomous they should allow their technology to be. This dilemma is becoming increasingly salient as computer algorithms have become ubiquitous with the rise of the Internet of Things (IoT) and mobile computing.

In the proposed research, we focus on the economic value of the autonomy level of computer algorithms that regulate digital platforms. The platforms that we study are essentially intermediaries in two-sided markets, facilitating transactions between two parties: buyers and sellers (e.g., eBay); drivers and riders (e.g., Uber); entrepreneurs and their backers (e.g., Kickstarter); etc. In each domain, some platforms are considered more open than others, i.e., it is easier for a seller to put a product on the market: in those open markets, a product must meet fewer criteria to be included, and the approval process is simpler and usually faster. For example, the Google Play Store is considered more "open" than the Apple App Store, and the crowdfunding platform IndieGoGo is considered more "open" than its rival Kickstarter. The "openness" of such platforms stems from their use of computer algorithms to screen the offerings submitted to them, sometimes without any human involvement, in contrast to other platforms, which rely mainly on human inspection.

As automated screening processes are more efficient than human-driven ones, they are likely to generate greater numbers of approved submissions (e.g., mobile applications or crowdfunding campaigns). This, in turn, may result in one of two opposing scenarios. On the one hand, the platform’s users may find the variety of offerings on the automated (“open”) platform more attractive than the more limited set of options on the “closed” platform. On the other hand, the greater variety may come at the expense of the quality of the offerings. Algorithms approve products according to whether they meet some threshold criteria. Unlike a human, an algorithm might overlook defects that are not covered by its predetermined list of criteria, and therefore might approve products that are of low innate quality.
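The intuition can be made concrete with a minimal, purely illustrative sketch of rule-based screening. All criteria, field names, and values below are hypothetical and are not drawn from any actual platform's screening pipeline; the point is only that a checklist-driven screener cannot reject defects that its predetermined rules do not encode.

```python
# Illustrative sketch only: a toy rule-based screener with hypothetical
# criteria. It is not the screening logic of any real platform.

from dataclasses import dataclass, field

@dataclass
class Submission:
    title: str
    size_mb: float
    declared_permissions: list = field(default_factory=list)
    # A defect that the checklist below does not cover:
    misleading_description: bool = False

# Fixed, predetermined threshold criteria (hypothetical values).
MAX_SIZE_MB = 150
BANNED_PERMISSIONS = {"read_sms", "record_audio_background"}

def automated_screen(sub: Submission) -> bool:
    """Approve a submission if every predetermined criterion is met."""
    if sub.size_mb > MAX_SIZE_MB:
        return False
    if BANNED_PERMISSIONS & set(sub.declared_permissions):
        return False
    # Anything the rules do not cover is never examined.
    return True

app = Submission(title="FlashlightPro", size_mb=12.0,
                 declared_permissions=["camera"],
                 misleading_description=True)

# The submission passes automated screening even though it carries a defect
# (a misleading description) that lies outside the predetermined criteria.
print(automated_screen(app))  # True
```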

We draw on and add to two streams of literature: first, the work on two-sided markets and peer-economy platforms, and second, the literature on information flow on digital platforms.
