AI and Big Data have both been boons to the corporations that use them. They make it simpler for data analysts to understand the interests of the demographics being addressed, and they can help companies large and small improve the appeal of their products and advertisements. There has been, however, a natural backlash against the use of AI and Big Data. Many sources have cited the use of both – especially when users are unaware of their presence – as unethical and invasive. This is a genuine concern worth considering as AI and Big Data are used more frequently in companies of all sizes and interests.
Certain ethical lines need to be considered, and those at the head of particular companies need to have an open dialogue with the audience they’re serving in order to ensure that both AI and Big Data are being used effectively and ethically. At the moment, certain quirks of deep learning algorithms and network tapping read a little too 1984 for some folks. By examining these aspects, we can also consider solutions that better balance the relationship between technology and the individual.
Deep Learning Algorithms
Deep learning algorithms extend the existing field of machine learning by loosely mirroring the neural pathways of the human brain, allowing them to process large swaths of data quickly and efficiently. This approach advances much of the data analysis already done by larger corporations, but to many it feels distinctly invasive. Deep learning algorithms can come to understand a user on an uncomfortably intimate level and can, in turn, predict certain user behaviors. When utilized by a search engine, for example, they can tailor ads to an individual’s interests and to the sources that individual is linked to.
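To make the ad-tailoring idea concrete, here is a minimal, purely illustrative sketch in Python. A real deep learning system would learn associations between behavior and interests from large datasets; the hand-written keyword table and the `tailor_ad` function below are hypothetical stand-ins used only to show the shape of the process.

```python
from collections import Counter

# Hypothetical mapping from ad categories to interest keywords. In a real
# system these associations would be learned from data, not hard-coded.
AD_CATEGORIES = {
    "fitness": {"gym", "running", "protein", "workout"},
    "travel": {"flights", "hotels", "passport", "itinerary"},
    "cooking": {"recipe", "oven", "ingredients", "sourdough"},
}

def tailor_ad(history):
    """Return the ad category whose keywords best match a user's history."""
    scores = Counter()
    for term in history:
        for category, keywords in AD_CATEGORIES.items():
            if term.lower() in keywords:
                scores[category] += 1
    if not scores:
        return None  # no signal: fall back to untargeted ads
    return scores.most_common(1)[0][0]

print(tailor_ad(["flights", "hotels", "recipe"]))  # prints "travel"
```

Even a toy version like this makes the privacy concern visible: the more history the function sees, the more precisely it can profile the user.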
Deep learning algorithms weren’t originally built for this kind of assessment, however; in some cases, they were first deployed as safety measures meant to identify threats to national security. The evolution of these algorithms has since led them to impact not only the economic aspects of an individual’s life but the social as well. The gathering of knowledge performed by these algorithms, paired with their apparent evolution, can be a bit intimidating. Companies looking to implement such algorithms will want to deliberately consider not only how their use will impact the business, but how the algorithms may manipulate their users. In many cases, the “deep learning” and evolutionary nature of the algorithms may need to be capped in order to ensure users’ privacy.
These deep learning algorithms can access individual data and facilitate the collection of Big Data at large through network tapping. Network tapping enables the pass-through and monitoring of data: individual data is assessed, and if particular content catches the attention of a savvy algorithm, the content and user can be flagged for further monitoring. Passive network TAPs take standard network tapping a step further by operating without using electricity. How would a company go about combating the potential misuse of these taps? Establishing clear company values and implementing checks and balances alongside the installation and monitoring systems can prevent blatant disregard for individual privacy. The method of content flagging does raise an interesting question, however: just how far are some individuals willing to go in the name of overall safety? Who, furthermore, becomes the subject of more intense monitoring? Is there a risk of racial or religious profiling, and how can a company ensure that these systems will not be misused?
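One way those checks and balances can be built into the pipeline itself is sketched below. This is not a real monitoring product: the rule set, the audit log, and the two-reviewer requirement are all hypothetical, chosen only to illustrate the principle that an automated flag should never trigger action on its own.

```python
import datetime

# Hypothetical rule set; a real system might use a learned classifier.
SENSITIVE_TERMS = {"exploit", "breach"}

audit_log = []  # every flagging decision is recorded, hit or miss

def flag_content(user_id, content):
    """Return True if content matches a rule; always leave an audit trail."""
    hit = any(term in content.lower() for term in SENSITIVE_TERMS)
    audit_log.append({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user_id,
        "flagged": hit,
    })
    return hit

def escalate(entry, reviewers):
    """Block escalation unless two distinct human reviewers sign off."""
    return entry["flagged"] and len(set(reviewers)) >= 2

# Usage: an automated hit is logged, but escalation still needs two people.
flag_content("u1", "report of a data breach")        # logged as flagged
escalate(audit_log[0], ["alice"])                    # False: one reviewer
escalate(audit_log[0], ["alice", "bob"])             # True: two sign-offs
```

The design choice here is that the audit log records misses as well as hits, so the monitoring system can itself be monitored for bias or overreach.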
Technologies such as AI and the algorithms behind Big Data analysis will not go away with time; if anything, they’re set to advance and help establish corporate norms far into the future. What individuals working in these industries need to do is understand the potential ethical risks of network monitoring and deep learning algorithms and, in turn, work to prevent the abuse of such systems. While it is unlikely that users will ever go entirely unmonitored, open communication throughout a company, strong overall values, and a willingness to adapt will ensure that individual users retain the right to their privacy while also living and working in a secure environment.