The Nooz: Bias in the Media

Award

Winner of the Best Domain award for thenooz.com

Inspiration

In the technological age of fake news, there seems to be no way to avoid it. The problem grows worse when people aren't aware of their own or others' biases and subconscious opinions. That's where the idea for this website comes in. It tells you each news source's publishing bias and helps you learn to identify and work around the problem by gathering a collection of news media, from all political sides and standpoints, for personal analysis, thought, and consumption.

What it does

"The Nooz" scan the entirety of the most popular news outlets from some of the most liberal like MSNBC to some of the most conservative like Fox News as well as some of the most neutral like the Wall Street Journal to some of the most biased like Info Wars for breaking news. A search capability is integrated into the platform so the user can become educated not only about the facts but the way the information is presented by all of the media.

How we built it

The first step was to formulate the idea and create a development plan. We split the application into two main development sections, front end and back end, so we could maximize efficiency and make full use of each team member's strengths. We followed an agile development process, incrementally building small pieces of the application so that each piece was a functional component of the app.

On the back end, we researched and wrote an individual web scraper for each of the 14 most popular news websites we used to compile articles and gather their data, and we built our own REST API to send data between the back end and the front end. On the front end, the first essential step was to actually draw out and design the web page; we then implemented that design with React JS and Python. The next step was containerization, so all elements and components on the screen were divided into individual containers. Finally, we connected the front end and the back end with HTTP requests that hit the API we created, and we hosted both ends on AWS.
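As a rough illustration of the scraping-plus-API pipeline described above (not the actual repository code), the sketch below pairs a Beautiful Soup scraper with a small REST endpoint. The use of Flask, the function names, the example URL, and the CSS selector are all assumptions made for demonstration.

```python
# Illustrative sketch only: Flask, the function names, the example URL, and the
# CSS selector are assumptions for demonstration, not code from this repository.
import requests
from bs4 import BeautifulSoup
from flask import Flask, jsonify

app = Flask(__name__)

def scrape_headlines(url, headline_selector):
    """Fetch a news front page and return headline text plus links."""
    page = requests.get(url, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    articles = []
    for tag in soup.select(headline_selector):
        link = tag.get("href")
        text = tag.get_text(strip=True)
        if link and text:  # skip empty or decorative elements
            articles.append({"title": text, "url": link})
    return articles

@app.route("/api/articles/<source>")
def articles_for_source(source):
    # In the real app each of the 14 sources has its own scraper; a single
    # hypothetical selector stands in for all of them here.
    selectors = {"example-news": ("https://example.com", "a.headline")}
    if source not in selectors:
        return jsonify({"error": "unknown source"}), 404
    url, selector = selectors[source]
    return jsonify(scrape_headlines(url, selector))

if __name__ == "__main__":
    app.run()
```

In a setup like this, the front end would issue an ordinary HTTP GET to an endpoint such as /api/articles/example-news and render the returned JSON.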

Challenges we ran into

We ran into a wide variety of challenges while creating this project, most of which came from our limited initial knowledge of specific topics and the continual customization needed to complete the project we had in mind. The first issue was setting up AWS. None of us had substantial experience with AWS, which we needed in order to connect our graphical front end and our Python-heavy back end through our REST API. We overcame this obstacle by teaching ourselves, with enough time left over to still test the functionality and reliability of our code. In addition, the majority of the configuration tutorials and online help were either out of date or presented the information in a way that took us a while to apply, creating an entirely new hurdle to jump over.

The second major issue was the sheer number of unique ways each news source presented itself. In order to use Beautiful Soup to parse each site's HTML in Python, we needed to scrape every source with accuracy, precision, and speed. Each website had its own HTML structure, so each source had to be analyzed individually for patterns in its HTML elements, not only to gather every single news article but also to detect and disregard ads, videos, and duplicate media. Not only was this a challenge in and of itself, but at the beginning our team had very limited knowledge of HTML, so we also had to comb through entire pages of unfamiliar markup to find those patterns.
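As a hypothetical sketch of the filtering problem described above, the snippet below drops duplicate links and obvious non-article entries after scraping; the field names and the ad/video heuristics are assumptions, not the repository's actual rules.

```python
# Hypothetical sketch: the field names and ad/video heuristics are assumptions.
def filter_articles(raw_items):
    """Drop duplicate links and obvious non-article entries (ads, videos)."""
    seen_urls = set()
    cleaned = []
    for item in raw_items:
        url = item.get("url", "")
        title = item.get("title", "")
        # Skip entries that look like ads or embedded video players.
        if not title or "/video/" in url or "sponsored" in title.lower():
            continue
        # Skip duplicates that appear in more than one section of the page.
        if url in seen_urls:
            continue
        seen_urls.add(url)
        cleaned.append(item)
    return cleaned
```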

What's next for The Nooz: Bias in the Media

There are many aspects we would love to add in the future. The main addition we really wanted to include is a personal bias section, which can be seen modeled in our Profile tab. Essentially, after signing in, the user would view articles through the website, and every view would be cataloged in a database storing the user's history. From there, the average across all news source visits would be calculated and presented visually, so the user can see their own bias based on their viewing preferences.
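A minimal sketch of how that personal bias score could be computed, assuming a per-outlet bias rating on a -1.0 (most liberal) to +1.0 (most conservative) scale; the outlet names, scores, and view-history format are illustrative and not part of the current code.

```python
# Illustrative sketch of the planned personal-bias feature; the bias scale,
# outlet scores, and view-history format are assumptions, not repository code.
from collections import Counter

# Assumed bias scale: -1.0 (most liberal) to +1.0 (most conservative).
SOURCE_BIAS = {"msnbc": -0.8, "wsj": 0.0, "foxnews": 0.8}

def personal_bias(view_history):
    """Average the bias of every article view to estimate the user's lean."""
    counts = Counter(view_history)  # e.g. ["msnbc", "msnbc", "wsj"]
    total_views = sum(counts.values())
    if total_views == 0:
        return 0.0
    weighted = sum(SOURCE_BIAS.get(src, 0.0) * n for src, n in counts.items())
    return weighted / total_views
```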
