iamsingularity/protected-uploader

Protected Uploader

This is a simple checker that confirms that photos being uploaded have no pornographic content.

To run:

pip install git+https://github.com/Clarifai/clarifai-python.git
export CLARIFAI_APP_ID=<application_id_from_your_account>
export CLARIFAI_APP_SECRET=<application_secret_from_your_account>

Note: If you don't want to export these variables, you can instead pass the application ID and secret as arguments when constructing the Clarifai client object.

Once you've done this, you're all set up! Run the following in your terminal to check whether a given image is Safe for Work:

python protected.py <your_image_url>

Here's an example of a full session, from start to finish, after downloading the project and opening a terminal in the project folder:

$ pip install git+https://github.com/Clarifai/clarifai-python.git
$ export CLARIFAI_APP_ID=abc123xyz
$ export CLARIFAI_APP_SECRET=abc123xyz
$ python protected.py http://i.imgur.com/YCqQW0W.jpg
Safe for work!
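Under the hood, a script like `protected.py` presumably sends the image URL to Clarifai's NSFW model and then compares the returned class probabilities against a threshold. Here is a minimal sketch of just that decision step; the function name, probability format, and threshold are assumptions for illustration, not the project's actual code:

```python
# Hypothetical sketch of the final decision step in protected.py.
# The real script calls Clarifai's API; here we only model the check
# applied to the probabilities such a call would return.

def is_safe_for_work(probabilities, threshold=0.5):
    """Return True when the 'sfw' probability meets the threshold.

    `probabilities` is assumed to map class names ('sfw'/'nsfw') to
    floats that sum to roughly 1.0, mirroring the shape of an NSFW
    model response.
    """
    return probabilities.get("sfw", 0.0) >= threshold

# Example: a clearly safe image.
result = {"sfw": 0.98, "nsfw": 0.02}
print("Safe for work!" if is_safe_for_work(result) else "Not safe for work!")
```

Keeping the threshold as a parameter lets an uploader tighten or relax the filter without touching the API call itself.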
