
Conversation

@bretwalker
Member

This is an example of what's required to set up a new scraper.

If this repo is cloned, lgeku_scraper.py can be replaced with another location-specific scraper, like the one in this PR.

instance_id and view_id come from the site’s HTML: https://outagemap.georgiapower.com/external/default.html

owner and repo need to be set to the repo where the outage JSON should be written.
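As a hypothetical sketch (the variable names instance_id, view_id, owner, and repo come from the description above, but the values and helper shown here are placeholders, not real IDs from the Georgia Power site), the configuration block at the top of a scraper like lgeku_scraper.py might look like:

```python
# Configuration for a site-specific outage scraper (all values are placeholders).
# instance_id and view_id are copied from the utility's outage map HTML;
# owner and repo point at the repo where the outage JSON should be written.
INSTANCE_ID = "example_instance_id"   # placeholder: take this from the page source
VIEW_ID = "example_view_id"           # placeholder: take this from the page source
OWNER = "your-github-user-or-org"     # destination repo owner
REPO = "outage-data"                  # destination repo name

def build_config():
    """Collect the scraper settings in one place so the rest of
    the script never hard-codes site-specific values."""
    return {
        "instance_id": INSTANCE_ID,
        "view_id": VIEW_ID,
        "owner": OWNER,
        "repo": REPO,
    }
```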

The only other thing to do is to add a GitHub token to the cloned repo's Actions so that it can write to the destination repo.

  1. Create a personal access token using an account that has access to the destination repo: https://help.github.com/en/github/authenticating-to-github/creating-a-personal-access-token-for-the-command-line
  2. Add it as an encrypted secret named GH_TOKEN in the cloned repo's Actions settings: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/creating-and-using-encrypted-secrets

The workflow file is copied over with the clone, so once the token is in place, GitHub should pick it up and run the Action every 15 minutes, scraping and saving the current outages.
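For reference, a minimal sketch of what such a scheduled workflow could look like (the filename, step names, and scraper invocation are assumptions; the actual workflow shipped in the repo may differ):

```yaml
# .github/workflows/scrape.yml — hypothetical sketch
name: Scrape outages
on:
  schedule:
    - cron: '*/15 * * * *'   # run every 15 minutes
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Run scraper
        env:
          GH_TOKEN: ${{ secrets.GH_TOKEN }}   # the PAT added as an encrypted secret
        run: python lgeku_scraper.py
```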

@bretwalker bretwalker changed the title Add Georgia Power scraper Georgia Power scraper example Dec 19, 2019