Parsing webpages

After spending all of my allocated LostPets time on coursework projects, I now have time to work on it again over the Christmas break.

As I only have access to my laptop and not the desktop with Android Studio, I'll be working on the server-side aspects. I wrote a Python script to read the contents of an HTML table using BeautifulSoup. Using this I will be able to pass names and locations (postcode accurate) to my database, meaning the map screen will actually start to display real lost pets.
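Here's a minimal sketch of that kind of table scrape. The URL, table layout, and column order (name first, postcode second) are placeholder assumptions for illustration; the real listing sites will need their own parsing logic.

```python
import requests
from bs4 import BeautifulSoup


def scrape_lost_pets(url="https://example.com/lost-pets"):  # hypothetical URL
    """Fetch a page and pull pet names and postcodes out of its first HTML table."""
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    pets = []
    table = soup.find("table")  # assumes the listings live in the first table
    if table is None:
        return pets

    for row in table.find_all("tr")[1:]:  # skip the header row
        cells = [cell.get_text(strip=True) for cell in row.find_all("td")]
        if len(cells) >= 2:
            # assumed column order: pet name, then postcode
            pets.append({"name": cells[0], "postcode": cells[1]})
    return pets


if __name__ == "__main__":
    for pet in scrape_lost_pets():
        print(pet["name"], pet["postcode"])
```

Each dictionary in the returned list can then be written straight into the database table that backs the map screen.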

Further steps include adding more websites, Facebook pages, etc. to the database with similar methods, and automating that process to run at set intervals; a rough sketch of the scheduling side follows below.
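One way the periodic re-scrape could look, reusing the hypothetical scrape_lost_pets function from the sketch above; the interval and the storage step are placeholders, and a cron job would do the same work just as well.

```python
import time

SCRAPE_INTERVAL_SECONDS = 6 * 60 * 60  # e.g. re-check every six hours


def store_pets(pets):
    # placeholder: insert/update rows in the LostPets database here
    print(f"Storing {len(pets)} pets")


while True:
    store_pets(scrape_lost_pets())  # from the scraping sketch above
    time.sleep(SCRAPE_INTERVAL_SECONDS)
```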