Songle MP Video Demos

Co-op mode

Loading a Co-op game

In co-op mode players work together to guess the song. Here a co-operative multiplayer game is loaded on the hardest difficulty. It can be seen that the bottom device has the same map as the top device.

Collecting in a Co-op game

In a co-op game lyrics collected by any player are shown to all others. Here the top player collects lyrics and uses his ‘Bowie Bolt’ special – collecting an additional three lyrics. These lyrics are all shown on his lyrics screen and on the bottom player’s lyrics screen.

Guessing in a Co-op game

In a co-op game, if any player guesses correctly, all other active players are told that their team has won on the next update. Here the top player guesses correctly, and so the bottom player is told that someone in their team guessed the song correctly.

 

 

VS mode

Loading a VS game

In a VS game players compete to be the first to guess correctly. Here a VS multiplayer game is loaded on medium difficulty. It can be seen that the bottom device has the same map as the top device.

Collecting in a VS game

Lyrics collected in a VS game are only shown on the collecting player’s lyrics screen. The marker, however, vanishes for all other players, preventing them from collecting it. It can be seen that the lyric collected by the top player is shown on their VS screen, but for the bottom player the marker vanishes and no lyric is shown on the lyrics screen.

Guessing in a VS game

In a VS game, players who weren’t the first to guess correctly are shown a red cross. Here the top player correctly guesses the song, and it can be seen that the bottom player is then told that the song has already been guessed by another player.

Archiving the alpha change list as release is coming soon

01/01/17 -> 18/01/17

Work done:

Successfully scraped a site and displayed everything about the pets on it (short of images for now) on my map view. Page scraping was achieved using Python with urllib2 and BeautifulSoup; I then used geocoder to translate postcodes to co-ordinates for use in the map.
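For reference, here is a rough sketch of what that fetch-and-geocode step looks like (Python 2 because of urllib2; the URL is a placeholder, and the OSM provider for geocoder is just one of several options, not necessarily the one used here):

import urllib2
from bs4 import BeautifulSoup
import geocoder

def fetch_soup(url):
    # Download the page with urllib2 and hand the HTML to BeautifulSoup
    return BeautifulSoup(urllib2.urlopen(url).read(), 'html.parser')

def postcode_to_latlng(postcode):
    # Translate a UK postcode into (lat, lng) via the geocoder library
    result = geocoder.osm(postcode)
    return tuple(result.latlng) if result.ok else None

# soup = fetch_soup('http://example.com/lost-pets')   # placeholder URL
# print(postcode_to_latlng('SW1A 1AA'))               # any UK postcode works here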

Further work required:

-Google login – shouldn’t be too challenging.
-Scrape images too – not looking forward to this; it seems like it could be quite bug-prone.
-Scrollview – not too difficult once images are working.
-Add your pet – already have a working dummy implementation; again, just images.
-Distance-based notifications – planning to tackle this next; no idea how difficult it will be to implement, and not sure how I’m going to test it.

 

12/11/16

Work done:

Coursework swamped and brain fried, so I spent an hour or two rethinking my deployment plan and minimum requirements. I read a lot about similar projects and it seems a better idea to get a barebones app onto the Play Store and then flesh it out from there, rather than painstakingly perfect it and risk losing interest. I set out to create an app that ‘sends the user a notification when within a preset distance of a recently lost pet’ and I will now try to create something that does exactly that, with fewer bells and whistles.

Further work required:

Focus on functionality: find a few key sites to scrape, put those markers on and set up the notifications. Register with Google rather than a custom account. Four main pages: a login, the map screen, a list of everything from the map in scrollable format and an ‘I’ve lost my pet’ screen.

 

30/09/16

Work done:

Added a pet information class. The user performs a long click on the map view roughly where their pet was last seen. This takes them to the pet information class, where they can input the relevant information about their pet. This is currently: Name, Dog/Cat, Description, Images*, Reward (optional). They then press ‘Add’ and are taken back to the map view.

*does not currently work

Further work required:

More time spent thinking about information fields. Learn how image uploading works and implement this. Create the relevant back-end PHP file and table to store this information. Continue fixing the LatLng import to show all pet locations and names. More UI work later, adding icons for the cat/dog buttons.

Will try to fix the LatLng import and marker placement tomorrow.

 

31/09/16

Work done:

Fixed table co-ordinate and information retrieval and implemented this into the on-screen map. Users can now upload a location and retrieve all other users’ uploaded locations. Other information fields are still to be implemented. A new PHP file was written to achieve this. I attempted some very precise regex work but couldn’t quite get it working how I wanted, so I used a more general pattern to get the values I needed and then some standard Java formatting on top of that. Yesterday’s goal for today was achieved. Spent some time researching Google’s new Map Styling to pretty the interface up a bit.

Further work required:

As yesterday, mostly. Focus is still on creating a solid back end and its front-end implementation, with cosmetics mostly saved for later.

 

08/10/16

Work done:

Implemented a CardView activity with all the same information as the map view, just in a different format. This took far longer than it should have – started at 9 in the morning and it’s 11:45pm.

Further work required:

Still need to get images into and out of the database, and the home screen needs a massive redesign – I’ll probably focus on this next – plus plenty of other things.

30/07/17

Work done:

Virtually finished. I want to add clustering to markers but other than that and some UX improvements I am happy to release.

Further work required:

As above.

Working on Lost Pets has been so fun and I’ve learned so much. I’m really looking forward to releasing and getting user feedback. Most of all I’m happy that I’ve been able to see a project through to completion.

Hope to update with some screenshots soon.

LostPets prerelease images and thoughts.

The new home screen:

I’ve since updated the status bar colour at the top to a dark red. This screen is a definite improvement and I feel it looks quite neat.

The new map screen:

I definitely prefer it now that it keeps with the colour scheme. I’m likely to change the floating action button. The main piece of work left is to cluster markers.

 

The new list screen:

Not 100% on the top banner but I’d like to keep the space available for messages in future. Images are loading in nicely.

 

Settings has been revamped and looks nice and tidy. Also added a ‘More Information and Help’ page with an FAQ sheet.

All in all quite happy with it so far and looking forward to the release.

Report: A Natural Language Query System in Python/NLTK

In my second semester, as part of my Processing Formal and Natural Languages course, I had to complete two assignments: one related to formal languages and one related to natural languages. This report covers the second, in which I scored 92%. The assignment brief is reproduced below.

 

In this assignment, you will use Python and NLTK to construct a system that reads simple facts and then answers questions about them. You can think of it as a simple form of both machine reading and question answering.

Your completed system will enable dialogues such as the following:

$$ John is a duck.
OK
$$ Mary is a duck.
OK
$$ John is purple.
OK
$$ Mary flies.
OK
$$ John likes Mary.
OK
$$ Who is a duck?
John  Mary
$$ Who likes a duck who flies?
John
$$ Which purple ducks fly?
None

Sentences submitted by the user are either statements or questions. Statements have a very simple form, but the system uses them to learn what words are in the language and what parts of speech they have. (For example, from the statements above, the system learns that duck is a noun, fly is an intransitive verb and so on.) Questions can have a much more complex form, but can only use words and names that the system has already learned from the statements it has seen.
In Part A, you will develop the machinery for processing statements. This will include a simple data structure for storing the words encountered (a lexicon), and another for storing the content of the statements (a fact base). You will also write some code to extract a verb stem from its 3rd person singular form (e.g. flies → fly).
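To give a flavour of that Part A stemming task, here is an illustrative sketch with a few rule-of-thumb patterns (the assessed solution handles many more cases; this is not it):

def stem(verb_3s):
    # Rough mapping from a 3rd-person-singular form back to its stem
    if verb_3s.endswith('ies') and len(verb_3s) > 4:
        return verb_3s[:-3] + 'y'                    # flies -> fly
    if verb_3s.endswith(('ches', 'shes', 'sses', 'xes', 'zes', 'oes')):
        return verb_3s[:-2]                          # watches -> watch, goes -> go
    if verb_3s.endswith('s') and not verb_3s.endswith('ss'):
        return verb_3s[:-1]                          # likes -> like
    return verb_3s

print(stem('flies'))   # fly
print(stem('likes'))   # like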
Parts B to D develop the machinery for questions. Part B is concerned with part-of-speech tagging of questions, allowing for ambiguity and also taking account of singular and plural forms for nouns and verbs. In Part C you are given a context free grammar for the question language, along with a parser, courtesy of NLTK. Your task is to write some Python code that does agreement checking on the resulting parse trees, in order to recognize that e.g. Which ducks flies? is ungrammatical. Agreement checking is used in the system to eliminate certain impossible parse trees. In Part D, you will give a semantics for questions, in the form of a Python function that translates them into lambda expressions. These lambda expressions are then processed by NLTK to transform them into logical formulae; the answer to the question is then computed by a back-end model checker which is provided for you.
Finally, in Part E, you are invited to supply a short comment on ways in which the resulting system might be improved.

 

 

Due to University regulations my code is only available on request, so if you’re interested at all please contact me: james@james-odonnell.com

Scrape by

Scraped a few sites with Python and successfully managed to get that information onto my map.

Here’s Alfie himself; he went missing with two other female Yorkshire Terriers – who have since been found in the Radford area of Nottingham. Alfie is sadly still missing.

There is still some work to be done here on top of the lack of images – either Nimbus is very lost or my geocoding is off:

Parsing webpages

After spending all my LostPets allocated time working on coursework projects I now have time to work on it again over the Christmas break.

As I only have access to my laptop and not the desktop with Android Studio, I’ll be working on server-side aspects. I wrote a Python script to read the contents of an HTML table using BeautifulSoup. Using this I will be able to pass names and locations (postcode accurate) to my database, meaning the map screen will actually start to display real lost pets.
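Roughly, the script walks the table rows and hands each name/postcode pair on to storage. A minimal sketch of the idea is below – the column order is an assumption, and sqlite3 stands in for the PHP-driven database the app actually talks to:

import sqlite3
from bs4 import BeautifulSoup

def table_rows(html):
    # Yield (name, postcode) pairs from the first table on the page,
    # skipping the header row; column order is assumed for illustration
    soup = BeautifulSoup(html, 'html.parser')
    for row in soup.find('table').find_all('tr')[1:]:
        cells = [td.get_text(strip=True) for td in row.find_all('td')]
        if len(cells) >= 2:
            yield cells[0], cells[1]

def store(html, db_path='lostpets.db'):
    # sqlite3 is used here purely as a stand-in for the app's real back end
    conn = sqlite3.connect(db_path)
    conn.execute('CREATE TABLE IF NOT EXISTS pets (name TEXT, postcode TEXT)')
    conn.executemany('INSERT INTO pets VALUES (?, ?)', table_rows(html))
    conn.commit()
    conn.close()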

Further steps include adding more websites, Facebook pages, etc. to the database with similar methods, and automating that process to run at set intervals.

CardViews

Added a new CardView activity. This allows the user to see all lost pets in a list view, to complement the map view. I noticed other apps with a similar location-based premise did this, so I thought I should implement it; there are many advantages associated with it.


Obviously still a lot of aesthetic work to do. The back end behind it works, but it’s convoluted and will need a rethink and simplification. Still need to work on pictures – with a mandatory aspect ratio so that they can be standardised across map markers and this CardView without any cropping issues. Perhaps a mask over a user-imported image that they can move so it crops correctly?

Also that random card at the top needs to be removed…

 

Stored Location Success

After around 5 hours of work I finally managed to retrieve and display the locations of every ‘Pet’ in the database. I achieved this by learning and mashing together bits of PHP, as well as fixing a few random errors and removing junk code left over from previous attempts.

The incoming data was then parsed as a JSON array, and then the relevant information displayed.

screenshot2

Further work includes bringing names and descriptions through to the on-screen map as well.

Lost Pets Back in development

Thought I may as well take advantage of having a public homepage and throw up some content about my app that’s currently in development, which I like to call LostPets.

The goal with LostPets is to create an app that sends a push notification to the user if they enter a certain radius of the last known location of a pet added to the database. This notification would have a picture, name and owner contact information for that pet, so that if the app user happened to spot it while in the area they would be able to contact the owner.
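The radius check itself boils down to a great-circle distance comparison against the pet’s last known location. A minimal sketch of that check (the 1 km threshold is made up, and the real app would hook into Android’s location services rather than a standalone function):

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def distance_km(lat1, lng1, lat2, lng2):
    # Haversine great-circle distance between two points, in kilometres
    d_lat, d_lng = radians(lat2 - lat1), radians(lng2 - lng1)
    a = sin(d_lat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(d_lng / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def should_notify(user_latlng, pet_latlng, radius_km=1.0):
    # True if the user has come within radius_km of the pet's last known location
    return distance_km(user_latlng[0], user_latlng[1],
                       pet_latlng[0], pet_latlng[1]) <= radius_km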

LostPets utilises Google’s Maps API to show pet locations.

screenshot1

There is a lot of work still to be done, particularly on the front end; I hope to have a working proof of concept finished by Christmas.