Project Progress (18/02/2018)

It’s been a little while since I last posted, so it’s about time to update all of you on the progress I have been making with my AR Battleships game, and I think it’s coming on pretty well if I do say so myself!

I have been making quite a bit of progress with the practical side of this project, bringing together a lot of the different features into one Unity project. Scarily, we only have 8 weeks left to complete the project, which isn’t a lot of time; however, I’m confident that by the end of it this will be a high-standard piece of work that is also fun and innovative. I’m going to break this post down into the different aspects of work I have been doing, such as 3D modelling, problems I have overcome with the AR grid, voice recognition testing and networking.

Watson Speech-To-Text

The first piece of technology I implemented into this project was the Watson Speech-To-Text service as this was the one I was least comfortable with. After looking at a few examples and tutorials online, I managed to create a simple demo showing it working, allowing the user to highlight a chosen grid cell by saying its name, e.g. “A3”.

At first it worked perfectly fine; however, I forgot to bear in mind that I was testing in a quiet environment with a good-quality headset microphone, so when it came to showing other people in different places, the application either took a while to recognise the words or just straight up didn’t work at all.

The first thing I had overlooked was a property attached to the Speech-To-Text class called “Silence Threshold”, which controls the volume level it deems to be “silent”. Originally this was set very low, which caused huge amounts of lag when testing outside as it was picking up all the background noise, so I raised it substantially, fixing a lot of the problems and allowing the application to be used outside.
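To give you an idea of the fix, it boils down to a one-line change. This is a hedged sketch (the exact property name and scale depend on the version of the Watson Unity SDK you're on, and `_speechToText` is just my placeholder for the service object):

```csharp
// Anything quieter than this threshold is treated as silence by the service.
// Originally it was near zero, so background noise was constantly treated
// as speech, keeping the recogniser busy and making the app feel laggy.
_speechToText.SilenceThreshold = 0.1f; // raised substantially for noisy rooms
```

One line, but it made the difference between the app being demo-only and actually usable outside.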

Another issue I saw happening a lot was the Speech-To-Text service getting confused between ‘C’, ‘E’ and ‘D’. This was happening because all these letters rhyme and sound very similar, which caused a lot of frustration when trying to choose one of these cells. After some thought, I decided the best way to get around this problem was to start using the phonetic alphabet as the keywords to listen for when saying the letter for a cell, as they are all very distinct words and can be easily memorised when playing. I would ideally use Watson’s machine learning to counteract this problem, so you can just say the letters normally, however I don’t have much time to figure out and understand the nature of how this would work, so using the phonetic alphabet is a good working solution for now.
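The phonetic-alphabet idea is easy to show in code. Below is an illustrative sketch of how the mapping can work (the class and method names are my own, not lifted from the project): the transcript coming back from Speech-To-Text, e.g. “charlie 3”, is parsed into column and row indices for the grid.

```csharp
using System;
using System.Collections.Generic;

// Map NATO phonetic words onto grid columns A-J, so "Charlie" -> column C.
// This sidesteps the service confusing rhyming letters like 'C', 'E' and 'D'.
static class CellParser
{
    static readonly Dictionary<string, int> Columns = new Dictionary<string, int>
    {
        { "alpha", 0 }, { "bravo", 1 }, { "charlie", 2 }, { "delta", 3 },
        { "echo", 4 }, { "foxtrot", 5 }, { "golf", 6 }, { "hotel", 7 },
        { "india", 8 }, { "juliett", 9 }
    };

    // Parse a transcript like "charlie 3" into (column, row) indices.
    // Assumes the row number arrives as a digit; a real build might also
    // need to handle number words ("three"), depending on the transcription.
    public static (int col, int row)? Parse(string transcript)
    {
        var words = transcript.ToLowerInvariant()
                              .Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
        if (words.Length != 2) return null;
        if (!Columns.TryGetValue(words[0], out int col)) return null;
        if (!int.TryParse(words[1], out int row)) return null;
        return (col, row);
    }
}
```

Because every keyword is a distinct multi-syllable word, the recogniser has much more to work with than a single vowel sound.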

New Grid

The original grid I created for the first demo was good for testing the Watson Speech-To-Text service, however it didn’t look great, and comparing it to pictures of real Battleships grids it was also wrong: the cell labels belong on the outside of the grid rather than on the cells themselves. With this in mind I decided it was a good idea to create my own grid cells using a 3D modelling package (in my case Blender), because this lets me keep the scaling of the models uniform and will help me down the line when I need to split the battleship models up between the cells.

I had never practised any 3D modelling before, so I figured it was as good a time as any to get started, learning the basics from different YouTube channels. I mainly followed a channel called Blender Guru (I’ll link his channel here), as he has a great way of explaining the different tools and techniques you need to create 3D models within the software. After a few hours of practice I was able to create a simple grid cell model, a ‘boat component’ model (to let me test boat placement) and start a very basic boat model that I will use once the placement of the boats on the grid has been perfected.

Boat Positioning

After getting the Speech-To-Text service up and running and creating the new grid cells, I moved on to the AR aspect of the project. Unity’s 2017 release includes Vuforia (my chosen AR API) as a built-in feature that you can toggle on/off, making it really simple to set up and get working correctly.

The first big programming task, after getting the grid displaying correctly through AR, was to implement the ability to position the ships on the grid at the start of the round. I decided to use touch input and raycasting to control the positioning of ships, as I thought using voice commands for this could be quite clunky and long-winded; however, if I have some time at the end of the project I may include this feature.
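For anyone curious what the touch-plus-raycast approach looks like, here is a rough sketch (the `GridCell` component and field names are mine, invented for illustration): the touch position is turned into a ray from the AR camera, and whichever cell collider it hits becomes the selected cell.

```csharp
using UnityEngine;

// Rough sketch of touch selection: fire a ray from the camera through the
// touch point and see which grid cell collider it lands on.
public class CellPicker : MonoBehaviour
{
    [SerializeField] private Camera arCamera; // the camera rendering the AR scene

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        Ray ray = arCamera.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit))
        {
            // Each cell model carries a component identifying it; a ship
            // being placed would then be snapped onto this cell.
            var cell = hit.collider.GetComponent<GridCell>();
            if (cell != null) cell.Select();
        }
    }
}
```

This only needs a collider on each cell prefab, and it works the same whether the grid has been scaled up or down.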

It seemed like an easy task at first, however I ran into a lot of problems with things like scaling, keeping track of free cells and the rotation of the ships. After a few hours of work, though, everything seemed to work perfectly and I was happy with the result. I have built it in such a way that any shape (a cross, a T shape etc.), as well as the usual straight Battleships boats, will work with the positioning mechanic, so if I wanted I could use some different shapes to make the gameplay a bit more interesting.
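The trick that makes arbitrary shapes possible is representing a ship as a list of cell offsets from an anchor cell, so straight ships, crosses and T-shapes all go through the same placement check. Here is a sketch of that idea (the naming is my own, not the project's actual code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class Placement
{
    // Rotate a shape 90 degrees clockwise around its anchor:
    // offset (row, col) becomes (col, -row).
    public static List<(int row, int col)> Rotate(IEnumerable<(int row, int col)> shape)
        => shape.Select(o => (o.col, -o.row)).ToList();

    // A placement is legal if every cell of the shape lands on the board
    // and is still free.
    public static bool CanPlace(bool[,] occupied, IEnumerable<(int row, int col)> shape,
                                int anchorRow, int anchorCol)
    {
        foreach (var (dr, dc) in shape)
        {
            int r = anchorRow + dr, c = anchorCol + dc;
            if (r < 0 || c < 0 || r >= occupied.GetLength(0) || c >= occupied.GetLength(1))
                return false; // hangs off the edge of the grid
            if (occupied[r, c])
                return false; // another ship is already here
        }
        return true;
    }
}
```

Because the check never assumes the shape is a straight line, swapping in a cross or T-shape is just a matter of changing the offset list.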

A video showing how it works can be seen below.

Grid Scaling

An issue I saw early on in prototyping was that the overall scale of the AR grid would have to change depending on the environment the user is playing in: the playing space could be a large room or, on the other hand, a small table on a train. To handle this I implemented a simple solution, making all the objects being augmented children of a “scaling” parent, which the user can scale with a slider in the options menu, so that everything in the augmented scene scales uniformly. This means the user can change the scale at any point during the game without affecting any gameplay aspects.
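The whole mechanism fits in a few lines, sketched below (field names are mine): the slider drives the local scale of the single root transform, and because all gameplay logic lives in grid coordinates rather than world units, nothing else needs to know about it.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Every augmented object is a child of one "scaling" root, so a single
// slider can resize the whole scene uniformly.
public class SceneScaler : MonoBehaviour
{
    [SerializeField] private Slider scaleSlider;  // slider in the options menu
    [SerializeField] private Transform scaleRoot; // parent of all AR objects

    void Start()
    {
        scaleSlider.onValueChanged.AddListener(scale =>
            scaleRoot.localScale = Vector3.one * scale);
    }
}
```

Keeping the scale on one parent also means raycasts and placements keep working unchanged, since child colliders scale along with everything else.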

Networking

The last big part of this project is the networking side of things. Networking is always a scary feature to think about, as a lot can go wrong due to things like connection speeds, packet loss etc., and I personally don’t have too much experience with it. However, the good thing about this project is that it is a turn-based game, making it drastically easier to implement online play compared to a real-time multiplayer game, where you have to worry about de-syncing issues and disconnections.

At this moment in time I haven’t implemented any online functionality in my application; however, I have gone through the Unity examples and other tutorials, bringing together all the information I need to get started, so hopefully in a week or two I can start creating more of the gameplay side of things and begin testing it against other players! How exciting!
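Nothing is built yet, but from the Unity (UNet) networking examples I've been reading, the turn-based nature means a whole turn can be one tiny message. A very rough sketch of the shape this might take (all names here are hypothetical, not working code from the project):

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch only: one Command carries a shot to the server, one ClientRpc
// broadcasts the result, and that is an entire turn on the wire.
public class TurnPlayer : NetworkBehaviour
{
    // Runs on the server when the local player fires at a cell.
    [Command]
    void CmdFireAt(int row, int col)
    {
        // The server would validate that it's this player's turn, resolve
        // hit/miss against the opponent's grid, then tell everyone.
        bool hit = false; // placeholder; real lookup happens server-side
        RpcShotResult(row, col, hit);
    }

    // Runs on every client so both boards can show the outcome.
    [ClientRpc]
    void RpcShotResult(int row, int col, bool hit)
    {
        Debug.Log($"Shot at ({row}, {col}) was " + (hit ? "a hit" : "a miss"));
    }
}
```

Since the server resolves every shot, there is no real-time state to keep in sync, which is exactly why turn-based networking feels so much more approachable.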

I am positive this will come with its own issues that I will have to work around or overcome, so you will have to wait and see how it goes in my next blog post, which should hopefully be out in about two weeks.

See you then!