19 June 2017

AR and VR in data visualisation – can it ever be useful to our puny human minds?

The Register has just published a post on VR and dataviz featuring quotes from our MD, highlighting that VR may be more suitable for the "communicate" phase of a dataviz exercise than the "explore" phase.

The bit about "spent six months trying to get his firm's software to work in VR, but eventually decided to stick with monitors" is not 100% true - we just switched focus from native Oculus to WebGL, which then gave us the added benefit of browser support - as well as Cardboard (which people can actually afford!). We'll soon support Oculus through WebVR and, if there's the interest, a Unity-based player that reads Datascape output.

The bit about "it just needs a good engine to prepare the data first" is spot on though - and applies to every dataviz system, whether 3D, VR or plain old 2D! We still spend more time massaging data than we do actually generating the visualisation from cleaned and enriched data.

Otherwise the general thrust is right - we think VR is better suited to the end-of-pipeline task of sharing and communicating your data. If you do want to use 3D (and for a lot of use cases we think you should) then you're better off doing it on an ordinary monitor but in a 3D (flight-sim style) environment - one you can work in all day without getting nauseous, whilst still being able to communicate with colleagues. And then of course just click the Publish button to generate a web and VR version to share!

Why not download Datascape now to give both modes a try!

Birmingham Open Data - Traffic Levels

For some time we've been meaning to plot some of the data coming out of Birmingham City Council's excellent open data initiative. So today we finally got around to downloading some datasets from their Open Data Factory - and there certainly seems to be a lot of good and usable data there.

The first dataset we've tried is the annual vehicle traffic counts for about 160 sites across the city. The only real issue was that locations were given as National Grid References, so we did a simple linear conversion to Lat/Long based on some known co-ordinates in the city. Since some of the data points represent 4km of road, we don't think the resulting error is significant!
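As a sketch, that linear conversion might look like this - the two calibration points below are rough, invented Birmingham-area figures, not the ones we actually used:

```python
# Illustrative linear conversion from National Grid (easting/northing in
# metres) to lat/long, calibrated from two known points in the city.
# The calibration values here are invented for the sketch.
E_A, N_A, LAT_A, LON_A = 400000, 280000, 52.417, -2.001  # known point A
E_B, N_B, LAT_B, LON_B = 410000, 295000, 52.552, -1.854  # known point B

def grid_to_latlon(easting, northing):
    """Linearly interpolate latitude from northing and longitude from easting."""
    lat = LAT_A + (northing - N_A) * (LAT_B - LAT_A) / (N_B - N_A)
    lon = LON_A + (easting - E_A) * (LON_B - LON_A) / (E_B - E_A)
    return lat, lon
```

Over a city-sized area the error from treating the grid as flat is small - well within the tolerance described above, given that a single count site can represent 4km of road.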

We used a simple geotemporal plot, with a disc for each year's data stacked on top of each other - so each site produces a column of varying-width discs, the width/radius being proportional to traffic levels. To aid immediate visual understanding we also mapped traffic levels to colour in a simple heat map.
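The value-to-glyph mapping can be sketched as follows - the normalisation and the blue-to-red colour ramp are illustrative choices, not Datascape's actual internals:

```python
def traffic_to_disc(count, max_count, max_radius=1.0):
    """Map a traffic count to a disc radius and a simple heat-map colour.

    Radius is proportional to the count; colour runs from blue (low)
    to red (high), returned as an (r, g, b) tuple.
    """
    t = max(0.0, min(1.0, count / max_count))  # normalise to 0..1, clamped
    radius = t * max_radius
    colour = (t, 0.0, 1.0 - t)                 # blue -> red ramp
    return radius, colour
```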

The resultant visualisation is at: http://live.datascapevr.com/viewer/?wid=4e5d4cd4-c987-43a2-bdda-24ef747bc57b

Just click the link to fly through in 3D in your browser, or 3D/VR on your smartphone + Google Cardboard.

The most immediate observation from the visualisation is how little the data has changed over 15 years. There is no major sense of traffic levels around the city booming - some minor increases at some of the sites, but by no means all. It's also obvious that the M6 and A34(M) are, hardly surprisingly, carrying the biggest traffic loads, followed by the Bristol Road and then the other main arterial routes.

Using 3D to stack the data also helps to highlight artefacts from data collection - something that Datascape always seems to make easy to find. In this case it's sites like the one below, for instance, where a single sensor is replaced by two sensors in order to get better resolution.

There are also some quite complex changes, such as when the M6 Toll opened in 2003, with one sensor being replaced by several, and then some further consolidation.

We can also see significant changes in inner city monitoring with several sites being phased out.

And finally, this M6 sensor appears to show a massive drop (111,497 to 19,753) - but could this be due to a change in the A47/J5 layout?

Other M6 sensors don't show a big drop post-2003/M6 Toll, so it's unlikely to be that - in fact none of the M6 sensors show any big post-toll change, except possibly a minor drop, soon recovered, for the ones straight after the junction. Here are just the M6 sites for reference.

Daniel, Humza and Julio - Work Experience

During 5-9 June we played host to three Yr10 students from Aston University Engineering Academy on work experience. Two of them (Daniel and Humza) had already spent a week with us earlier in the year. Here is their collective report (verbatim).



Chatbot & ChatScript

Monday - We were assigned our tasks and our mentors to help us with our tasks. Adam showed me what a chatbot is and an example of one. Then he showed me the coding behind the bot.
Tuesday - I started to write some of my own code for the Henry Bot. Just some of the basics. Towards the end of the day Adam asked me to think of a topic for me to program my own bot on.
Wednesday - I couldn’t think of a topic so Adam chose one for me. I then had the rest of the day to code my bot and occasionally getting help from Adam. The image below shows me chatting to the bot.

Thursday - I was close to finishing my bot, Adam wanted me to start learning HTML and CSS coding, so I could create a webpage to talk to the bot online.
Friday: Adam was not here so I was not able finish my web page for the bot. Overall, I have had a good week and learnt a lot of skills that I can use later in life.



Monday - Out of the three options we were given, I chose to complete the Fieldscapes task. For this I had to download the Fieldscapes app onto my phone and was given a few hours to test out most of the maps and report any bugs I found. Alongside this, I gave feedback of suggestions or any improvements they could make. The image shows some of my suggestions.

Tuesday - I had the task of searching for 3D models of props that would replace existing props. However, while looking for models I had to consider the different file types that Unity accepts and if they fit into the criteria.
Wednesday -  On Wednesday, I created a word document that explains to a new user how each widget works and what it does. For example, the Tape Measure. For me to complete this task, I had to use Fieldscapes and try out each widget. Towards the end of the day, Humza and I created videos showing how to play Fieldscapes on mobile.
Thursday - After David purchased the props, I imported them into Unity and began resizing and texturing them. However, I mainly focused on an idea that David came up with regarding an intractable chess board. This required me to texture, resize and add a collider to each chess piece. I resized the new props by using various old props as reference to what size ‘roughly’ they should be.
Friday - On the last day, I finished up previous scripts and wrote my blog.


Amazon Alexa Skill & JavaScript

Monday - On the first day at Daden I had to learn how to code with JavaScript, this meant that I had to use code academy and complete the course on JavaScript

Tuesday - On the next day with my new knowledge of JavaScript I was given the task of producing a new skill for Alexa, this was supposed to take in a question which would then output a response, the coding was in JavaScript. The skill was based on the fact that Alexa had to say, “Hello world” due to a trigger which would have been my question. The image is a screen shot of the Alexa skill building app.

Wednesday - Being halfway through the week I was tasked with exploring a new objective which was to use Zap works to create some augmented reality content. This website allowed the user to create a storage of the information which a marker would trigger the “pop up” of the context which could either be images, videos, soundtracks, or information which would be linked to websites. There were different stages that had different levels of difficulty in the making but there was more manipulation over the content which meant that augmented reality could be created.

Thursday - On Thursday I carried on with the building of three new intents for Alexa, this time alone, when I finished this I had to learn how to use switch statements to reduce the amount of code there was and to make the code more efficient, once this was done I had to change the utterances of the skills to make grammatically correct so that it makes sense to the person saying the skill. Finally, I had to find out why the response that Alexa gave started with “undefined” but I didn’t manage to do it so I had to use a replace method in order to replace undefined with “The answer is ”. The image shows a switch statement I wrote in my Alexa skill.

Friday – On the last day of my work experience I had to create this blog as my final task.

15 June 2017

Fieldscapes Android App now on Play Store

Fieldscapes is now available for Android. The Android app will let you play any* Fieldscapes exercise. It supports both 1st and 3rd person views, and single and multi-user modes, and even flying!

You must have a (free) Fieldscapes account in order to use it.

You can download the app from the Play Store by searching for Fieldscapes, or view the page at: https://play.google.com/store/apps/details?id=com.DadenLimited.Fieldscapes

Video at: https://www.youtube.com/watch?v=OXeBDZW1yYA

Thanks to our work experience students Humza, Daniel and Julio for putting the raw video footage together and acting as hand models!

*Location and Inventory creators must upload separate Android versions of their assets in order for an exercise to work on Android - but this is a 5 minute process, and does not need to be done for each exercise.

8 June 2017

Fieldscapes and Trainingscapes Live

Today we're pleased to announce that Fieldscapes is formally live.

This represents the culmination of over two years' work, which started with the InnovateUK Design for Impact Feasibility Study, was followed by development funding from InnovateUK through until October last year, and then by our closed and open beta since.

During the Design for Impact project we worked closely with our partners The Open University, the Field Studies Council and service design consultancy Design Thinkers - plus numerous schools and universities, and some lovely physical field-trips - to create a service (not just the software!) that lets educators create and share 3D and VR immersive field-trips, and almost any other lesson, without needing specialist 3D skills. We've done this by clearly separating the two key tasks involved: creation of the 3D assets (which is likely to remain a technical task for some time to come - but not beyond the ability of a keen amateur), and creation of the lesson plan and pedagogy.

For this second element we think we've created an easy-to-use, intuitive, forms-based, what-you-see-is-what-you-get editor, where educators can just walk out onto their chosen location, place the props which the student will interact with, and then define the interactions that the student will have.

The resulting lesson can then be accessed, in single or multi-user mode, from a PC, Mac or Android device (iOS to come), and also with an Oculus Rift VR headset if available (other VR headsets to follow).

To us, though, the key is that once you have created a lesson you can share it with other educators anywhere in the world. And you can give them permission to copy your exercise so that they can customise it to the needs of their students, and translate it into their own language.

So please, check out our videos of how to use Fieldscapes, see our gallery of example locations and lessons (and we stress these are examples only - we want you to be the creators of content, not us!), and then register and download the app and try out existing lessons and start to create your own. We only charge once you start to open exercises up to your students.

For those who want a dedicated system rather than a shared one - be they educators, commercial trainers or L&D staff - we are also now offering Trainingscapes: the same technology as Fieldscapes, but offered as a dedicated instance for each client, branded to that client, and where the client is in complete control over access. We'll have a dedicated Trainingscapes demo shortly, but in the meantime sign up for Fieldscapes to get a feel for the system.

If you need any help in getting started we're running regular webinars, but also please don't hesitate to email support@fieldscapesvr.com and we can set up a Skype or similar session to get you going.

And if you'd like more information on Fieldscapes or Trainingscapes, or a face-to-face or web demo, then again please just call us on +44 121 250 5678, or email info@daden.co.uk or use the contact form.

26 May 2017

General Election: 2015 Data - Turnout Data

Link for fly-through: http://live.datascapevr.com/viewer/?wid=c26b2318-ec14-457a-9ae5-dc64bc0fc6e1

I was just testing out a minor update to Datascape when I decided to plot the 2015 General Election results data on a scatter plot, rather than geographically. I chose:

  • X: electorate
  • Y: Margin/majority
  • Z: Turnout
  • Size: Size of UKIP vote
  • Colour: Winning party

A quick fly around of the data revealed a number of interesting points:

In the high margin/majority space it is the Conservatives that dominate - more big-majority, safe heartland seats.

It is at the small majority end where we tend to see the smaller parties - few of them have massive majorities in their seats.

The SNP wins are generally associated with very high turnouts. The (yellow) dots are small as there was little or no UKIP voting, and that is what we've mapped to radius.

And most striking of all: looking along the turnout axis, the Conservatives dominate in the high-turnout seats and Labour in the low-turnout seats. Cause or effect?

Update: 7 June 17

Here's another visualisation we really like - just showing the vote for each of the main parties across all the constituencies. The fly-through link is:


18 May 2017

London Underground Data in Datascape

Inspired by the TfL talk at VRWorld I thought it was about time we got some TfL data in Datascape, so I managed to put this together before I got home from the conference.

Doogal's site has a downloadable list of tube station locations (lat/long) (https://www.doogal.co.uk/london_stations.php), which was a good start. The TfL Open Data site, after registration, then let me download the 2016 weekday station entrance counts (daily averages, I assume) for 15-minute slots through the day. A bit of Perl scripting combined the two files and split the TfL table into a single column, with some tidying to get the station names to match! Then into Datascape. In all these visualisations height represents the 24-hour day, each disc is a data sample for a 15-minute period, and width and colour are proportional to the number of people entering the station at that time.
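We used Perl, but as a sketch here is the equivalent join-and-flatten step in pandas - the column names and sample figures below are invented, not the actual file headers:

```python
import pandas as pd
from io import StringIO

# Invented samples standing in for the two downloaded files.
stations = pd.read_csv(StringIO(
    "Station,Lat,Lon\n"
    "Waterloo,51.5031,-0.1132\n"
    "Oxford Circus,51.5152,-0.1419\n"))
counts = pd.read_csv(StringIO(
    "Station,0700,0715\n"
    "Waterloo,5200,6100\n"
    "Oxford Circus,1800,2100\n"))

# Join on station name (this is where the name-tidying matters!),
# then flatten the per-slot columns into one long 'Entries' column.
merged = stations.merge(counts, on="Station")
long_form = merged.melt(id_vars=["Station", "Lat", "Lon"],
                        var_name="Time", value_name="Entries")
```

The long form gives one row per station per 15-minute slot - i.e. one disc per row in the visualisation.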

You can fly through a subset of the data yourself in 3D or VR (Cardboard) by following this link in any WebGL capable browser: 

Full dataset - the column with a fat base at the centre of the map is Waterloo

Flying in to Oxford Circus - bulging at home time

Flying out along the Northern Line - note some stations have an evening bump as well as a morning one

Showing a sample every 3 hours in the Web/VR version to get general trend

Close up of Web/VR version - we limit to 5-10k points at the moment for performance

We'll explore some of the other TfL datasets over the next few months, and we can support multiple datasets in a visualisation so you could switch between entrance and exit data, or weekdays and weekends.

Replacing the Open Street Map map with a 3D model is certainly possible - we can do model imports ourselves but haven't added it yet as a public function. And moving the whole experience into Hololens is certainly something we'd like to do in the future.

But for now you can just download a trial of Datascape for free and start publishing your data visualisations in 3D on the PC, web and VR.

A short unedited video flythrough is below - but try the link at top for the full 3D/VR experience.

All the data is (c) 2017 Transport for London and used under the Open Government Licence v2.0.

17 May 2017

VR World Report

I spent the last couple of days at VR World down in London. The event was busy pretty much the whole time, and the three open lecture areas had a good mix of talks and discussions. There were also a good number of exhibitors showing a range of AR/VR apps and kit, although nothing that really blew me away.

Photo report below, but main takeaways were:

  • Almost more AR/MR (Hololens) than VR
  • Far more Vive than Oculus (can understand that from a developer's perspective)
  • A few Cardboard based apps
  • Nobody showing Gear VR - whereas previous events I've been to have been full of them
  • A few haptic input devices, but no-one showing gloves
  • Still a lot of 360 video/photosphere stuff
  • Does stringing together a set of other people's VR videos count as a presentation?
  • Some people had really been drinking the VR Kool-Aid with the "this will change the world by 2020" type speeches and stats - it won't, it's just another tool
  • Very few people showing analysis frameworks of how this all fits together
  • A few people showing some good evaluation stats, and even more calling for everyone to share them - we've been calling for that for ages
  • Nobody really doing data visualisation
  • Just one company doing authoring - and more a Unity-light approach for simple photosphere menus
  • More doing training than education

Now the photos:

How VR can fill the gap in medical training

Some nice promo type work from JauntVR

And some nice stats about impact

Nice guidelines on MR (and VR) development from Viscopic

Some great data from Touch Surgery - and this was 3D not VR surgery training - surgeons did better than trainees - so valid

The learning effect - with repetition people got better

Proper control group testing

More improvement in the group using the 3D trainer

A touch interface - but one interface too many on the demo rig?

A physical labyrinth explored with Vive and backpack PC

Fracture showing some nice Hololens demos of city data - see below

Affordable and easily integratable slippery-feet walking controller - may well integrate with Fieldscapes

Hollywood production values in Rolls-Royce robot ship control demo

Very neat though - and no-one wearing any headsets!

Better view of Fracture in the TfL demo - see next blog post

10 May 2017

2015 General Election Results - Risk vs UKIP

Click link for live 3D fly-through: http://live.datascapevr.com/viewer/?wid=31a96f33-6196-4dcb-b408-4e1c6fa6aee9

We've started to play around with election data in Datascape. We're not political analysts, so please forgive any errors/simplicity in the analysis - we're just seeing if we can get Datascape to give us some interesting views and perspectives on the data.

In this visualisation we've plotted the 2015 results for each constituency (exc NI - the ONS file doesn't have the geocodes).

The key is:

  • Colour = winning party
  • Height = 1/margin between 1st and 2nd place
  • Width = Number of UKIP votes

Broadly speaking, if any column has an obvious thickness to it then the number of UKIP votes is at least the size of the majority/margin. One of the current lines of analysis suggests that many of the UKIP votes will go Conservative - so Labour (red) seats with a reasonably small majority (almost any appreciable height in our visualisation) and a large UKIP vote (any width) could be good candidates to go Tory.
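That rule of thumb can be written down directly (the figures in the note below are invented, not real constituency results):

```python
def could_switch(winner_votes, runner_up_votes, ukip_votes):
    """True when the UKIP vote is at least the winning margin - i.e. the
    column would look 'thick' in the visualisation, where width is the
    UKIP vote and height is 1/margin."""
    margin = winner_votes - runner_up_votes
    return ukip_votes >= margin
```

For example, a seat won 20,000 to 18,000 with 5,000 UKIP votes would flag as a candidate to switch, while one won 20,000 to 10,000 with 3,000 UKIP votes would not.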

In the office we're going to start to work up, from this and other visualisations, our predictions of which seats are going to switch - and if we're brave we'll post them here the day before the election!

Just click on the link above to fly through the data, and hover over any bar to see the detail. If you've got Google Cardboard then you can even fly through the data in VR!

Note: You might see that there's a UKIP win over the Greens in Buckingham in middle England - which didn't happen! The reason is that the ONS/Electoral Commission data for that seat appears to be missing entries for the three major parties (and others) - so according to the data it's a UKIP win. This actually highlights Datascape's ability to immediately draw the eye to errors in the data!

9 May 2017

Trainingscapes Website Live

Our Trainingscapes website is now live. This is the professional/commercial/vocational counterpart to Fieldscapes. Whereas Fieldscapes is a single, shared service aimed at educators, Trainingscapes - using the same core technology - is delivered as a standalone instance for each client, branded to them, and run either hosted or on their own servers.

The Trainingscapes web site also has case studies of our work over the last decade in 3D immersive training on a wide variety of projects, many of which were delivered on earlier versions of the PIVOTE authoring engine which underlies Fieldscapes and Trainingscapes.

You can check out the new web site at www.trainingscapesvr.com.

8 May 2017

General Election data in Datascape #1

With the General Election looming in the UK we've been keen to get constituency location data into Datascape so we can do some good plots of the historic, poll and results data. We finally tracked down a good source at the ONS Open Geography portal - thanks, guys.

The plot above shows each constituency's centre point and a rough idea of its size. We're not doing detailed boundary work yet - it would be hard to see for the whole country anyway, given the huge range in sizes - most 2D maps use pop-outs for the major cities.

There will no doubt be issues over a) boundary changes and b) non-standard spellings of constituency names - but at least it gives us a start!

16:39 Update

And first plot already - merged in electorate (height) and 2015 turnout (colour).

Click on the link below to launch in your browser or view in MobileVR on your smartphone/Cardboard.