Geospatial information analysis normally requires complex calculations and transformations between different representations. The Google Maps APIs are a great tool because they hide all the complexity of these operations. However, when the geospatial information you need to analyse doesn't come from Google Maps, things get more complicated.
Operations like finding the polygons that represent geographical places, or finding polygons that intersect other polygons, take significant time and intricate code to solve acceptably.
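To give a flavour of that intricacy, here's a minimal sketch of one such primitive: a ray-casting point-in-polygon test. This is purely illustrative (the post doesn't name a library or algorithm, and the coordinates are made up), but it shows the kind of geometry code you end up writing once Google Maps isn't doing it for you.

```python
def point_in_polygon(x, y, polygon):
    """Return True if (x, y) lies inside the polygon.

    polygon is a list of (x, y) vertices. The classic ray-casting
    algorithm counts how many polygon edges a horizontal ray from the
    point crosses - an odd count means the point is inside.
    """
    inside = False
    n = len(polygon)
    j = n - 1
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Does edge (j, i) straddle the horizontal line through y,
        # and does the ray cross it to the right of the point?
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

# A unit square as a stand-in for a geographic boundary.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(point_in_polygon(0.5, 0.5, square))  # True
print(point_in_polygon(1.5, 0.5, square))  # False
```

Even this simple test has subtle edge cases (vertices lying exactly on the ray, self-intersecting polygons), which is exactly why databases and APIs that handle it for you are so valuable.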
At Shine we are working on a solution for a large telecommunications customer that requires the ability to query large geospatial databases without degrading performance.
Shine excellence recognised as Innovation Finalist in NAB Supplier Awards 2014
Shine Technologies is proud to be acknowledged for excellence as one of three Finalists in the NAB Supplier Awards 2014, Innovation category. This acknowledgement demonstrates Shine’s commitment to being a partner over being a vendor.
Shine is proud to have been awarded the Computerworld Data+ award for our work with Google BigQuery. The work is a great example of using innovative technology to deliver business benefit. You can see the write-up of the award here:
Congratulations to the Shiners on the team: Luke, Pablo, Graham and Kon!
Shine Senior Developer Ben Teese will be speaking at YOW! Connected next week.
Held in Melbourne, Australia on September 8 and 9, YOW! Connected covers both Mobile Development and The Internet of Things.
Ben’s talk topic is ‘The State of the Mobile Web’ and is scheduled for 3:30pm on Tuesday. It’ll include advice about when to go web and when to go native, best practices for mobile web development, and a discussion of hybrid mobile apps.
If you’re attending the conference and run into Ben, be sure to say Hi!
When writing complex software, things don't always go as planned. You implement a new feature that works perfectly well locally and in a test environment, but when your code hits the real world, everything falls apart. Sadly, that's one of the things we all have to deal with as software developers. On a recent project for a major telecommunications client, we needed to process more than 20 million records every night. That equated to 5GB of data, and unfortunately the environment where our process ran had only 4GB of memory.
Processing such a vast amount of data brings a lot of challenges with it, especially when you also need to combine it with several million more records stored in a database. Code that retrieves associated information and transforms raw data might only add a few milliseconds per record. However, repeat that operation 20 million times and those few milliseconds easily turn into hours.
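As a back-of-envelope illustration (the 3 ms per-record figure and the batch numbers below are assumptions for the sake of the arithmetic, not our measured values), here's how quickly that overhead adds up, and how batching the database round trips amortises it:

```python
# How a small per-record overhead scales across 20 million records.
RECORDS = 20_000_000
PER_RECORD_OVERHEAD_S = 0.003  # e.g. one database round trip per record (assumed)

total_hours = RECORDS * PER_RECORD_OVERHEAD_S / 3600
print(f"{total_hours:.1f} hours")  # 16.7 hours - hopeless for a nightly batch

# Fetching associated rows in batches amortises the round-trip cost:
BATCH_SIZE = 10_000
PER_BATCH_OVERHEAD_S = 0.05    # one round trip per batch (also assumed)

batched_seconds = (RECORDS / BATCH_SIZE) * PER_BATCH_OVERHEAD_S
print(f"{batched_seconds:.0f} seconds")  # 100 seconds of round-trip overhead
```

The exact numbers don't matter; the point is that anything done once per record is multiplied by 20 million, so per-record overheads have to be driven towards zero.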
So there we were, asking ourselves why it was taking so long. Was it an index we'd forgotten to add to the database? A network latency problem? These things can be very hard to pin down.
We needed to think outside the box to get around this one.
The Kick-Off Meeting
It went something along the lines of:
- Client: “We have a new requirement for you..”
- Shiners: “Shoot..”
- Client: “We’d like you to come up with a solution that can insert 2 million rows per hour into a database, deliver real-time analytics, and have some animated charts visualising it. And, it should go without saying, it needs to be scalable so we can ramp up to 100 million rows per hour.”
- Shiners: [inaudible]
- Client: “Sorry, what was that?”
- Shiners: [inaudible]
- Client: “You’ll have to speak up guys..”
- Shiners: “Give us 6 weeks”
We delivered it in less than 4.