Category Archives: Body Data

What will the future look like?

Future Shock

We were pleased to be able to attend Future Shock – the trends, technology and movements that should be defining the 2015 election (but aren’t), held by Nesta at Millbank Tower, London.

Siegfried and Roy?

This event, when advertised, felt like quite a smoke-and-mirrors affair: the location and more specific information weren’t released until months after the initial sign-up. But it delivered on its promise of a solid discussion, with an array of panellists from the Financial Times, Channel 4, the University of Oxford and Movement for Change.

Tomorrow’s world

Giving us their visions of the future was nothing like Tomorrow’s World (which we remember as robot arms and hover cars). Instead we heard how economics could, and should, look (plus Paul Mason’s ideal: ‘Nationalise all grid infrastructure and introduce a universal basic income’); that there is nothing new about being afraid of change (just look at Orwell’s writing back in the 1940s), courtesy of Izzy Kaminska; that only the creative jobs won’t be automated in future, with a staggering 35% of jobs at high risk of automation (thanks, Dr Mike Osborne!); and that the only way we can effect change is by closing the gap between people and politics, from Kathryn Perera of Movement for Change.
They all obviously spoke for much longer than the one point I have credited to each of them, but the overall feeling was that we need change, and we need it to happen soon. Rather unfairly (I think), this won’t happen before next year’s election, or for some time yet. And while we shouldn’t be afraid of this change, we should approach it cautiously.

First session

We were very lucky to be given the choice of one of three sessions in the morning and three in the afternoon (I would have loved to go to them all!), but being from a digital agency with a strong focus on health, I chose “The Big Shift for Health: How technology can empower patients and stop the NHS going bust.”

How do we empower patients?

First up we heard from John Loder, Senior Programme Manager for Nesta who said “In Healthcare we expect technology to make things more complex, not easier…”
So why do they expect that? It could be because the current structure of the NHS worked when it was first set up in 1948, but today, with patients having the information they want at their fingertips, the hierarchical structure no longer works. The next, more confounding, problem is: what should this new structure look like? Surprisingly, no one’s actually sure, but the technology available and currently being developed should help us to frame it.
Take, for example, the work Apple is doing with its ‘Health’ app: once the tech is there, it will aggregate all the important data into a longer-term picture of someone’s health that can be presented to a doctor, rather than a quick test in a five-minute appointment.
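That aggregation step is simple enough to sketch in a few lines. This is a toy example with invented readings and field names, not Apple’s actual Health API: it collapses readings from several devices into one value per day, the kind of trend line a doctor could scan instead of relying on a single spot check.

```python
from statistics import mean

# Hypothetical daily resting heart-rate readings pulled from several
# sources (watch, phone, blood-pressure cuff), keyed by ISO date.
readings = {
    "2014-11-01": [62, 64],
    "2014-11-02": [61],
    "2014-11-03": [65, 63, 64],
}

def daily_summary(readings):
    """Collapse per-source readings into one average value per day,
    giving a clinician a trend line rather than a single spot check."""
    return {day: round(mean(values), 1) for day, values in sorted(readings.items())}

print(daily_summary(readings))
```

The same shape of code works for steps, sleep or glucose; the point is that many noisy per-device readings become one comparable number per day.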

Three people/companies making a difference

1.    Also take the work of Dr Vanessa Diaz, who we met at the IBME event at UCL earlier this year. She is part of a team building a model that becomes a virtual version of you. This helps physicians to test scenarios on a patient’s virtual self without being subjected to anything physically. This way they can work out the best outcome and tailor a patient’s treatment to the person, not the condition.
2.    And this is precisely what Geneix (pronounced Gene-iks) is about too (we saw Mark Bartlett, their CEO, talk at Future Shock): matching treatment to the individual, not the condition, using DNA and data. This improves the overall efficacy of the medication along with patient adherence, because patients are taking a medication that works for them. It can also reduce hospital readmissions and adverse reactions, and cut the cost of unnecessarily prescribed medications. Using data from your genes, they can see whether certain medications would work well for you and, where they wouldn’t, alert the doctor before prescribing them. So you, the patient, suffer no adverse reactions, and hospitals and doctors’ practices don’t expend resources fixing a situation that could have been avoided.
3.    Finally, Proteus Digital Health want to use your insides to get accurate data (we heard Barnaby Poulton, their regional director, talking at Future Shock). Using ingestible sensors powered by your stomach fluids, you can receive information like when your medication was taken, your heart rate, and your activity and rest. Sounds gimmicky? You’d think so, but when patients are given accurate, real-time data about themselves it empowers them to change their behaviours, like the time they take their medication or their overall adherence. It also helps HCPs see in real time how the medication they have prescribed is working, and whether the patient is taking it at the optimum time for them.
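The adherence figure such sensor data makes possible can be sketched roughly. Everything below is invented for illustration: the timestamps, the once-a-day dosing schedule and the one-hour tolerance window are assumptions, not Proteus’s actual method.

```python
from datetime import datetime, timedelta

# Hypothetical ingestion timestamps reported by an ingestible sensor.
doses_taken = [
    datetime(2014, 11, 1, 8, 5),
    datetime(2014, 11, 2, 8, 40),
    datetime(2014, 11, 4, 7, 55),
]

def adherence(doses_taken, start, days, scheduled_hour=8, window_minutes=60):
    """Fraction of scheduled daily doses confirmed within a tolerance
    window around the prescribed time."""
    hits = 0
    for d in range(days):
        target = (start + timedelta(days=d)).replace(hour=scheduled_hour, minute=0)
        if any(abs(t - target) <= timedelta(minutes=window_minutes) for t in doses_taken):
            hits += 1
    return hits / days

# Over four days, the missed dose on 3 November shows up immediately.
print(adherence(doses_taken, datetime(2014, 11, 1), 4))  # 0.75
```

A real system would be far more involved, but even this toy version shows how a patient (or their HCP) could see a missed dose the same day rather than at the next appointment.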

These all sound good, but how do we work towards this change?

Well, therein lies the rub. Before we can get the NHS to use technology more freely in order to help patients, we have to look at three factors.
1.    Patients need access to their own data – this is so true, and so necessary to help people make the right lifestyle choices and changes to their behaviours. They need something tangible they can access to track their health and see, first hand, the difference these behaviour changes make. This isn’t just wearables, but also seeing what decisions have been made in their care, when and how. There is a huge disconnect between patients and their data, and that data doesn’t follow them from NHS trust to NHS trust. Given the technology we already have, it is surprising that we still can’t get access without making a data protection request.

2.    Doctors need the data too – not just HCPs, but researchers as well, and we’re going some way towards this with the NHS care.data programme. Essentially, it anonymises data and makes it accessible to HCPs and medical researchers (and potentially pharma companies, for a fee) so they can see overall health impact data. While the NHS has been collating data about hospital admissions since 1980, it hasn’t been able to follow the full care path, including items like test results and prescriptions, which are essential for HCPs to see what works and what doesn’t. People have the option to opt out of their information being shared (which is obviously their right), but in doing so they could damage the accuracy of the information collated on wider issues, like public health. Anonymisation also means that researchers cannot contact the patient whose data they have – fine in terms of privacy, but if they found an actionable way of helping that person, or evidence that the treatment the patient had was not the best for them, they could not act on it.
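To illustrate the idea behind anonymised-but-linkable records, here is a generic pseudonymisation sketch. It is not care.data’s actual method, and the field names are invented: direct identifiers are stripped and replaced with a salted one-way hash, so one patient’s records link together across datasets without revealing who they are.

```python
import hashlib

# Illustrative record; field names are invented, not care.data's schema.
record = {
    "nhs_number": "943 476 5919",
    "name": "Jane Doe",
    "postcode": "LS1 4AP",
    "admission": "2014-03-02",
    "diagnosis": "E11 (type 2 diabetes)",
}

IDENTIFIERS = {"nhs_number", "name", "postcode"}

def pseudonymise(record, salt):
    """Strip direct identifiers, keeping a salted one-way hash so the
    same patient links across records without being re-identifiable."""
    token = hashlib.sha256((salt + record["nhs_number"]).encode()).hexdigest()[:12]
    clean = {k: v for k, v in record.items() if k not in IDENTIFIERS}
    clean["patient_token"] = token
    return clean

print(pseudonymise(record, salt="research-release-1"))
```

This also makes the trade-off in the paragraph above concrete: the hash is deliberately one-way, so a researcher holding the token has no route back to the patient even with good news to deliver.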

3.    The NHS/government needs to accept that it is not moving at the exponential rate of change of the technology industry. It also needs to accept that patients and HCPs will adopt technology more freely, be more willing to share their information (I’m talking millennials here, not baby boomers or older) and will want information that is faster, easier to access and more accurate than ever before.

With the election coming up in 2015, inevitably changes won’t be as fast as we would like and it is unlikely that the technology and data reforms we need for an effective NHS will form part of any party’s policies. With MPs today debating whether to repeal parts of the 2012 Health and Social Care Act – debating whether to further privatise the NHS or not – we’ll be waiting a while for a real health technology bill to be enacted.

The Nanoparticle Platform - nanoparticles attached to blood cells

Google [X]’s Nanoparticle Platform – Ground-breaking medical reality or sketchy Sci-Fi?

In the latest of Google [X]’s announcements it’s now transpiring that they are developing nanotechnology that can detect warnings of cancer, heart attacks and other diseases in a person’s body. Early detection is the key to treating diseases, and Google [X] hopes their nanoparticle platform can revolutionise health awareness.

Nanotechnology – Simples!

The idea, as Andrew Conrad, Head of Life Sciences at Google [X], says, is simple. “You just swallow a pill with the nanoparticles in (they’re decorated with antibodies or molecules that detect other molecules), they course through your body, and because the core of these particles are magnetic, you can call them somewhere.” He cites the wrist as an ideal calling point, given the superficial veins close to the surface. Placing a magnet there would bring all the particles together, and they aim to develop a wristband that can not only recall these particles from around the body, but also extract data from them.

It should be noted that the pill will not be as big, weird, or dangerous as this recent experiment in ‘smart pills,’ as nanoparticles are so tiny that thousands of them could fit inside a blood cell (a blood cell is about 8 thousandths of a millimetre across).

The nanoparticle platform would ‘report’ on the human body. Swallowing a pill releases nano-field workers, circulating inside the body, collecting information. When called together by the boss-magnet, they share data and compile a report on this particular body. This report is sent to external software for interpretation and diagnostics.

Google [X] currently seem to be devoted to proffering innovative solutions in health tech, as seen in their recent projects like glucose-measuring contact lenses for diabetics and the Baseline study. Dr Conrad has called this latest project in nanotechnology “the nexus between biology and engineering,” the idea being to “functionalise” nanoparticles to make them “behave in ways that we want them to do,” to gain a greater awareness of our body’s health.

So how does it actually work?

You may be asking. Well, so far Google [X] can only speculate on what they can actually make these nanoparticles do – development is still in very early stages. However, here’s what we can glean so far:

  • There are marked differences between healthy tissues and damaged or cancerous ones. Google’s ambition is that the nanoparticles will be able to identify and target these differences, then attach themselves to the damaged cells.
  • There will be lots of differently programmed nanoparticles tailored to match different cell conditions. They could be built to stick to a fragment of cancerous DNA or a cancerous cell, seek out evidence of fatty plaques in blood vessels, or flag up high levels of potassium (linked to kidney disease). These would work alongside another set that will constantly monitor the blood for unique traces of cancer, ensuring the earliest possible diagnosis.
  • Then, if that wasn’t complicated enough to develop, the circulating nanoparticles will somehow be able to retain the data they pick up from around the body when called together by the magnetic wristband.
  • Finally, by some other amazing technical feat, the nanoparticles transmit their data using non-invasive detection methods like light or radio waves to the wristband, which can present the findings as readable measurements.

You may have picked up on the incredulous tone here, and sadly, it reflects that of experts in the field.

Live long and prosper?

Although the concept of the nanoparticle platform itself may be simple to understand, at this stage it is just that – a mere concept, and turning it into a reality will not be as simple. Following last week’s announcement sceptics have already come forward, citing the enormity of the work, the body’s natural defence against foreign objects and the slightly fantastical hopes of Google as issues – but this just highlights why these projects are known as ‘moonshots’.

Indeed, as Chad Mirkin, director of the International Institute for Nanotechnology at Northwestern University, says, Google have described “an intent to do something, not a discovery or a pathway to get there.” He says that the technology is speculative, and at the moment the whole idea is basically “a good Star Trek episode.”

Another key concern is whether swallowing a pill of nanoparticles every day is all that safe. Developments in magnetic nanoparticle research for medical purposes have been going on for years, but a huge problem that comes up time and time again is their toxicity – hopefully Google [X] have something up their sleeves to solve this issue.

A Nano pill a day…

So, as we’ve come to expect from Google, their latest project is pretty ambitious. However, someone needs to be pushing the boundaries of health tech innovation, and it may as well be a company with the ideas and finances to support their endeavours.

If Google manage to pull this off (and within a decade they think they will) it could see a dramatic change in the way we interact with our healthcare providers. Conrad likened our current doctor-patient interactions to exploring Parisian culture by “flying a helicopter over Paris once a year,” whereas he believes the nanoparticle platform will allow “little particles go out and mingle with the people” in Paris, before being called back and asked “Hey, what did you see?” Essentially we wouldn’t need to go to the doctor and give blood or urine samples anymore; we’d simply swallow a pill to monitor our blood and upload the data to the cloud to send to our doctor. But could less face-to-face communication with your doctor really be a good thing?

The project is in exploratory phases for the time being, but Google [X] has already been talking about delivering medicine through the nanoparticle platform as well as abnormal cell detection. It seems the initial ideas are the focus for now though, and Conrad seems confident that they are firmly based in reality, saying: “we’ve done a number of promising experiments, so we’re going to keep going.”

phone displaying Figure 1

It Figures

Following recent developments in healthcare and technology (everyone’s getting in on it lately!), it’s no surprise that a new photo sharing app dubbed ‘Instagram for Doctors’ has been a great success, and is set to be rolled out across western Europe.

‘Figure 1’ is a medical photo-sharing app that brings healthcare professionals (HCPs) together in a global online community to discuss and share medical images of patients (with their permission). HCPs can add cases by uploading photos, as well as make use of the reference image library, search images by anatomy or speciality, and join in discussions.

The app was founded by Canadian Doctor Joshua Landy, MD, who says: “We developed Figure 1 so members of the healthcare community could share images, knowledge, and clinical insight with each other, while safeguarding patient privacy.”

The idea is so simple it’s a wonder nothing like Figure 1 has been launched before. The only services close to what Figure 1 offers require subscription fees. Previous methods of file sharing between HCPs, such as email and post (yes, they still use post!), have proved inefficient and slow, while other digital sharing methods, like MMS or WhatsApp, raise security concerns. These older methods only let a few people access a few images, but with Figure 1 an entire library and forum of over 50 million uploads is available at your fingertips, along with the collective knowledge of thousands of HCPs.

Co-operative Community

Figure 1 is free and currently available in North America, the UK, Ireland, Australia and New Zealand, with plans to expand quickly.

While anyone can download the app, only verified HCPs (who go through a rigorous identification process) can upload photos or comment on them, ensuring helpful, significant discussion from qualified participants. HCPs are also advised to notify their employers and patients that they’re using the app, and both HCPs and patients have to sign a digital consent form before any content can be uploaded.

Figure 1 makes for an excellent medical educational tool for students and qualified HCPs alike. It’s about making useful, real-life medical images easily accessible in a digital community that cares. HCPs can now not only get a second opinion but a third, fourth, fifth, etc. It’s a big breakthrough in harnessing the power of digitisation and social media for medical benefit, something the industry is notoriously slow to pick up on.

So, surely a hub of shared medical expertise can only mean good news?

It would seem not. As with any digital sharing service, there are privacy concerns surrounding Figure 1, particularly as the data being handled is both medical and personal.

However, Figure 1 was actually borne out of security concerns about other sharing methods. Dr Landy saw that doctors and medical students were using smartphones and social media to share information about patients in a way that didn’t protect patient privacy or store the records securely.

“Tens of thousands of times a day patient records and educational images are transferred from healthcare provider to healthcare provider,” Dr Landy says. “We were thinking of a way to try and preserve and protect that information in an archive that’s searchable and useful.”

So for Figure 1, security is paramount.

First things first – they safeguard patient privacy. All patient identities are obscured automatically by the app with face detection software, and HCPs can further obscure images if needed, for example to cover other identifying marks like tattoos. With each upload, users can choose whether to share with the entire community, a specific group or just one or two colleagues. Finally, all photographs have to go through a moderation process before they are published.

The company also operate a ‘no secrets’ policy. If you don’t keep any secrets you can’t lose them, so the app doesn’t store or access any patient records whatsoever.

However, online medical data breaches do happen, and they’re nothing new. Earlier this year saw the second largest medical data breach ever recorded by the US Department of Health and Human Services, when a network server hack affected 4.5 million individuals. 2013 saw the third largest, which affected over 4 million people.

However, these cyber-attacks are usually carried out to get patient information – name, address, contact numbers, payment info, etc. – that can be used for fraud and identity theft. Figure 1 doesn’t store any personal information or anything useful to others, unless you happen to really like pictures of skin diseases and x-rays.

The only data attached to an image is the user who uploaded it – the HCP. Even this one factor has raised concern. GP and author Dr Ellie Cannon says that while she thinks “it’s potentially really useful to share photos with medical students and other doctors,” she feels there’s an obvious “potential pitfall” in confidentiality: despite patient anonymity, “uploading from a certain doctor may go some way to identify a patient.”

However, Dr Landy claims that “Legally, we found that identifying the doctor does not identify the patient.”

Given that some members of the public will happily expose their strange ailments for the world to see on Channel 4’s Embarrassing Bodies, this may not be a great concern to a lot of patients. Nevertheless, Figure 1 looks pretty fool-proof. The company maintain that patient privacy is as much a priority for them as it is for HCPs.

The Bottom Line

Figure 1 offers incredible potential for sharing knowledge and creating new bodies of information globally, all stemming from a simple photograph. As most uploads are typically more complex cases that call for outside input, there’s an abundance of interesting and rare cases that many HCPs might otherwise never see. Getting a global discussion going on a unique case could contribute to huge developments in various medical fields.

The app store page is full of gushing reviews touting its educational potential: “Hands down one of the best apps I have on my phone. App is well put together and there’s never a shortage of fascinating cases. I learn something new every time I’m using it!” Although a fair amount of users do warn not to peruse the feed whilst eating.

On the whole, Figure 1 could greatly contribute to the promising outlook developing in the unfolding healthtech revolution.

Old chart of anatomy

Google [x] and the Baseline Study

No, it’s not a new ‘hipster-esque’ band name. Google [x] is the research arm of Google, responsible for the likes of Google Glass, driverless cars and Google Contact Lenses; the Baseline Study is its latest and greatest project, tackling one of the major mysteries of the human world: the human body.

Gray area?

Gray’s Anatomy – considered one of the most influential books in the study of the human body – could be seen as the beta version of this particular Google project. Through this book, people could see anatomically correct drawings and explanations of the parts of the human body. It was initially written as a cheaper, more accessible textbook for students, but is now (in its 40th edition) revered not only as a piece of history but as a still very relevant (it has been updated through the years) textbook for students and professionals alike.

While the book gave a huge insight into how the human body looks on the inside, Google’s aim is to show us not just how the body looks but how it works – for healthy humans and for humans with different conditions and diseases – giving us a huge insight into just how the body is affected by these things.

Shoot for the moon

Hailed by some as one of Google’s most difficult projects ever undertaken, this project (known as a ‘moonshot’) is seen by others as a step backwards for a company that is largely about the development of technology. However, the aim of the study is prevention rather than cure: we should know what a body in perfect working order looks like, so that indicators (or biomarkers) of potential problems can be found ahead of any likely issues, and those issues treated. “If we really wanted to be proactive, what would we need to know? You need to know what the fixed, well-running thing should look like,” says Dr Conrad, who is running the early stages of the project.

Ten things you should know about the Baseline Study:

  • They’re starting with a small test group of 175 people, later moving on to thousands.
  • The initial tests involve samples of blood, saliva, urine and tears.
  • When the pilot study is complete, the main study will collect details like genetic history, how food is metabolised and how participants’ hearts beat under stress.
  • The study is completely anonymous, so no personal information will be attached to the body data.
  • This is not an overnight project; it will take a long time to get anywhere near a complete stage.
  • It is likely that participants will wear the Google contact lens to measure blood glucose levels.
  • Google is developing wearable technology to complement this study so they can monitor the subjects effectively.
  • The same study failed ten years ago because the costs were too high: sequencing the human genome then cost $100 million, but now comes in at around $1,000 (bargain!).
  • This project isn’t about commercial gain (unlike other projects), but about Google’s main mission ‘organizing the world’s information and making it universally accessible and useful…’
  • A miniaturised man in a spaceship is not going to be injected into the subjects, like in the 1987 movie ‘Innerspace’ (although that would be kind of cool).

The collection and study of body data is not a new concept, and it is one that life sciences and pharma companies, as well as hospitals and universities, are chasing. The possibilities for research and development are huge: it will inform best practice for the treatment and management of conditions, and give a more complete picture of the body as a whole, not just the parts affected by illness.

Google can probably learn a lot more for this project from the ‘self-quantifiers’ than they may think; many seem keen to share their vital stats through apps, and their health and sickness experiences through forums and blogs, so there are probably a lot of clues to condition indicators out there already. Regardless, this is an exciting project with endless possibilities, and if Google don’t use it for commercial gain, as they claim, then maybe they aren’t the force for evil some may think.

Self care in the digital age

Self-care in the digital age

Hosted by dallas (delivering assisted living lifestyles at scale) and The King’s Fund, Self-Care in the Digital Age, an event held at the end of June, brought together health care professionals and digital health innovators to discuss self-care and the future of digital innovation/integration.

The dallas programme – developed by the UK’s innovation agency, the Technology Strategy Board, and jointly funded by the National Institute for Health Research and the Scottish Government – tasked four groups with running a huge-scale innovation programme and testing it in their communities throughout the UK.

The four groups funded by dallas

Year Zero – a group dedicated to developing personal health records for everyone and creating a suite of health and care planning tools for people with long term health conditions.
Living it Up – supports better health, well-being and active lifestyles in Scotland and creating personalised health care experiences while keeping people connected.
Mi (More Independent) – helping people in Liverpool to live more independent lifestyles using technology.
i-focus – supporting the other three groups with interoperability and best practice while also integrating home sensor technologies into older people’s homes to notify their relatives or friends if, for example, the temperature in the house gets too low, or if an appliance that would normally be used regularly (such as a kettle) doesn’t appear to have been used. It would also send messages to the friends and family to let them know that their loved one is up and about, and alright.

These groups also had the opportunity to talk about their work and any teething problems in the development process. The biggest challenge Mi faced from their ‘consumers’ was that they were more concerned with safety and security than with their own health. Meanwhile, Living it Up have created community-driven content, developing a platform around the information their community says is important, rather than what an individual has decided is important for them, while Year Zero has brought together experts from media, technology, design and healthcare to develop person-centred tools.

The conference

The format of the conference involved the assembled health care professionals and key opinion leaders being asked questions relating to self-care and digital technologies, to generate debate, talk about their success stories (or hindrances), ask questions of the rest of the group and find out what work is already being done in the field. But not before watching a short video imagining ‘Terry’ and self-care in 2034.

What they all found (including members of the audience) was that one of the barriers to self-care is that the general public sometimes think tech is impersonal, and they find the lack of human interaction disconcerting; 17% of people don’t use technology because they’re scared of it, but they don’t want to learn how to use it either. Perhaps the people who would benefit most from this kind of technology are the ones most resistant to it. The key thing to remember is that people buy outcomes, not technology. For it to work, they have to understand how it will help them, and in turn the developers of this tech and these apps must understand the needs of the individual, not just create a sparkly piece of tech.

Going back to the video of Terry, the overriding question of the event was: why wait until 2034 for technology to be used in self-care? Why can’t it be used now? In many cases local authorities and NHS trusts are already experimenting with developments like these, mostly with great success. But it did lead me to question how ready the current older generation are for technology like this; when you think of the current age of uptake of digital devices, apps and health trackers, those users are probably the future ‘Terrys’. Another concern expressed was what happens to the data: who reviews the health data, filters it and adds it to a patient’s notes, and wouldn’t this drive costs up? The answer is to build it into the resource. By integrating it early on, it will become a standard rather than an add-on.

Design your old age

Right now we are being given the opportunity to redesign old age. Maybe we won’t be so reliant on nursing homes and hospitals, because the older generation will have greater independence by making technology work for them. Maybe being so connected will mean that people don’t wait weeks for a doctor’s appointment, but get the diagnosis they need in a day, and get treatment faster, meaning their condition doesn’t get worse. It’s important to remember that a growing number of people have decided not to have children, so who is going to take care of this childless generation? Being able to self-care is going to be vital in the future; otherwise they will be incredibly dependent on care services that are overstretched even now.

The tech is already beginning to appear. For example, Cue is a device that essentially brings a hospital lab into your home while taking up only 3 x 3 inches of space. Using a swab to collect a sample, the user puts this into a specific cartridge and into the machine, which then relays the information to their mobile phone. The further plan is that the information (in the case of flu, for example) will be easy to send on to your doctor, who can review it and get a prescription ready for collection, all at a few touches of a button. At the moment they only offer test cartridges for flu, inflammation, testosterone, fertility and vitamin D, but the plan is to offer a much larger suite of tests in the future.

A lot of the negative comments concerned the cost of development and implementation. Cost effectiveness of apps is one thing, but it left me questioning why some trusts can create apps and others can’t. Why is there no standardisation, or examples of trusts working together to save money and draw on larger-scale data collection to improve services? Although perhaps, before data management is tackled in apps and self-care, the ability to transfer patient records efficiently should be top of the agenda?

At the end of the conference we were asked: ‘What are you going to do differently?’ It made me think a lot about what I could do to help expedite the process of improving self-care for future generations. I know in my heart that our current older population will never fully embrace technology (not all of them, anyway). So surely, rather than putting in place holders for those who will never use them, we should develop the technologies we already have and fulfil existing ideas so they reach their full potential – so that people like ‘Terry’ have a far more user-friendly, user-driven and practical – not gimmicky – old age.

If you want to watch the conference click here.

Leeds Data Mill

Health in Numbers: Opening up our Body Data

Yesterday I jumped on the train to Leeds, not to catch the start of the Tour de France, but for something much more impactful – at least in terms of the future of healthcare.

Leeds Data Mill is an initiative that has been running for four months now, through which Leeds aims to become a nationwide leader in open data, to the benefit of the city. When we heard their current focus is health, we thought we’d pay the north a visit.

Time for the lightning round!

I spoke during the lightning talks, in which you have only 10 minutes to deliver your story. Since we were talking about open data, I chose to discuss the vast amounts of data that everyday people create: body data. Anyone who wears an activity tracker, or uses their smartphone to track steps, sleep, food and more, is generating huge amounts of body data – often readily available as downloadable .csv files or through APIs.

Now, consider that the number of people using these devices is going to grow. Both Apple and Google are releasing health platforms, and as we all know, once Apple launches something, it tends to become ‘normal’ and go mainstream. So we expect self-tracking to move from a minority user base to the majority of everyday people. Imagine all this body data in one database. How powerful would that be for researchers, the NHS, or even private companies willing to pay for the data? With strict data privacy and clear licences of ownership, it may well be possible to build the largest research sample in the world via a crowdsourced network of self-trackers.
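As a rough sketch of how one person’s export might feed such a database, here is a toy parser for a CSV in the shape many trackers use. The column names and values are invented, not any particular vendor’s format:

```python
import csv
import io

# A toy export in the shape many tracker downloads take; headers invented.
export = """date,steps,sleep_hours
2014-06-30,9120,7.2
2014-07-01,11450,6.8
2014-07-02,4310,7.9
"""

def summarise(csv_text):
    """Aggregate a personal export into the kind of figures a
    crowdsourced research pool could combine across thousands of users."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    total_steps = sum(int(r["steps"]) for r in rows)
    avg_sleep = sum(float(r["sleep_hours"]) for r in rows) / len(rows)
    return {"days": len(rows), "total_steps": total_steps,
            "avg_sleep_hours": round(avg_sleep, 1)}

print(summarise(export))
```

Multiply this by millions of self-trackers and you have the crowdsourced research sample described above; the hard parts are the privacy and licensing, not the parsing.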

How can you get involved?

My talk ended with two ways this is possible today. First, you can open up your body data right now under a Creative Commons Science License. Second, you can easily share your data through the device or app’s own API via DataDonors.org (a WikiLife charity), so your body data can be used by researchers for good (and it will, of course, be anonymous).

In a world where people track their activity 24/7, and find it absolutely normal to do so, the possibilities for researchers are endless. It won’t be long before researchers are treated like rockstars, just like entrepreneurs on Kickstarter are today.