Rob Harkness | CTO


Rob Harkness, PhD, is the Chief Technology Officer at Biosero, a BICO company, where he previously served as Managing Director for the UK/EU business. Since joining Biosero in 2020, Rob has helped shape the company's technology roadmap, focusing on advancing the capabilities of laboratory automation to accelerate scientific discovery.
Prior to Biosero, Rob held key roles at leading automation companies such as CyBio, Astech Projects, and Peak Analysis & Automation. He also serves as a director of the SiLA (Standardization in Lab Automation) consortium, promoting open standards that enhance lab automation across the life sciences.
Rob earned his BSc in Computer-Aided Chemistry and his PhD in Biochemical Assay Automation Software from the University of Surrey, grounding his work in a deep understanding of both the computational and biochemical aspects of lab automation. 


Day 1 (26th June) @ 14:45

Preparing for laboratory automation's next wave: The crucial role of a 'no data left behind' approach

As the landscape of laboratory automation evolves, one of the most transformative trends will be the incorporation of Artificial Intelligence (AI) and Machine Learning (ML) into automated workflows. This evolution stands to enhance lab operations with improved efficiency, reliability, and analytical depth. Traditional automation excels at routine task management but often fails to adapt to new data sets or changing conditions. AI and ML are poised to fill this gap, giving labs the ability not just to complete tasks, but to refine and evolve workflows as new data streams arrive.

The shift toward this approach calls for a comprehensive, considered strategy to capture and utilize the full spectrum of laboratory data: environmental conditions, user interactions, instrument performance, and the intricate specifics of samples and experimental results. It is imperative to document the what, why, and who, as well as the where and when, so that no data is left behind.
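As a concrete illustration of this kind of comprehensive capture, the sketch below models a single workflow event that carries the what/why/who/where/when dimensions alongside environmental, instrument, and sample context. All names here (`LabEvent` and its fields) are illustrative assumptions for the sake of the example, not a real product schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class LabEvent:
    """One fully contextualized record in a 'no data left behind' log.
    Field names are illustrative, not an actual product schema."""
    what: str   # the action performed, e.g. "dispense reagent"
    why: str    # protocol step / experimental intent
    who: str    # user or system that triggered the action
    where: str  # instrument or deck location
    when: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )
    environment: dict = field(default_factory=dict)         # temperature, humidity, ...
    instrument_metrics: dict = field(default_factory=dict)  # run time, error counts, ...
    sample_context: dict = field(default_factory=dict)      # sample IDs, results, ...

    def to_record(self) -> dict:
        """Flatten to a plain dict, ready for a central data repository."""
        return asdict(self)

event = LabEvent(
    what="dispense reagent",
    why="assay plate preparation, step 3",
    who="scheduler",
    where="liquid-handler-01",
    environment={"temp_c": 21.4, "humidity_pct": 45},
)
record = event.to_record()
print(record["what"])  # -> dispense reagent
```

The point of the structure is that every routine action is logged with its full context by default, so later AI/ML analysis never has to reconstruct missing circumstances after the fact.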


In this presentation, we will discuss the practical steps towards creating such a nuanced system and the opportunities it may present. We'll consider a future where intelligent algorithms could manage routine sample categorization and data analysis, potentially freeing up scientific personnel to focus on the more complex aspects of research and discovery.


Achieving this level of integration involves leveraging centralized data services, such as GBG Data Services, which consolidate operational data into a single, accessible repository. This centralization supports informed decision-making and fosters connectivity across various lab instruments and systems, thus optimizing workflow efficiency. GBG Data Services acts as a comprehensive hub, systematically recording, storing, and disseminating data, facilitating AI-driven optimization and detailed reporting while maintaining compatibility with management systems such as LIMS and ELN.
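To make the hub pattern concrete, here is a minimal sketch of a centralized data service that records everything into one store and fans new data out to downstream consumers (AI pipelines, LIMS/ELN exporters). This is an illustrative toy, not the GBG Data Services API; the `DataHub` class and its methods are assumptions for the example.

```python
from collections import defaultdict

class DataHub:
    """Toy centralized lab data hub (illustrative only, not a real API).
    Records events into a single repository and notifies subscribers."""

    def __init__(self):
        self._store = []                       # the singular repository
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        """Register a downstream consumer for a topic, e.g. a LIMS exporter."""
        self._subscribers[topic].append(callback)

    def record(self, topic, payload):
        """Store an event permanently, then fan it out to subscribers."""
        self._store.append({"topic": topic, "payload": payload})  # nothing discarded
        for cb in self._subscribers[topic]:
            cb(payload)

    def query(self, topic):
        """Retrieve everything recorded under a topic for reporting/analysis."""
        return [e["payload"] for e in self._store if e["topic"] == topic]

hub = DataHub()
seen = []
hub.subscribe("instrument", seen.append)
hub.record("instrument", {"id": "reader-02", "status": "ok"})
hub.record("sample", {"id": "S-17", "result": 0.82})
print(len(hub.query("instrument")))  # -> 1
```

The design choice worth noting is that recording and dissemination are one operation: because every event passes through a single `record` call, the repository stays complete even when no consumer is listening yet.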

This initiative towards a data-rich, AI and ML-integrated environment is one that promises to enhance the capabilities of any lab. By adhering to a 'no data left behind' ethos, we lay the groundwork for the intelligent, evolving laboratories of the future.

last published: 17/Jun/24 14:15 GMT



