Data Science Portfolio

See how Lumilytics has helped pharmaceutical and biotech companies like yours achieve success with our data science solutions.

Read our case studies to learn more about our work and the impact we've had on our clients' businesses.

PTM risk site identification through in silico protein structure modeling

In silico prediction of three-dimensional protein structure from primary sequence data is an active area of research with applications to drug discovery and development. We adapted recent advancements in structure prediction technology to post-translational modification (PTM) risk site identification and validation in biologics, and developed enhanced metrics for scoring the accuracy of a predicted protein structure in the absence of experimental results.

Medical Device Modeling

Developed a solution for medical device modeling to assist stakeholders in choosing the right medication delivery device. This solution integrated machine learning algorithms with device specifications and patient data to provide personalized recommendations. By automating the selection process, the solution helped stakeholders make informed decisions, leading to improved patient outcomes.

NLP document classification

Employed natural language processing (NLP) techniques for document classification. This involved using algorithms to analyze the content of documents and categorize them based on their topics or themes. By automating the classification process, we were able to organize and manage large volumes of documents more efficiently, improving access to relevant information for users.
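To give a flavor of the approach, here is a minimal sketch of a bag-of-words Naive Bayes document classifier in plain Python. It is an illustrative stand-in for a production pipeline, and the document snippets and category labels below are hypothetical.

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    """Lowercase the text and keep only alphabetic tokens."""
    return re.findall(r"[a-z]+", text.lower())

class NaiveBayesClassifier:
    """Multinomial Naive Bayes over bag-of-words counts."""

    def fit(self, documents, labels):
        self.word_counts = defaultdict(Counter)   # label -> word frequencies
        self.label_counts = Counter(labels)       # label -> document count
        self.vocab = set()
        for doc, label in zip(documents, labels):
            tokens = tokenize(doc)
            self.word_counts[label].update(tokens)
            self.vocab.update(tokens)
        return self

    def predict(self, document):
        tokens = tokenize(document)
        total_docs = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            # Log prior plus log likelihood with add-one smoothing.
            score = math.log(self.label_counts[label] / total_docs)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for token in tokens:
                score += math.log((self.word_counts[label][token] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical training snippets for two document categories.
docs = [
    "stability study of the drug product batch",
    "batch release and stability testing results",
    "invoice for laboratory consumables and reagents",
    "purchase order and invoice for equipment",
]
labels = ["quality", "quality", "finance", "finance"]
clf = NaiveBayesClassifier().fit(docs, labels)
print(clf.predict("stability results for the new batch"))  # → quality
```

In practice the same loop generalizes to richer features and larger label sets; the smoothing keeps unseen words from zeroing out a category's score.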

Image classification for the identification of sub-visible particles

Developed a machine learning solution for image classification to identify sub-visible particles. This solution utilized convolutional neural networks (CNNs) to analyze images and accurately classify particles based on their characteristics. By automating the identification process, the solution improved the efficiency and accuracy of particle analysis in pharmaceutical research and manufacturing.
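The CNN itself is best expressed in a deep-learning framework, but the segmentation step that precedes classification can be sketched in plain Python. This illustrative example assumes the image has already been thresholded into a binary grid and labels candidate particles by connected-component analysis; it is not the CNN classifier.

```python
from collections import deque

def label_particles(binary_image):
    """Connected-component labeling (4-connectivity) on a thresholded image.

    binary_image: 2D list of 0/1, where 1 marks a candidate particle pixel.
    Returns a list of components, each a list of (row, col) pixels.
    """
    rows, cols = len(binary_image), len(binary_image[0])
    seen = [[False] * cols for _ in range(rows)]
    components = []
    for r in range(rows):
        for c in range(cols):
            if binary_image[r][c] and not seen[r][c]:
                # Breadth-first flood fill from this seed pixel.
                queue, pixels = deque([(r, c)]), []
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary_image[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(pixels)
    return components

image = [
    [0, 1, 1, 0, 0],
    [0, 1, 0, 0, 1],
    [0, 0, 0, 1, 1],
]
particles = label_particles(image)
print(len(particles), sorted(len(p) for p in particles))  # → 2 [3, 3]
```

Each labeled component can then be cropped and passed to the CNN, with pixel counts serving as a quick size filter for sub-visible ranges.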

Data Visualization and Dashboard Creation

Implemented data visualization and dashboard creation techniques, utilizing tools such as Spotfire, Power BI, and Tableau, to create impactful data summaries for stakeholders. These visualizations effectively communicated complex data in a clear and concise manner, enabling stakeholders to quickly grasp key insights. By presenting data in visually appealing and informative ways, we enhanced decision-making processes and facilitated better understanding of the data among stakeholders.

Laboratory Automation

Successfully developed multiple methods for Hamilton liquid handlers to automate labor-intensive laboratory processes. These methods involved creating and optimizing liquid handling protocols to ensure accuracy, efficiency, and reproducibility. By automating these processes, we were able to significantly reduce the time and resources required for experimentation, leading to increased productivity and throughput.

AI data validation

Built a solution to verify text and validate data generated by AI tools. This solution utilized natural language processing (NLP) algorithms to detect errors, inconsistencies, and inaccuracies in the output. By providing a reliable method to validate AI-generated data, the solution ensured the accuracy and reliability of the information used for decision-making purposes.
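One validation pattern is easy to show in miniature: cross-checking every number an AI tool emits against trusted source values. The sketch below is illustrative, and the summary text and source record are hypothetical.

```python
import re

def validate_numbers(generated_text, source_values, tolerance=0.0):
    """Flag numbers in AI-generated text that match no trusted source value.

    Returns the list of numbers found in the text that are not within
    `tolerance` of any value in `source_values`.
    """
    found = [float(n) for n in re.findall(r"-?\d+(?:\.\d+)?", generated_text)]
    return [n for n in found
            if not any(abs(n - s) <= tolerance for s in source_values)]

# Hypothetical AI-generated summary and the trusted record it should reflect.
summary = "The batch yield was 92.5 percent across 3 runs at pH 6.8."
source = {"yield_pct": 92.5, "runs": 3, "ph": 7.0}
mismatches = validate_numbers(summary, source.values())
print(mismatches)  # → [6.8]
```

A nonzero `tolerance` accommodates rounding in the generated text; any surviving mismatch is routed to a reviewer rather than silently accepted.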

Employment of large language models (LLMs) for summarization of regulatory documents

Utilized large language models (LLMs) for text summarization of regulatory documents. These models employed advanced natural language processing (NLP) techniques to generate concise summaries of complex regulatory texts. By leveraging LLMs, we were able to automate and streamline the summarization process, improving the efficiency of regulatory document analysis.
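Regulatory documents routinely exceed an LLM's context window, so one core piece of such a pipeline is chunking. The sketch below splits a document into overlapping paragraph chunks; the paragraph data and word budget are hypothetical, and the actual summarization call to the LLM is omitted.

```python
def chunk_document(paragraphs, max_words, overlap=1):
    """Split a long document into chunks that fit an LLM context window.

    Chunks are built from whole paragraphs; consecutive chunks share
    `overlap` paragraphs so summaries keep cross-boundary context.
    """
    chunks, current, count = [], [], 0
    for para in paragraphs:
        words = len(para.split())
        if current and count + words > max_words:
            chunks.append(current)
            current = current[-overlap:]            # carry context forward
            count = sum(len(p.split()) for p in current)
        current.append(para)
        count += words
    if current:
        chunks.append(current)
    return chunks

# Toy paragraphs standing in for sections of a regulatory document.
paras = ["a b c", "d e", "f g h i", "j"]
print([len(c) for c in chunk_document(paras, max_words=6)])  # → [2, 2, 2]
```

Each chunk is summarized independently, and the per-chunk summaries can then be summarized once more to produce the final document-level digest.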

Formulation Optimization

Developed a successful predictive modeling solution used to predict the optimal formulation for biologics, including the ideal combination of buffers and excipients. This solution incorporated advanced machine learning algorithms and domain-specific knowledge to analyze data and recommend the most effective formulations. By accurately predicting the optimal formulation, the solution helped streamline the development process of biologics, leading to cost savings and improved outcomes in pharmaceutical research.
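The search side of such a solution can be illustrated with a grid search over candidate formulations. Here `predicted_stability` is a hypothetical toy stand-in for a trained model, and the buffer, excipient, and concentration values are illustrative only.

```python
from itertools import product

def predicted_stability(buffer, excipient, concentration):
    """Hypothetical stand-in for a fitted stability model.

    A real solution would call a trained ML model here; this toy scoring
    function exists only so the search loop below is runnable.
    """
    base = {"histidine": 0.8, "citrate": 0.6}[buffer]
    boost = {"sucrose": 0.15, "trehalose": 0.10}[excipient]
    penalty = abs(concentration - 50) / 500     # toy preference for ~50 mg/mL
    return base + boost - penalty

def optimize_formulation(buffers, excipients, concentrations):
    """Grid search: return the combination with the highest predicted score."""
    return max(product(buffers, excipients, concentrations),
               key=lambda combo: predicted_stability(*combo))

best = optimize_formulation(["histidine", "citrate"],
                            ["sucrose", "trehalose"],
                            [25, 50, 100])
print(best)  # → ('histidine', 'sucrose', 50)
```

For larger design spaces, the exhaustive `product` loop gives way to Bayesian or evolutionary search, but the model-in-the-loop structure stays the same.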

Automated Data Analysis

Developed multiple applications for stakeholders to automate the analysis of experimental pharmaceutical data. These applications utilized machine learning and statistical algorithms to process large datasets efficiently and extract meaningful insights. By automating the analysis process, these applications helped pharmaceutical researchers save time and resources, enabling faster and more informed decision-making in drug discovery and development.

Multimodal Data Processing

Advancing NLP technology beyond text-based inputs was the driving force behind the development of applications for multimodal data processing, including converting tabular, graphical, and image data into text summaries. These applications utilized advanced algorithms and machine learning models to extract key insights and generate concise summaries. The goal was to provide users with a comprehensive understanding of their data across different modalities, enhancing data interpretation and decision-making processes.
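The tabular-to-text direction is the simplest to sketch: render a table of records as a one-line natural-language summary. The batch records below are hypothetical.

```python
def summarize_table(rows, value_key, label_key):
    """Render a list of records as a short natural-language summary."""
    values = [row[value_key] for row in rows]
    top = max(rows, key=lambda row: row[value_key])
    return (f"{len(rows)} records; {value_key} ranges from "
            f"{min(values)} to {max(values)} "
            f"(mean {sum(values) / len(values):.1f}); "
            f"highest is {top[label_key]}.")

# Hypothetical batch purity data.
batches = [
    {"batch": "A1", "purity": 97.2},
    {"batch": "A2", "purity": 98.9},
    {"batch": "A3", "purity": 96.4},
]
print(summarize_table(batches, "purity", "batch"))
# → 3 records; purity ranges from 96.4 to 98.9 (mean 97.5); highest is A2.
```

Summaries like this can be consumed directly, or fed into a downstream language model alongside text extracted from charts and images.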

Chat with your data

Creating an application that lets users examine their data through a chatbot involved developing a user-friendly interface and integrating natural language processing capabilities. The application enabled users to interact with their data through conversational queries, making data exploration more intuitive and accessible. This approach provided users with a novel and efficient way to analyze and derive insights from their data.
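At its core, such a feature routes a conversational query to a data operation. The toy intent router below illustrates the idea with keyword matching; a production system would use an NLP model for intent detection, and the yield records here are hypothetical.

```python
import statistics

def answer_query(query, records, numeric_field):
    """Map a conversational query to a simple aggregation (toy intent router)."""
    q = query.lower()
    values = [r[numeric_field] for r in records]
    if "average" in q or "mean" in q:
        return f"The average {numeric_field} is {statistics.mean(values):.1f}."
    if "highest" in q or "max" in q:
        return f"The highest {numeric_field} is {max(values)}."
    if "how many" in q or "count" in q:
        return f"There are {len(records)} records."
    return "Sorry, I can answer about averages, maxima, and counts."

# Hypothetical process-yield records.
data = [{"yield": 91.0}, {"yield": 94.5}, {"yield": 92.5}]
print(answer_query("What is the average yield?", data, "yield"))
# → The average yield is 92.7.
```

The fallback reply keeps the chatbot honest about its scope, a pattern that carries over when the keyword rules are replaced by a learned intent classifier.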

Community of Practice Initiation

Starting a data science community of practice involved creating a forum where data science enthusiasts, practitioners, and experts could come together to share knowledge, best practices, and experiences. This community fostered collaboration, learning, and innovation in the field of data science, ultimately helping members grow their skills and solve complex problems more effectively. By building such a community, individuals were able to stay updated with the latest trends, tools, and techniques in data science, enhancing their professional development.