Fix descriptions to adhere to character limits

Timothy Kassis
2025-10-21 09:33:30 -07:00
parent 163d6b9d6b
commit 32109101d0
83 changed files with 83 additions and 83 deletions

@@ -1,6 +1,6 @@
 ---
 name: benchling-integration
-description: "Toolkit for integrating with Benchling's R&D platform, providing programmatic access to laboratory data management including registry entities (DNA sequences, proteins), inventory systems (samples, containers, locations), electronic lab notebooks (entries, protocols), workflows (tasks, automation), and data exports. Use this skill when working with Benchling APIs, managing biological sequences and samples, automating lab workflows, syncing data between Benchling and external systems, building Benchling Apps, or querying the Benchling Data Warehouse for analytics."
+description: "Benchling R&D platform integration. Access registry (DNA, proteins), inventory, ELN entries, workflows via API, build Benchling Apps, query Data Warehouse, for lab data management automation."
 ---
 
 # Benchling Integration

@@ -1,6 +1,6 @@
 ---
 name: dnanexus-integration
-description: Comprehensive toolkit for working with the DNAnexus cloud platform for genomics and biomedical data analysis. Use this skill when users need to build apps/applets, manage data (upload/download files, create records, search data objects), run analyses and workflows, use the dxpy Python SDK, or configure app metadata and dependencies. This applies to tasks involving DNAnexus projects, jobs, data objects (files/records/databases), FASTQ/BAM/VCF files on DNAnexus, bioinformatics pipelines, genomics workflows, or any interaction with the DNAnexus API or command-line tools. The skill covers app development (Python/Bash), data operations, job execution, workflow orchestration, and platform configuration including dxapp.json setup and dependency management (system packages, Docker, assets).
+description: "DNAnexus cloud genomics platform. Build apps/applets, manage data (upload/download), dxpy Python SDK, run workflows, FASTQ/BAM/VCF, for genomics pipeline development and execution."
 ---
 
 # DNAnexus Integration

@@ -1,6 +1,6 @@
 ---
 name: labarchive-integration
-description: Toolkit for interacting with LabArchives Electronic Lab Notebook (ELN) API. This skill should be used when working with LabArchives notebooks, including authentication setup, retrieving user and notebook information, backing up notebooks, managing entries and attachments, generating reports, or integrating LabArchives with other scientific tools (Protocols.io, GraphPad Prism, SnapGene, Geneious, Jupyter, REDCap). Use this skill for any task involving programmatic access to LabArchives data or automating LabArchives workflows.
+description: "LabArchives ELN API integration. Access notebooks, manage entries/attachments, backup notebooks, integrate with Protocols.io/Jupyter/REDCap, for programmatic ELN workflows."
 ---
 
 # LabArchives Integration

@@ -1,6 +1,6 @@
 ---
 name: latchbio-integration
-description: Integration with the Latch platform for building, deploying, and executing bioinformatics workflows. This skill should be used when working with the Latch SDK, creating serverless bioinformatics pipelines, deploying workflows to Latch, managing data in Latch's cloud storage (LatchFile, LatchDir) or Registry system, configuring computational resources (CPU, GPU, memory) for tasks, using pre-built Latch Verified workflows (RNA-seq, AlphaFold, DESeq2, etc.), or integrating Nextflow/Snakemake pipelines with the Latch platform. Use this skill for tasks involving "latch register", "latch init", workflow decoration with @workflow and @task decorators, or when users mention deploying to or working with the Latch platform.
+description: "Latch platform for bioinformatics workflows. Build pipelines with Latch SDK, @workflow/@task decorators, deploy serverless workflows, LatchFile/LatchDir, Nextflow/Snakemake integration."
 ---
 
 # LatchBio Integration

@@ -1,6 +1,6 @@
 ---
 name: omero-integration
-description: Toolkit for interacting with OMERO microscopy data management systems using Python. Use this skill when working with microscopy images stored in OMERO servers, retrieving datasets and screening data, analyzing pixel data from scientific images, creating or managing annotations and metadata, working with regions of interest (ROIs), batch processing images, creating OMERO scripts, or integrating OMERO data into computational workflows. Essential for researchers working with high-content screening data, multi-dimensional microscopy datasets, or collaborative image repositories.
+description: "OMERO microscopy data management. Access images via Python, retrieve datasets, analyze pixels, manage ROIs/annotations, batch processing, for high-content screening and microscopy workflows."
 ---
 
 # OMERO Integration

@@ -1,6 +1,6 @@
 ---
 name: opentrons-integration
-description: Toolkit for creating, editing, and debugging Opentrons Python Protocol API v2 protocols for laboratory automation. This skill should be used when working with Opentrons Flex or OT-2 robots, writing liquid handling protocols, automating pipetting tasks, controlling hardware modules (heater-shaker, temperature, magnetic, thermocycler, absorbance plate reader), managing labware and deck layouts, or performing any laboratory automation tasks using the Opentrons platform. Use this skill for protocol development, troubleshooting, simulation, and optimizing automated workflows for biological and chemical experiments.
+description: "Opentrons lab automation. Write Protocol API v2 protocols for Flex/OT-2 robots, liquid handling, hardware modules (heater-shaker, thermocycler), labware management, for automated pipetting workflows."
 ---
 
 # Opentrons Integration