Collaboratory Artificial Intelligence (AI) FAQ

This FAQ provides detailed information on Collaboratory’s suite of AI-powered features, offering transparency into how these tools are built, how they function, and how your institution can use them responsibly. Collaboratory uses artificial intelligence to streamline data entry, generate actionable insights, and simplify complex reporting workflows. This approach is designed to reduce administrative burden while empowering leaders to make strategic, data-informed decisions that enhance the impact of their institution’s external partnerships.

Collaboratory's Suite of AI Tools

What AI tools are currently available in Collaboratory?

  • SmartFill: Extracts relevant data from URLs to draft activity forms. Users review and finalize submissions.
  • SmartSort: Built into the SmartFill process, SmartSort categorizes activities as either "community engagement" or "public service," using logic informed by higher education literature and best practices.

What features are currently in development?

  • SmartFind: Continuously scans decentralized, public-facing sources to surface university engagement and service work that might otherwise go unreported, with no manual effort required.
  • SmartLens: Analyzes engagement captured in Collaboratory to highlight institutional stories, identify collaborators with shared interests, surface trends, and support strategic planning and stakeholder reporting.

Who is Collaboratory's AI service provider?

Our current AI partner is OpenAI, and we may work with additional partners in the future.

Availability + Controls

How is access to AI features structured?

Institutions that joined Collaboratory before December 31, 2025, receive complimentary access to SmartFill AI. For institutions that joined after this date, AI features are included within specific subscription tiers. As we develop advanced AI capabilities, some may be introduced as paid add-ons or tiered offerings to reflect the ongoing costs of maintaining and running AI systems at scale.

Can my institution opt out of using AI?

Yes. Institutions can opt out of AI tools campus-wide at any time by contacting Collaboratory Support. Additionally, individual users are always informed when they are interacting with AI features and can choose to use traditional, non-AI workflows instead.

Are there usage limits for AI features?

There are no usage limits at this time as we evaluate typical usage patterns. However, we reserve the right to implement limits in the future to ensure system stability and fair access.

Data Privacy + Compliance

Is institutional data used to train third-party AI models?

No. Customer data (including URLs, text snippets, and activity records) is never used to train OpenAI’s models or any other third-party AI models. Your institutional data remains yours.

How is data handled by third-party providers like OpenAI?

OpenAI temporarily retains inputs and outputs for up to 30 days for the sole purpose of monitoring for abuse and misuse. After this 30-day period, the data is deleted.

How does Collaboratory store AI-related data?

We do not store raw AI prompts or outputs in a separate database. We only store the final data that the user reviews and accepts as part of the standard activity record, governed by our existing privacy policies. Our proprietary "logic" and prompt designs are protected and are not exposed to the AI model's general knowledge base.

Who is responsible for compliance with privacy laws like HIPAA or FERPA?

Institutions remain responsible for ensuring that their content complies with HIPAA, FERPA, and other internal policies. All AI use must align with Collaboratory’s acceptable use and privacy policy.

Responsible AI

Does the AI make decisions without human input?

No. A core tenet of our AI principles is human oversight. The AI offers suggestions and drafts, but users must review, adjust, and approve all outputs before they are finalized and published.

How do you prevent "hallucinations" or incorrect data?

SmartFill AI is designed for high accuracy; it only extracts information explicitly found within the source URLs provided. If information is missing from the source, the AI will not attempt to guess or "hallucinate" details; it will simply leave those fields blank for the user to complete.

Is the AI interface accessible?

Yes. All Collaboratory AI features are delivered within an ADA-compliant interface to ensure they are accessible to all users.

What is the environmental impact of using these features?

We encourage thoughtful use of AI. Each SmartFill activity consumes an estimated 0.4–1.7 watt-hours of electricity (equivalent to an LED bulb running for 5–15 minutes). While the per-use impact is small, we believe in transparency regarding the carbon footprint of digital tools.
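The watt-hour figures above convert to bulb runtime with a simple formula: minutes = energy (Wh) ÷ bulb power (W) × 60. A minimal sketch of that conversion, where the bulb wattages are illustrative assumptions (household LED bulbs typically draw roughly 5–10 W), not figures from Collaboratory:

```python
def led_runtime_minutes(energy_wh: float, bulb_watts: float = 5.0) -> float:
    """Minutes an LED bulb rated at `bulb_watts` would run on `energy_wh` watt-hours.

    Runtime in hours is energy / power; multiply by 60 for minutes.
    """
    return energy_wh / bulb_watts * 60


# The FAQ's low and high per-activity estimates (0.4 and 1.7 Wh),
# converted assuming a 5 W bulb.
low_minutes = led_runtime_minutes(0.4)   # about 4.8 minutes
high_minutes = led_runtime_minutes(1.7)  # about 20 minutes at 5 W; less for a higher-wattage bulb
```

The exact equivalence depends on the bulb wattage assumed, which is why the FAQ's "5–15 minutes" is best read as an order-of-magnitude comparison rather than a precise figure.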

Feature-Specific Functionality

Can SmartFill AI update existing records?

Currently, SmartFill AI is designed to assist in the creation of new activities. We are considering future enhancements that would allow AI to assist with ongoing content adjustments and edits.

Can I upload a CSV to create multiple activities at once with AI?

At this time, one URL (or set of URLs) generates one activity. We are actively exploring "many-to-many" relationships, such as processing a CSV file to generate multiple activities simultaneously.

Why is my activity form only partially filled by SmartFill?

The quality of the draft depends entirely on the source URL. It is unlikely that 100% of a record will be autofilled because the AI will not invent data that isn't explicitly mentioned in the source. The user is always prompted to complete the remaining information.

How does SmartSort know how to categorize my work?

While manual entries use a decision tree, SmartSort uses prompts guided by higher education literature to distinguish between engagement and service. This is an active area of scholarship for our team, and we continue to iterate on these capabilities. Note that Administrators can always manually change a categorization if they disagree with the AI's suggestion.
