Teacher Prep

Christiaan Verhoef

2025-02-20


AI as a Research Assistant

🛠 Meta Goal: Awareness - Understanding how AI can assist in research without replacing critical thinking.
👥 Target: Students and Researchers in AI and data-driven research.
🔗 Anchor: AI as a research assistant – not a replacement for human analysis.
📖 Metaphor: AI research validation is like working with an overenthusiastic intern – helpful, but always double-check their work.
📜 Story: A student is using OpenWebUI to summarize a research paper. The AI generates a concise summary, but after comparing it with the actual abstract, they realize key details are missing or misrepresented. This prompts the student to refine their AI prompts and verify AI outputs using traditional research methods.

Teaching Tools & Instruments Used:

  • 🤖 OpenWebUI – AI-powered chatbot for summarizing research papers.
  • 📜 Sample research papers – Short academic articles for hands-on AI summarization.
  • 📝 Prompt refinement worksheet – Helps participants improve AI-generated responses.
  • 🔍 Fact-checking rubric – A checklist to assess AI-generated summaries.

Module Details

  • Step 1: AI Summarization Exercise – Participants input a research abstract into OpenWebUI and compare the AI-generated summary with the original (a minimal API sketch follows this list).
  • Step 2: Comparison & Analysis – Small groups discuss how accurate or misleading the AI-generated summary is.
  • Step 3: Prompt Refinement – Participants use a guided worksheet to adjust prompts and observe improvements in AI responses.
  • Step 4: Live AI Debugging – Instructor showcases real-time prompt adjustments and their impact, comparing AI-generated summaries with the original abstracts.
  • Step 5: Group Discussion – Where the AI fails and how prompts can be improved.
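
For facilitators who prefer to script Step 1 rather than use the chat interface, the sketch below sends an abstract to an OpenWebUI instance for summarization. It assumes the instance exposes an OpenAI-compatible chat completions endpoint; the URL, API key, model name, and input file are placeholders for your own setup.

```python
# Minimal sketch: send a research abstract to OpenWebUI for summarization.
# Assumes an OpenAI-compatible chat endpoint; URL, key, model, and the
# abstract file are placeholders for your own instance.
import requests

OPENWEBUI_URL = "http://localhost:3000/api/chat/completions"  # assumed endpoint
API_KEY = "YOUR_API_KEY"   # generated in your OpenWebUI settings
MODEL = "llama3"           # whichever model your instance serves

abstract = open("abstract.txt", encoding="utf-8").read()  # abstract to summarize

response = requests.post(
    OPENWEBUI_URL,
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "You are a careful research assistant."},
            {"role": "user", "content": "Summarize this abstract in 3 sentences, "
                                        f"preserving the main claim and method:\n\n{abstract}"},
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```
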
  • Discussion questions:
    • What are the benefits and risks of using AI for research?
    • How can AI assist in research summarization, and what are the limitations of AI-generated summaries?
    • How can AI-generated summaries misrepresent research findings?
    • What techniques, such as prompt refinement, improve the accuracy of AI-generated responses?
    • How can researchers ensure AI does not introduce biases into their work?
  • Learning outcomes:
    • Participants gain practical skills in AI-assisted research evaluation and prompt engineering.
    • They develop the critical thinking needed to assess AI-generated content.
    • They leave with a structured AI fact-checking workflow they can apply to their own research.
    • Understanding AI limitations helps them become more responsible, more efficient AI users.
  • Interactive games:
    • “Break the AI” Challenge – Participants compete to produce the most misleading AI-generated summary, highlighting AI’s tendency to over-simplify.
    • AI vs. Human Quiz – Participants try to identify AI-written vs. human-written summaries.
    • “Fix the AI” – Groups refine prompts to get the most accurate AI response.
    • Group vote on the “Most Convincing Yet Inaccurate” AI summary, reinforcing why AI output should always be fact-checked.
  • Key concepts:
    • AI as a research assistant, not an authority – human verification is always necessary.
    • Prompt engineering techniques – how to phrase research queries for better AI-generated results (an example prompt pair follows this list).
    • Understanding AI bias – why and when AI summaries may be misleading, and how to fact-check effectively.
    • Hands-on research validation – using AI alongside traditional search engines and sources for cross-checking.
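
As an illustration of the prompt engineering point above, here is the kind of before/after pair participants might produce on the refinement worksheet; the exact wording is only an example, not a prescribed template.

```python
# Example prompt pair for the refinement worksheet (illustrative wording only).

vague_prompt = "Summarize this paper."

refined_prompt = (
    "Summarize the abstract below in no more than 120 words. "
    "State the research question, the method, and the main finding. "
    "Do not add conclusions that are not in the text, and flag any "
    "claims or numbers you are unsure about.\n\n"
    "Abstract:\n{abstract}"
)

# Fill in the abstract before sending, e.g.:
# prompt = refined_prompt.format(abstract=my_abstract)
```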

AI Research Questions

By the end of this session, you will: ✅ Learn how AI can generate new research questions based on existing knowledge.
✅ Understand how to refine AI-generated questions into strong, focused research topics.
✅ Discover how AI can inspire human creativity without replacing critical thinking.

💡 Key Takeaway: AI is a powerful brainstorming tool, but human researchers guide its direction! 🚀

AI to Create Research Questions

🛠 Meta Goal: Awareness - Understanding how AI can enhance, not replace, research ideation.
👥 Target: Researchers, Students, and Academics.
🔗 Anchor: AI is a collaborator, not a creator—it helps structure ideas, but humans define relevance.
📖 Metaphor: Brainstorming with AI is like using a whiteboard full of post-it notes—AI generates ideas, but humans refine and structure them.
📜 Story: A researcher is struggling to develop an impactful research question for a thesis. By using OpenWebUI, they generate AI-assisted question ideas, refine them for clarity, and validate them with existing research. The AI speeds up brainstorming, but the researcher ensures relevance and feasibility.

Teaching Tools & Instruments Used:

  • 🤖 OpenWebUI – AI-powered brainstorming assistant.
  • 📝 Research Question Refinement Guide – Helps improve AI-generated questions.
  • 📚 Example Research Topics – Used to test AI brainstorming capabilities.
  • 🔍 Peer Review Discussion – Participants evaluate AI-generated questions.

Module Details

  • Participants ask OpenWebUI to generate 5 research questions on a selected topic.
  • They pick the strongest AI-generated question and refine it using a structured worksheet.
  • Participants peer-review each other’s refined questions and discuss their relevance.
  • How can AI assist in formulating strong research questions?
  • What techniques help refine broad AI-generated ideas into specific research topics?
  • How does AI complement human creativity in academic research?

📌 Step 1: Open OpenWebUI and ask AI to generate 5 research questions on a topic.
📌 Step 2: Identify the best AI-generated question and refine it for clarity & focus.
📌 Step 3: Compare AI questions to existing academic literature—is it relevant?
📌 Step 4: Share refined questions with peers for feedback & discussion.

💡 Lesson: AI helps generate ideas, but researchers must ensure feasibility and significance!
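
If participants want to script Steps 1 and 2 rather than type into the chat window, a minimal sketch follows. It reuses the same assumed OpenAI-compatible OpenWebUI endpoint as the earlier summarization example; the URL, API key, model name, and topic are placeholders.

```python
# Sketch of Steps 1-2: generate candidate research questions, then refine the
# one the participant picks. Endpoint, key, model, and topic are placeholders.
import requests

URL = "http://localhost:3000/api/chat/completions"  # assumed OpenWebUI endpoint
KEY, MODEL = "YOUR_API_KEY", "llama3"
topic = "predictive analytics in supply chains"

def ask(prompt: str) -> str:
    r = requests.post(
        URL,
        headers={"Authorization": f"Bearer {KEY}"},
        json={"model": MODEL, "messages": [{"role": "user", "content": prompt}]},
        timeout=60,
    )
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

# Step 1: generate candidate questions.
print(ask(f"Generate 5 distinct research questions about {topic}. "
          "Number them and keep each under 25 words."))

# Step 2: refine the strongest candidate (chosen by the participant, not the AI).
chosen = input("Paste the question you want to refine: ")
print(ask("Rewrite this research question so it names a population, a variable, "
          f"and a measurable outcome:\n{chosen}"))
```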

  • Participants enhance their research ideation skills using AI.
  • They learn how to critically evaluate AI-generated content.
  • They leave with at least one strong research question for their field.
  • “Best AI Research Question” Contest – Participants vote on the most insightful AI-generated question.
  • Reverse Engineer Challenge – Participants start with a strong research question and try to make AI generate it.
  • AI vs. Human Creativity Game – Teams compare AI-generated vs. human-crafted questions and guess which is which.
  • How AI generates research questions – Understanding AI’s pattern recognition abilities.
  • How to refine AI-generated content – Making questions more focused & impactful.
  • Using AI without over-relying on it – Balancing AI assistance with human expertise.
  • How peer review strengthens research questions – Collaborative refinement process.

Automation of Research

🛠 Meta Goal: Realization - Understanding how automation can streamline research workflows.
👥 Target: Students and Researchers who handle large volumes of research data.
🔗 Anchor: Automation as a research assistant – reducing manual workload.
📖 Metaphor: Automating research is like having a virtual research assistant that fetches, organizes, and summarizes papers for you.
📜 Story: Meet Alex, a supply chain researcher drowning in logistics reports, academic papers, and real-time shipping data. Every morning, Alex manually searches arXiv for new research, downloads papers, and skims them for relevant insights on predictive analytics in supply chains. One day, Alex spills coffee on their notes and realizes there must be a better way. Enter n8n—an automation tool that fetches logistics research, summarizes findings, and neatly organizes everything into Notion before Alex even wakes up. With n8n, Alex automates:

  1. Fetching new papers from arXiv automatically.
  2. Using OpenWebUI to summarize key insights.
  3. Storing structured findings in Notion for quick reference.
  4. Scheduling weekly updates so new research is always accessible.

Teaching Tools & Instruments Used:

  • 🔄 n8n – Visual automation tool for workflow creation.
  • 📑 arXiv API – Source for fetching research papers automatically.
  • 🤖 OpenWebUI – AI-powered tool for summarizing fetched research papers.
  • 📝 Automation Workflow Guide – Step-by-step worksheet for building workflows in n8n.
  • 📊 Google Docs/Notion – Storing AI-processed summaries for easy reference.

Module Details

  • Step 1: Introduction to Automation – Instructor demonstrates how research workflows can be automated.
  • Step 2: Hands-on Setup – Participants build a simple automation in n8n to fetch research papers from arXiv.
  • Step 3: AI Summarization – Use OpenWebUI to summarize the fetched papers.
  • Step 4: Storing Results – Store summarized findings in Google Docs/Notion for organized research.
  • Step 5: Group Discussion – How can automation be further integrated into research?
  • How can automation improve research efficiency?
  • What repetitive research tasks can be automated?
  • How can AI tools assist in organizing and summarizing research findings?
  • Hands-on Research Automation Build (a Python sketch of this pipeline follows the list below) – participants set up an automated workflow in n8n to:
    1. Fetch research papers from the arXiv API based on a specific topic (e.g. supply chain research).
    2. Send the data to OpenWebUI, where the AI generates a summary of each paper.
    3. Store the summarized findings in Notion or Google Docs as a structured, shareable knowledge base.
    4. Schedule recurring runs so that new research papers are automatically retrieved and processed every week.
  • Skill Development:
    • Integrating APIs for research automation (arXiv for papers, OpenWebUI for AI summarization).
    • Connecting AI-driven research workflows for efficiency.
    • Implementing and troubleshooting automated systems.
  • Supply Chain Applications:
    • Automating supplier risk analysis by tracking academic studies.
    • Creating live updates on real-time logistics disruptions.
    • Enhancing predictive forecasting analysis.
    • Streamlining research workflows to save hours of manual effort.
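
For reference, the sketch below is a plain-Python stand-in for the pipeline the n8n workflow automates: query the arXiv API, summarize each abstract through the assumed OpenWebUI endpoint, and append the results to a local Markdown file in place of the Notion/Google Docs step. The endpoint URL, API key, and model name are placeholders; in the workshop itself these steps are built visually in n8n.

```python
# Stand-in for the n8n workflow: fetch recent arXiv papers, summarize each
# abstract via the assumed OpenWebUI endpoint, and append to a Markdown digest.
import urllib.parse
import feedparser   # pip install feedparser
import requests

ARXIV_API = "http://export.arxiv.org/api/query"
OPENWEBUI_URL = "http://localhost:3000/api/chat/completions"  # assumed endpoint
API_KEY, MODEL = "YOUR_API_KEY", "llama3"                     # placeholders

def fetch_papers(query: str, max_results: int = 5):
    """Return the newest arXiv entries matching the query (Atom feed)."""
    params = urllib.parse.urlencode({
        "search_query": f"all:{query}",
        "start": 0,
        "max_results": max_results,
        "sortBy": "submittedDate",
        "sortOrder": "descending",
    })
    return feedparser.parse(f"{ARXIV_API}?{params}").entries

def summarize(text: str) -> str:
    """Ask the OpenWebUI model for a short bullet summary of one abstract."""
    r = requests.post(
        OPENWEBUI_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": MODEL, "messages": [
            {"role": "user",
             "content": f"Summarize this abstract in 3 bullet points:\n\n{text}"}]},
        timeout=120,
    )
    r.raise_for_status()
    return r.json()["choices"][0]["message"]["content"]

# Append title, link, and summary for each paper to a local digest file.
with open("supply_chain_digest.md", "a", encoding="utf-8") as digest:
    for paper in fetch_papers("supply chain predictive analytics"):
        digest.write(f"## {paper.title}\n{paper.link}\n\n{summarize(paper.summary)}\n\n")
```
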
  • Learning outcomes:
    • Participants build a working automation for real-world research and leave with a custom workflow that fetches, summarizes, and organizes supply chain research automatically and that they can refine and expand.
    • They gain hands-on experience in integrating AI and automation tools into research processes.
    • They understand how automation reduces manual workload, making research more efficient and scalable.
    • They see how AI-powered automation can be applied in logistics analytics to track real-time disruptions, supplier risks, and predictive forecasting trends.
  • Interactive games:
    • Automation Race – Each participant builds an n8n workflow to fetch and summarize real-time logistics research; the first working setup wins.
    • “What Went Wrong?” Challenge – Groups troubleshoot a pre-made broken automation workflow, learning debugging skills along the way.
    • Workflow Showcase – Each group presents their automation: how they built it, what problem it solves, and how it could be expanded.
  • Why automate research? Introduction to the benefits of research automation, focusing on logistics and supply chain applications.
  • Step-by-Step Workflow Setup:
    1. Setting up n8n and creating a workflow.
    2. Connecting the arXiv API to automate data retrieval.
    3. Integrating OpenWebUI to generate summaries and extract key insights.
    4. Saving structured outputs to Notion or Google Docs for easy reference.
    5. Scheduling the workflow for continuous research updates.
  • Debugging & Optimization – Recognizing common automation pitfalls and best practices for keeping workflows accurate and reliable (a defensive-coding sketch follows this list).
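
To make the pitfalls point concrete, the sketch below shows the sort of defensive checks the debugging exercise practices: request timeouts, simple retries, and a guard against empty result sets. The helper names and retry policy are illustrative only, not part of any n8n or OpenWebUI API.

```python
# Illustrative defensive wrappers for the HTTP calls in the earlier sketches:
# timeouts, simple retries, and a check for empty results before summarizing.
import time
import requests

def post_with_retries(url, *, retries=3, delay=5, **kwargs):
    """Retry transient network failures instead of silently losing a weekly run."""
    for attempt in range(1, retries + 1):
        try:
            response = requests.post(url, timeout=60, **kwargs)
            response.raise_for_status()
            return response
        except requests.RequestException as exc:
            print(f"Attempt {attempt} failed: {exc}")
            if attempt == retries:
                raise
            time.sleep(delay)

def check_results(papers):
    """An empty fetch usually means a bad search query, not a broken summarizer."""
    if not papers:
        raise ValueError("arXiv query returned no papers - check the search terms.")
    return papers
```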