TauchiGPT_V2: An Offline Agent-based Opensource AI Tool designed to Assist in Academic Research

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Recent progress in artificial intelligence, particularly deep learning, has ushered in a new era of autonomously generated content spanning text, audio, and visuals. Large Language Models (LLMs) such as ChatGPT, Llama2, Claude, and PaLM 2 are now capable not only of filling gaps within user-generated content but also of creating unique content of their own, following predefined styles, formats, and writing techniques. With selective modelling and fine-tuning on relevant training data, LLMs can produce original content for a wide range of tasks previously considered solely the domain of human creativity. In academic research and development, however, this AI renaissance has yet to make a meaningful impact in the pedagogical domains. Crafting a tailored R&D instrument adept at intricate research procedures has previously presented a formidable challenge in terms of expertise, time, and fiscal resources. Within this context, Generative Pre-trained Transformers (GPT) and their foundational architectures offer a beacon, given their potential to exploit pre-trained LLMs for optimizing standard research operations. Our previous work on autonomous agents shows that existing tools and deductive reasoning techniques built on the LangChain framework can be combined into a customized tool for academic research. This study builds on that work in autonomous agents and open-source LLMs to develop TAUCHI-GPT_V2, a novel adaptation of the academic research assistant. TAUCHI-GPT_V2, conceptualized as an open-source initiative, is built on the LangChain architecture and employs LLaMA2-13b as the core LLM, ingesting users’ own data and files to provide highly relevant, contextual results. In this paper, we discuss how TAUCHI-GPT_V2 uses a custom, offline, localized vector database (vectorDB) to parse users’ personal files and return relevant contextual results within a chat interface. We also put the model to the test by having academic researchers use the tool in their daily workflow, and we report its efficacy and reliability both with respect to hallucinations and in citing relevant information to support academic research-related tasks.
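
For readers unfamiliar with the architecture the abstract describes, the following is a minimal sketch of an offline, LangChain-style retrieval pipeline: local files are chunked, embedded into a local vector store, and queried through a locally hosted LLaMA2-13b model. The folder path, model file, embedding model, and parameters are illustrative assumptions for the sketch, not the paper's actual implementation.

    # Illustrative sketch only: an offline retrieval-augmented chat pipeline
    # in the spirit of TAUCHI-GPT_V2 (LangChain + LLaMA2-13b + local vectorDB).
    # Paths, model files, and parameters are assumptions, not the authors' code.
    from langchain.document_loaders import DirectoryLoader, TextLoader
    from langchain.text_splitter import RecursiveCharacterTextSplitter
    from langchain.embeddings import HuggingFaceEmbeddings
    from langchain.vectorstores import FAISS
    from langchain.llms import LlamaCpp
    from langchain.chains import RetrievalQA

    # 1. Ingest the user's personal files from a local folder (hypothetical path).
    docs = DirectoryLoader("my_research_papers/", glob="**/*.txt", loader_cls=TextLoader).load()
    chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

    # 2. Embed the chunks into a local, offline vector store (FAISS stands in for the vectorDB here).
    embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
    vectordb = FAISS.from_documents(chunks, embeddings)

    # 3. Load a quantized LLaMA2-13b chat model locally via llama.cpp (hypothetical model file).
    llm = LlamaCpp(model_path="models/llama-2-13b-chat.Q4_K_M.gguf", n_ctx=4096, temperature=0.1)

    # 4. Wire retrieval and generation together; source documents are returned so answers can cite them.
    qa = RetrievalQA.from_chain_type(
        llm=llm,
        chain_type="stuff",
        retriever=vectordb.as_retriever(search_kwargs={"k": 4}),
        return_source_documents=True,
    )

    answer = qa({"query": "Summarise the key findings in my notes on autonomous agents."})
    print(answer["result"])
    for src in answer["source_documents"]:
        print("cited:", src.metadata.get("source"))
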
Original language: English
Title of host publication: Human Interaction and Emerging Technologies (IHIET-AI 2024)
Subtitle of host publication: Artificial Intelligence and Future Applications
Editors: Tareq Ahram, Redha Taiar
Place of Publication: USA
Publisher: AHFE International
Pages: 172-185
Number of pages: 14
Volume: 120
ISBN (Electronic): 978-1-958651-96-4
DOIs
Publication status: Published - 2024
Publication type: A4 Article in conference proceedings
Event: International Conference on Human Interaction and Emerging Technologies - Lausanne, Switzerland
Duration: 25 Apr 2024 – 27 Apr 2024

Publication series

Name: AHFE International
ISSN (Electronic): 2771-0718

Conference

Conference: International Conference on Human Interaction and Emerging Technologies
Country/Territory: Switzerland
City: Lausanne
Period: 25/04/24 – 27/04/24

Keywords

  • Artificial Intelligence
  • Large Language Models
  • Generative Pre-trained Transformers
  • Human-Computer Interaction
  • Opensource LLM Models

Publication forum classification

  • Publication forum level 1
