Determinantal Point Processes for Prompt Engineering for LLMs

Type of thesis: Master's thesis / Location: Dresden / Status of thesis: Open

The performance of a large language model (LLM) is sensitive to the way it is prompted. Automated prompt engineering methods aim to find suitable prompts for a given task by sampling several candidate prompts and evaluating them. Existing automatic prompt engineering methods either do not generate sufficiently diverse candidate prompts or rely on a collection of meta-prompting tricks to achieve the desired results. In this thesis, we will use a prompt-selection method that directly optimises both diversity and estimated performance by exploiting so-called determinantal point processes. The thesis will involve comparisons of this technique to state-of-the-art prompt engineering methods such as PromptBreeder from DeepMind.
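To give a flavour of the core idea, the following is a minimal sketch (not the method to be developed in the thesis) of how a determinantal point process can trade off prompt quality against diversity. It assumes that each candidate prompt already has an embedding vector and an estimated quality score; the kernel L = diag(q) S diag(q) combines these, and greedy MAP inference selects a subset whose log-determinant rewards both high quality and mutual dissimilarity. All function and variable names here are illustrative.

```python
import numpy as np


def dpp_greedy_map(quality, embeddings, k):
    """Greedily select k prompts under a DPP with kernel L = diag(q) S diag(q).

    quality:    array of shape (n,), estimated performance of each prompt
    embeddings: array of shape (n, d), embedding of each prompt
    k:          number of prompts to select
    """
    # Cosine-similarity matrix S of the prompt embeddings.
    V = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    S = V @ V.T
    # Quality-diversity kernel: L_ij = q_i * S_ij * q_j.
    L = np.outer(quality, quality) * S

    selected, remaining = [], list(range(len(quality)))
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in remaining:
            idx = selected + [i]
            # Log-determinant of L restricted to the candidate subset;
            # it is large when the prompts are good *and* diverse.
            sign, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            gain = logdet if sign > 0 else -np.inf
            if gain > best_gain:
                best, best_gain = i, gain
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emb = rng.normal(size=(20, 8))        # hypothetical prompt embeddings
    q = rng.uniform(0.5, 1.0, size=20)    # hypothetical quality estimates
    print(dpp_greedy_map(q, emb, k=5))
```

In this sketch, exact sampling from the DPP is replaced by simple greedy maximisation; an actual implementation could instead use established DPP sampling or conditioning techniques to explore the prompt space.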

Requirements:

  • Excellent, long-standing interest in and knowledge of mathematics
  • Good programming skills in Python; experience with PyTorch is optional

Contact

Dr. Sahar Vahdati

TU Dresden

Nature-Inspired Machine Intelligence
