Meet PriomptiPy: A Python Library to Budget Tokens and Dynamically Render Prompts for LLMs

In a notable step for Python-based conversational AI development, the Quarkle development team recently unveiled “PriomptiPy,” a Python implementation of Cursor’s Priompt library. The release brings the priority-based prompting features of Cursor’s stack to any large language model (LLM) application, including Quarkle’s own product.

PriomptiPy, a fusion of “priority,” “prompt,” and “Python,” is a prompting library designed to streamline the complex task of token budgeting. Conversations with extensive context – book excerpts, summaries, instructions, conversation history, and more – can easily grow to 8–10K tokens. With PriomptiPy, the Quarkle team aims to give developers a tool for building robust AI systems without drowning in a sea of if/else statements or inflating their AI bills.
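The core idea can be sketched in plain Python: assign each piece of context a priority, then keep the highest-priority pieces that fit the budget. The sketch below illustrates the technique only – it is not PriomptiPy’s actual API; `count_tokens` and `budget_prompt` are hypothetical names, and whitespace splitting stands in for a real tokenizer.

```python
def count_tokens(text: str) -> int:
    """Crude stand-in for a real tokenizer such as tiktoken."""
    return len(text.split())

def budget_prompt(items, max_tokens):
    """Keep the highest-priority (priority, text) items that fit max_tokens.

    Items are tried from highest to lowest priority, but the survivors are
    re-emitted in their original order so the prompt stays coherent.
    """
    kept, used = [], 0
    for idx, (priority, text) in sorted(enumerate(items), key=lambda e: -e[1][0]):
        cost = count_tokens(text)
        if used + cost <= max_tokens:
            kept.append(idx)
            used += cost
    return [items[i][1] for i in sorted(kept)]

# A system message and the latest user turn outrank an old excerpt.
items = [
    (10, "System: you are a helpful editor."),
    (2, "Excerpt: It was a dark and stormy night, the rain fell in torrents."),
    (8, "User: please review chapter one."),
]
print(budget_prompt(items, max_tokens=12))
```

Note that eviction happens per item, not per token, so a single low-priority excerpt is dropped whole rather than truncated mid-sentence.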

The journey toward PriomptiPy began when the Quarkle team hit a constraint – their WebSocket backend ran in Python, preventing them from using the Priompt library directly. Undeterred, they adapted Priompt to Python themselves, ensuring seamless integration with their existing infrastructure.

PriomptiPy mirrors the structure of Priompt, though the team acknowledges it is not yet as complete or powerful. Still, it is a promising start for developers eager to use prioritized prompting in their Python applications. The library introduces priority-based context management, which is invaluable in AI-enabled agent and chatbot development.


To illustrate its functionality, the Quarkle team provides a scenario where a conversation is managed with PriomptiPy. Their code snippet shows different message types – SystemMessage, UserMessage, and AssistantMessage – within a structured conversation. Wrapping content in a Scope assigns it a priority, ensuring that the most relevant messages are kept within the token limit. PriomptiPy renders content by priority and manages conversation flow dynamically – critical when token space is limited.
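A conversation along those lines can be mimicked in a few dozen lines of plain Python. The class and function names below echo the article, but the implementation is a simplified sketch of the pattern, not PriomptiPy’s real code; the whitespace token counter is again a stand-in for a proper tokenizer.

```python
from dataclasses import dataclass

@dataclass
class Message:
    role: str
    content: str

def SystemMessage(content): return Message("system", content)
def UserMessage(content): return Message("user", content)
def AssistantMessage(content): return Message("assistant", content)

@dataclass
class Scope:
    priority: int
    children: list

def render(scopes, token_limit, count=lambda m: len(m.content.split())):
    """Render the highest-priority scopes that fit within token_limit."""
    out, used = [], 0
    for i, scope in sorted(enumerate(scopes), key=lambda e: -e[1].priority):
        cost = sum(count(m) for m in scope.children)
        if used + cost <= token_limit:
            out.append(i)
            used += cost
    # Preserve the original conversation order for the survivors.
    return [m for i in sorted(out) for m in scopes[i].children]

# The system prompt and the latest exchange outrank stale history.
convo = [
    Scope(10, [SystemMessage("You are Quarkle, an AI editor.")]),
    Scope(5, [UserMessage("Summarize my draft."),
              AssistantMessage("Sure, send it over.")]),
    Scope(1, [UserMessage("Here is an older, less relevant exchange that can be dropped.")]),
]
for m in render(convo, token_limit=15):
    print(m.role, ":", m.content)
```

With a 15-token limit, the lowest-priority scope is dropped whole while the system prompt and the recent turns survive intact.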

The library introduces logical components – Scope, Empty, Isolate, First, Capture, SystemMessage, UserMessage, AssistantMessage, and Function – each serving a specific purpose in constructing prompts for AI models. While PriomptiPy simplifies prompt management, the Quarkle team stresses that priorities should be chosen carefully to keep prompts efficient and cache-friendly.
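The First component, for example, can be read as a fallback chain: try children in order and render the first one that fits the remaining budget, falling back to nothing (an Empty result) otherwise. The sketch below is one interpretation of that behavior with hypothetical names, not the library’s implementation.

```python
def first_that_fits(candidates, remaining_tokens,
                    count=lambda s: len(s.split())):
    """Render the first candidate that fits the remaining token budget.

    Returns None when nothing fits, playing the role of an Empty result.
    """
    for text in candidates:
        if count(text) <= remaining_tokens:
            return text
    return None

# Prefer the detailed summary; fall back to the short one under pressure.
detailed = "A long, detailed chapter summary that uses quite a few tokens."
short = "Brief chapter summary."
print(first_that_fits([detailed, short], remaining_tokens=5))
```

This is what makes such components useful for graceful degradation: the prompt shrinks to a terser variant instead of being cut off mid-thought.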

As for caveats, PriomptiPy does not yet support runnable function calling and capturing; both are on the roadmap for future development. Caching also remains a challenge the team is eager to address with community support. The Quarkle team welcomes contributions to PriomptiPy, which is open source under the MIT license.

Niharika is a technical consulting intern at Marktechpost. She is a third-year undergraduate pursuing her B.Tech at the Indian Institute of Technology (IIT), Kharagpur. She is highly enthusiastic, with a keen interest in machine learning, data science, and AI, and an avid reader of the latest developments in these fields.


