This is my secret to perfect answers from Gemini



By now, you've probably written your fair share of prompts for AI chatbots like Gemini or ChatGPT. With large language models (LLMs) doing the heavy lifting behind the scenes, it doesn't take much instruction for Gemini or ChatGPT to produce a decent output. However, the quality of the prompt you give an AI model directly correlates with the quality of the output you get back. If you want better responses from your AI chatbots, you need to give them better prompts; it really is a science.

To level up your prompts, you'll want to start prompt engineering, which involves optimizing and crafting LLM inputs specific to your task. I'm a hardcore Gemini user, and I recently learned about a prompt engineering technique called meta-prompting. Essentially, this method uses AI to generate prompts that will eventually be fed into a chatbot. It's an easy way to improve your prompting skills while barely lifting a finger, and if you aren't using it, you're already behind the curve.

Broad instructions used to generate specific prompts

Using meta prompting to generate an image with AI. Credit: Brady Snyder / MakeUseOf

If you're new to the world of prompt engineering (like I was just a few weeks ago), this Prompt Engineering Guide is a valuable resource as you learn the various strategies for writing successful AI input prompts. Let's start by explaining what meta-prompting actually is—it's a prompting technique that helps AI chatbots like Gemini understand how to approach a problem. Meta prompts add structure and syntax to guide the LLM powering a chatbot in the right direction.

You can write meta prompts yourself, but a quicker, easier way to start meta-prompting is to ask Gemini to generate the detailed inputs on its own. I got this idea from Google DeepMind UX Engineer Anna Bortsova, who uses Gemini to create meta prompts that are fed into Google AI tools like Veo 3. Bortsova's prompts created with Gemini can be pages in length, adding more detail and specificity than you'd get with instructions written by a human.

"There are no rules here—we're experimenting—but I’ve found a few things that help steer Gemini to really rich prompts," Bortsova writes. "You want to define a very specific task: 'write a detailed prompt that an LLM will understand.' And you want to be clear about your format and style: say, an 8-second stop-motion animation of paper-engineered scenes. Then give it constraints, such as foil or shiny paper, rather than just general paper. Then let it do its thing."

If there's anyone who knows how to use Gemini best, it's a Google DeepMind engineer. So, I used Gemini to create my own meta prompts, and the results were stunning.
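The same two-step idea also works outside the Gemini app. As a rough illustration (not Bortsova's setup), here's a minimal TypeScript sketch that asks Gemini to write a detailed prompt following her recipe, assuming Google's @google/generative-ai Node SDK; the GEMINI_API_KEY environment variable, model name, and helper function are my own choices for the sketch:

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// The API key env var and model name are assumptions; swap in whatever you have access to.
const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
const model = genAI.getGenerativeModel({ model: "gemini-1.5-pro" });

// Step 1: ask Gemini to write the prompt, following Bortsova's recipe:
// a very specific task, a clear format and style, and concrete constraints.
async function writeMetaPrompt(format: string, constraints: string): Promise<string> {
  const request = [
    "Write a detailed prompt that an LLM will understand.",
    `Format and style: ${format}`,
    `Constraints: ${constraints}`,
  ].join("\n");

  const result = await model.generateContent(request);
  // Step 2 happens elsewhere: feed this text to Veo, Antigravity, or another tool.
  return result.response.text();
}

writeMetaPrompt(
  "an 8-second stop-motion animation of paper-engineered scenes",
  "use foil or shiny paper rather than just general paper"
).then(console.log);
```

Whatever Gemini returns is the meta prompt; the second step is simply pasting that output into Veo, Antigravity, or whichever tool you're actually targeting.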


Meta-prompting changed the way I use AI

Gemini helps me generate photos and build custom apps

Meta-prompting with Gemini expands what is possible when using AI apps, because detailed and optimized prompts become the standard. When I used Google's Antigravity app to build the custom Android and web app pictured above, I didn't write the prompt given to Antigravity. Instead, I gave the following prompt to Gemini, asking it to write instructions for the agentic integrated development environment (IDE):

Create a detailed prompt for an AI-based IDE that will build a simple play-counter app for every title in this user's discogs library: https://www.discogs.com/user/dippinthrutherecords

There should be a simple interface in the web app where I can tally how many times I've played each release and see a ranking of the ones I've played most

only include albums that I have in the discogs collection

Gemini returned a detailed prompt that, frankly, I don't fully understand. The meta prompt created with Gemini is extremely specific, defining the structure and syntax Antigravity needs to fulfill my request, and it includes coding knowledge deep enough to point the AI agent in the right direction. All of this came from the very simple prompt you see above, and I was thrilled with the results.

Here's the meta prompt Gemini came up with for Antigravity:


# Role

You are an expert React/Next.js developer with experience in third-party API integration and local state management.

# Project Goal

Build a personal web application called "Vinyl Spin Counter."

The app must fetch the record collection of a specific Discogs user and provide a simple interface to track how many times each record has been played.

# Data Source

* **Target User:** `dippinthrutherecords`

* **Source URL:** https://www.discogs.com/user/dippinthrutherecords

* **API:** Use the Discogs API (`GET /users/{username}/collection/folders/0/releases`) to fetch the collection.

* *Note:* Focus only on the standard "Collection" (Folder ID 0). Do not fetch the Wantlist.

# Tech Stack

* **Framework:** Next.js (App Router).

* **Styling:** Tailwind CSS.

* **Icons:** Lucide React (for play buttons/ranking icons).

* **Data Persistence:** Use `localStorage` for the MVP (Minimum Viable Product) to store the play counts so data persists between page reloads.

* **State Management:** React Hooks (Zustand or standard `useState`/`useEffect`).

# Core Features

## 1. The Collection Grid (Home View)

* Fetch all releases from the `dippinthrutherecords` collection.

* Display them in a responsive grid (using Tailwind).

* **Card Layout:**

* Album Art (Thumbnail).

* Artist Name & Album Title (Truncated if too long).

* **Prominent Counter UI:** A large number showing "Total Spins" with a large `+` button to increment the count and a smaller `-` button to correct mistakes.

## 2. The Leaderboard (Stats View)

* A separate tab or modal showing a ranked list of the "Top 10 Most Spun Records."

* Sort the data based on the highest play counts stored in the local state.

## 3. Search & Filter

* A simple text search bar at the top to filter the grid by Artist or Album Name instantly.

# Step-by-Step Implementation Plan

1. **Setup & Types:** Initialize the project and define the TypeScript interface for a `DiscogsRelease`.

2. **API Client:** Create a utility function to fetch the collection data. *Important: Handle pagination if the collection is large (Discogs pages results).*

3. **State Logic:** Create a custom hook `useSpinTracker` that:

* Loads play counts from `localStorage` on mount.

* Merges play counts with the fetched Discogs data (using the Release ID as the key).

* Exposes `incrementSpin(id)` and `decrementSpin(id)` functions.

4. **UI Components:** Build the `ReleaseCard` and `Leaderboard` components.

5. **Assembly:** Assemble the main page.

# Special Instructions

* **API Token:** If the Discogs API requires authentication for this public collection, allow me to define a `DISCOGS_TOKEN` in a `.env.local` file and use it in the headers.

* **Aesthetic:** Dark mode by default. Use a "Vinyl Record" aesthetic (dark greys, circular motifs for avatars).

Let's start by setting up the project and defining the API fetch logic.
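To give a sense of what that plan is actually asking for, here's a rough sketch of my own (not part of Gemini's output, and not Antigravity's generated code) of steps 2 and 3: a paginated Discogs collection fetch and the useSpinTracker hook. The response field names and the User-Agent requirement are assumptions about the Discogs API, so verify them against the real payload:

```typescript
import { useEffect, useState } from "react";

interface DiscogsRelease {
  id: number;
  basic_information: {
    title: string;
    artists: { name: string }[];
    thumb: string;
  };
}

// Step 2: fetch every page of the user's collection, since Discogs paginates results.
export async function fetchCollection(username: string): Promise<DiscogsRelease[]> {
  const releases: DiscogsRelease[] = [];
  let page = 1;
  let totalPages = 1;
  do {
    const res = await fetch(
      `https://api.discogs.com/users/${username}/collection/folders/0/releases?page=${page}&per_page=100`,
      { headers: { "User-Agent": "VinylSpinCounter/0.1" } } // Discogs expects a User-Agent
    );
    const data = await res.json();
    releases.push(...data.releases);
    totalPages = data.pagination.pages;
    page += 1;
  } while (page <= totalPages);
  return releases;
}

// Step 3: play counts keyed by Discogs release ID, persisted to localStorage.
export function useSpinTracker(storageKey = "spin-counts") {
  const [counts, setCounts] = useState<Record<number, number>>({});

  // Load saved counts once on mount.
  useEffect(() => {
    const saved = localStorage.getItem(storageKey);
    if (saved) setCounts(JSON.parse(saved));
  }, [storageKey]);

  // Persist whenever a count changes.
  useEffect(() => {
    localStorage.setItem(storageKey, JSON.stringify(counts));
  }, [counts, storageKey]);

  const incrementSpin = (id: number) =>
    setCounts((c) => ({ ...c, [id]: (c[id] ?? 0) + 1 }));
  const decrementSpin = (id: number) =>
    setCounts((c) => ({ ...c, [id]: Math.max(0, (c[id] ?? 0) - 1) }));

  return { counts, incrementSpin, decrementSpin };
}
```

Note that localStorage only exists in the browser, so in a Next.js App Router project a hook like this belongs in a client component.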


With only a few follow-up prompts, Antigravity was able to create my custom Vinyl Spin Counter app based on Gemini's instructions. The working, installable Android APK demonstrated all the benefits of meta prompting. The final output was better than it would've been had I given Antigravity a simple, human-written prompt. Additionally, the meta prompting strategy was more efficient, requiring fewer requests. This helps power users avoid hitting rate limits or wasting tokens too quickly.

While you can use an AI chatbot to help write meta prompts for complex tasks, it'll work for simple ones, too. In the gallery below, you can see the difference between images created with prompts I wrote myself and ones made with meta prompts from Gemini:

In my view, there are very few situations where you wouldn't want an AI chatbot to draft prompts on your behalf. There's almost no downside: it's often quicker than painstakingly writing a custom prompt yourself, and the result packs in additional details that improve the quality of Gemini's output.

Anytime specificity matters, use AI to write prompts

Meta prompting explaining the logic behind a prompt. Credit: Brady Snyder / MakeUseOf

By now, you're probably wondering when to use Gemini to generate meta prompts. Meta prompting can be useful for any LLM-related task, but it's most helpful for tasks and topics you aren't familiar with. For example, when I used Gemini to write a coding prompt for Google Antigravity, the meta prompt it generated included computer science knowledge I would've never known to include.

Whether you're trying to generate an image or build an app, meta prompting with Gemini can help you achieve better results.
