Cezary Gesikowski
Nov 29, 2023

I've used various commands in Instructions. It took a lot of testing, and I asked white-hat hackers to try to break in; they were able to get past the commands and extract the contents. There is no way to protect the knowledge base inside a GPT itself: secret files have to be hidden behind an Actions call to a properly secured API, and you need to be a coder/developer to set that up properly.

It appears GPTs were constructed that way deliberately, so both the instructions and the knowledge files you upload to them are effectively public. This is possibly a way to prevent people from using GPTs to distribute copyrighted materials (books, manuals, etc.).

You can search the gptshunter.com website to find GPTs and the names of the files uploaded to them; here is one of my GPTs in their listings: https://www.gptshunter.com/gpt-store/MjQxYjE3MTU1NDBlMjYxNDJm
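To make the Actions approach concrete, here is a minimal sketch of what the secured API side could look like: a tiny server that keeps the secret material server-side and only answers authenticated requests. Everything in it is illustrative rather than taken from OpenAI's documentation; the choice of FastAPI, the `x-api-key` header, the `ACTIONS_API_KEY` variable, and the `KNOWLEDGE` dictionary are all assumptions made for this example.

```python
# Minimal sketch of a secured endpoint for a GPT Action.
# Assumptions (not from OpenAI docs): FastAPI as the framework,
# an x-api-key header for auth, and an in-memory KNOWLEDGE store.
import os

from fastapi import FastAPI, Header, HTTPException

app = FastAPI()

# The secret never ships with the GPT; it stays on your server.
API_KEY = os.environ["ACTIONS_API_KEY"]

# Hypothetical private knowledge base kept server-side instead of
# being uploaded as a GPT knowledge file.
KNOWLEDGE = {
    "pricing": "Internal pricing notes...",
    "roadmap": "Unreleased feature plans...",
}

@app.get("/lookup")
def lookup(topic: str, x_api_key: str = Header(...)):
    # Reject any call that does not carry the key configured in the
    # GPT Action's authentication settings.
    if x_api_key != API_KEY:
        raise HTTPException(status_code=401, detail="Unauthorized")
    if topic not in KNOWLEDGE:
        raise HTTPException(status_code=404, detail="Unknown topic")
    # Return only the snippet the model asked for, never the whole file.
    return {"topic": topic, "content": KNOWLEDGE[topic]}
```

In the GPT builder you would then register this endpoint as an Action (via its OpenAPI schema) and use the Action's API-key authentication option, so the key is stored in the Action's settings rather than in the instructions or knowledge files that can be extracted.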

If you are interested in the OpenAI developer forum discussions, check out this thread, which explains it in more detail: https://community.openai.com/t/concerns-about-file-information-extraction-from-gpts-uploads/521501/9

Written by Cezary Gesikowski

Human+Artificial Intelligence | Photography+Algography | UX+Design+Systems Thinking | Art+Technology | Philosophy+Literature | Theoria+Poiesis+Praxis
