Mar 31, 2026

Turnkey Trading Partners ("Turnkey") has long recognized that the National Futures Association ("NFA") is likely to increase oversight of AI tools. Early on, inquiries were relatively simple: firms were merely asked whether they used AI in their business. More recently, however, Turnkey has observed a shift toward a standardized, in-depth approach: the NFA is now deploying a detailed Artificial Intelligence Questionnaire ("AIQ") alongside its examinations.

At first glance, the AIQ appears straightforward: it seeks to understand how member firms are using AI in trading, risk management, client communications, and surveillance. But there are two distinct takeaways.

First, the NFA is asking firms to provide detailed information about their AI usage. This includes everything from large language models ("LLMs") such as ChatGPT to predictive modeling, third-party tools, and internally developed systems. More importantly, it isn't just about usage: examiners are focused on governance, monitoring, risk management, and policy alignment. Firms need to show that AI is integrated responsibly and under proper oversight.

Second, the subtler implication: while the NFA has not explicitly confirmed the use of AI in its audit process, the structure and depth of the questionnaire point toward a more systematic and data-driven approach to examinations. The AIQ standardizes how information is collected, making it easier to identify inconsistencies, gaps, or weak controls across firms. Whether powered by advanced analytics, automation, or simply more structured review processes, the result is the same: vague or unsupported responses are far more likely to stand out than in prior exam cycles.
What the NFA Is Actually Asking

At a high level, the AIQ breaks down into four core areas:

AI Utilization
- Where and how AI is being used across the firm
- Whether tools are internally developed or third-party
- Use of LLMs and controls around employee access
- Any application in forecasting or predictive modeling

Governance
- Written policies and procedures governing AI use
- How firms monitor employee compliance
- Identification and mitigation of conflicts of interest

Monitoring & Training
- How models are trained and what data is used
- Ongoing monitoring for accuracy, bias, and degradation
- Cybersecurity considerations and data protection
- Oversight personnel and qualifications

Specific Use Cases
- AI in AML, fraud detection, and surveillance
- Use in risk management or trading strategies
- Client-facing tools such as chatbots
- Whether AI-generated outputs are distributed externally
- How AI usage is reflected (or not) in promotional materials

The Real Risk: Saying Too Much or Too Little

Where firms tend to get into trouble with something like this is on either end of the spectrum. Some will understate usage, treating AI tools as "informal" or outside the scope of supervision. Others will overstate capabilities in a way that isn't supported by policies, controls, or documentation. Both approaches create the same problem: a mismatch between what's said and what can be substantiated. And that's ultimately what this questionnaire is built to expose.

How to Approach It

The most effective way to think about the AIQ is not as a technology disclosure, but as an extension of existing NFA expectations. If AI:

- touches trading, it's a risk management and disclosure issue
- touches communications, it's a supervisory and promotional material issue
- touches data, it's a books and records and cybersecurity issue

Nothing about those standards is new. What's new is the level of visibility.
Bottom Line

The AIQ isn't just about artificial intelligence; it's about whether firms are applying the same level of discipline to AI that they're expected to apply everywhere else. To prepare for your next NFA audit, contact Turnkey Trading Partners today.