I recently reposted a LinkedIn post by Autodesk’s Daniela Steinsapir on an upcoming event called Hack the Future, Generative Red Team Challenge, which will be held at Autodesk University - the company’s annual design and make conference. As I heard a bit more about this “red teaming” activity, I wanted to learn more, so I reached out to Daniela. Here’s the interview – enjoy! If you have further questions, please reach out to Daniela Steinsapir directly.
Aliza (that’s me): Hi Daniela, thanks so much for taking the time to share a bit more about this red teaming event that will be held at Autodesk University. I appreciate it.
Daniela: Hi Aliza. It’s my pleasure and I’m glad to share.
Aliza: Let’s jump right in – please share a little about yourself.
Daniela: Sure. So, I'm a senior manager of experience architecture in the Platform Services and Emerging Technologies group at Autodesk. I work across research and strategic technologies. I've been an experience designer for the last 20 years, and I have a background in educational technology.
Aliza: Thanks, Daniela. It’s great to learn about your background. I recently reposted your LinkedIn post about an event at AU. I would love to learn more. Can you tell us what’s happening?
Daniela: Yes. Autodesk University is a very large conference, where Autodesk hosts our fabulous customers to come together, learn what's new, and gain new skills. It is also an opportunity for “Autodeskers” like me to really learn and get feedback from our customers.
So, this event is our big “show and tell” where we get to share our future plans. We also host classes and conduct product demonstrations. I would describe AU as a big “learn and share” conference.
Aliza: Oh, I like that. So, what are you driving at Autodesk University?
Daniela: We are hosting the first-ever, hands-on red teaming experience at AU. This is the first time our customers, with the guidance of Dr. Rumman Chowdhury, will learn about red teaming.
In the last ten years, Autodesk Research has conducted unique and original research in AI and machine learning. One of Autodesk’s goals is to build responsible AI and, as a part of this goal, we are using this red team workshop as an opportunity to learn more about the trust and ethical concerns our customers have. We will offer the opportunity to a select group of customers to perform red teaming exercises and to learn about this unique research. We're super excited! We will be giving early access to different models and experiences where our customers will help us identify their concerns and potential negative outcomes.
Traditionally, we think about “red teaming” in the field of cybersecurity. For AU, we're bringing this practice to generative AI - looking for negative outcomes and seeing what might go wrong with AI within the fields of construction, architecture, and manufacturing. We are learning as we go, and we cannot wait to share, test, and learn from this experience at AU.
Forty customers will participate. We're also hosting feedback sessions—discussions to learn about the potential negative outcomes around trust and ethics. We intend to build responsible, transparent, ethical AI tools. We want these AI technologies to deliver value in a way that is safe for everyone and their organizations.
As a community, when we practice red teaming and identify concerns, we can join together and make it better for everybody. In a nutshell, that’s what we're doing at Autodesk University. We are gathering as a community to have conversations about trust and ethics for generative AI within the industries we serve.
Aliza: Wow, this sounds exciting. Daniela, do you mind if we go back a little bit and describe red teaming a bit more?
Daniela: Absolutely. I’ve got a great summary. This is a new field. Red teaming is an exercise that involves questioning plans, policies, systems, and assumptions through an adversarial approach.
At Autodesk, we recognize the importance of responsible AI development and implementation. At AU, we're inviting a group of customers to experience this hands-on red teaming exercise. So, what do we mean by red teaming? It’s the practice of testing things with a focus on potential negative outcomes. We create scenarios designed to trigger failures so we can discover what negative outcomes might emerge. I don’t mean to sound alarmist. The ultimate goal is to create responsible, ethical, and transparent technology that offers dramatic benefits for customers and value for their organization or business.
The more that we learn about red teaming, the more we learn how to identify and trigger negative outcomes. Basically, we will explore how to trigger negative outcomes in these exercises in order to prevent or mitigate them.
At AU, we will ask our participants to get into the mindset of looking for - and deliberately triggering - negative outcomes. Then, we can actually learn how to address them. We will then ask our customers why something is wrong and how we could respond.
We will be exploring topics like intellectual property in the field of architecture, which we believe our customers are concerned about. We are approaching these important topics to explore what concerns exist and why. We’ll get examples and we’ll start to unpack these themes to learn how to deploy AI responsibly, learn how to mitigate any potential risk, and learn how to make AI transparent, better, and safer for all.
Aliza: I know you only have 40 spots, but how can someone who is going to AU participate?
Daniela: Right now, we still have space for more participants to submit an application. Customers attending Autodesk University can visit the Autodesk Research blog: https://www.research.autodesk.com/blog/participate-in-ai-research-at-autodesk-university/ Seats are limited and participation is by invitation only.
Aliza: I learned so much. Thanks again for taking time to share with me.
Daniela: My pleasure and we are excited for this opportunity at Autodesk University.