All Resources

CISO to CISO Webcast with Andy Steingruebl, CSO of Pinterest

Webcast and Podcast | Altitude Networks, September 8th, 2020

We are excited to welcome Andy Steingruebl, CSO of Pinterest, as our guest on the next CISO to CISO Webcast episode. 

Andy Steingruebl joined Pinterest in November 2018 as their first Chief Security Officer. He's been in security for over 20 years, collecting experience in a multitude of areas, especially within highly regulated environments like healthcare and financial services, including more than 12 years at PayPal. 

Read, Listen, and Subscribe to the Podcast

Sept_9_2020-Andy Steingruebl_CisoToCiso_trimmed.mp3 was automatically transcribed by Sonix. This transcript may contain errors.

Michael Coates:
Welcome, everyone. This is another edition of the CISO to CISO webcast. I'm your host, Michael Coates. I am the co-founder and CEO of Altitude Networks and the previous CISO of Twitter, hence my ability to be here in a CISO capacity. I'm super excited today to have Andy Steingruebl with us. He is the CSO of Pinterest, he was at PayPal previously for over a decade, and, from what I can tell, he has a deep background in Unix. Just super excited to talk to you today. Thanks for joining us, Andy.

Andy Steingruebl:
Yeah, happy to be here.

Michael Coates:
Great. And as we dive in, very quickly: this podcast and webcast is brought to you by Altitude Networks. We solve data security in cloud collaboration platforms like G Suite, Box, et cetera. So if that is a challenge you are facing, check us out at altitudenetworks.com. With that, let's dive in. So, Andy, I'm always curious to hear the journey that a CISO or CSO takes. It is a unique role that we end up in, and what I've found is that the path people take to get there is always different and varied. So talk about your path. Did you always want to be head of security, or did you just find yourself there one day, handed the baton as someone ran out the door?

Andy Steingruebl:
Yeah. You know, it reminds me of a Monster.com commercial where they had little kids being interviewed: when I grow up, I want to be a controller, or something like that. No, I didn't always want to be a CISO or a security person. I started doing computing stuff when I was nine; I learned to program when I was nine on a TRS-80, and my dad worked in computing and so on. But, you know, I went to school first for engineering and then physics, and it turns out I'm not very good at either of those, and I ended up getting a degree in philosophy. But I think you find out a lot about yourself not from how you say you want to spend your time, but from how you actually find yourself spending your time. And I found that I was spending all my time taking care of the Unix systems in the lab where I worked at school, and turned that into a full-time job. I tell people that being a Unix admin at a university, especially back in the 90s, half to two thirds of the job was security. And the insiders were the threats: your fellow students, or recently fellow students, were the people trying to cause trouble, back when quotas and limited disk space and things like that were a concern. So you spent a lot of your time on security, hardening systems and so on. I had a great mentor, a guy named Bob Bartlett, who taught me just tons about Unix, Unix security, how to automate things and so on. Even then, there was more than you could do manually.

Andy Steingruebl:
So automation was a big thing. And I did that for a couple of years at a university, and then at a pharmaceutical company, which is a very different environment, and then took a job full time as a security head at a startup, because I found that I was spending even more of my time on security stuff. I did that for a couple of years, then ended up moving from Chicago out to the West Coast here and took the job at PayPal. I moved back and forth between IC and manager roles a couple of different times in my career, and helped build the PayPal security team from like four people up to, when I left, I don't know, two hundred, something like that. I didn't do all of that, obviously, but it was quite a journey, going from a small team to a really big one. And when I left PayPal, I knew the next job I wanted was a CISO job. There are times when you have ideas of what actually works and doesn't work, and you want a chance to be responsible for them and get to try them. So I knew that what I wanted my next job to be was to try to run a security program and be responsible for the whole thing, and I wanted to do it at a place that was in a big growth phase and was an Internet business. I think that's where my skills are; you try to match some of your skills and experience to things. And so Pinterest came available, and it was a good match, I think.

Michael Coates:
It's fascinating, the origin of your dabbling in technology, you know, working on the home computer, so to speak. That was also my entry point. But also, as you mentioned, the university lab. I was in a similar position: I was the sysadmin of a lab in the psychology department. That's where I learned my initial security skills, because, possibly like you, I thought, well, this seems like an interesting field, but where exactly is the line between legal and illegal? I don't really need to shortchange my college career by getting arrested.

Andy Steingruebl:
Yeah, it's funny how things have changed in that world. It used to be that when there was an incident, somebody broke into your computer, some people would break back into the origin of the attack at the other university or something, and call that person on the phone and say, hey, by the way, you've got somebody who's broken into your computer, and they just broke into ours, and here's how they did it. And if you log in and look in the temp directory, you'll see a directory called blah, blah, blah. Now the CFAA and others say that's a really bad idea. Back in, I don't know, 93, statute of limitations or something, but back in 93, that was just how the world worked. You just helped everybody out. It was a big community and so on. The world has changed a little bit on that front, but not entirely. We're still, I think, a pretty tight-knit group of security folks who know each other.

Michael Coates:
Yeah, I agree, the community element is huge. I think, as we tell other people, sure, the companies we work at may be competing on various levels, but as security professionals we're kind of all in it together, unified against a problem that's plenty big. So we might as well help each other while we try to tackle it.

Andy Steingruebl:
Yeah, you know, I tell people it's the old joke about the two hikers who come across a bear in the woods, and one bends down to tie his shoes. I think we all know the story: hey, you're not going to outrun the bear. I don't have to outrun the bear, just you. But I find in security it's more often not that case. Instead, we're trying to, like, starve the predators. Robert Hansen has a cool anecdote about that involving prairie dogs. But, you know, I think we're more often trying to just make sure that the bad guys don't win, because a win gives them fuel for another day.

Michael Coates:
So very true. Very true. Well, speaking of bears, a bit of a segue: each CISO to CISO webcast, we pick a virtual location to be virtually at. Andy, tell us a little bit about where we are today, and why our shirts appear to match even though we didn't coordinate this.

Andy Steingruebl:
Yeah, I mean, that was just luck, by the way. So we chose, and for folks who are in California this won't be a surprise, but if you're not from California and haven't been following along, there have been a lot of fires here. And the oldest state park in California had a big fire, which hasn't happened in recorded memory, really, 50, 60-plus years. Big Basin Redwoods State Park, in the Bay Area, is a true gem. Fires came through; they burned down the visitor center and a bunch of buildings, but luckily a lot of the great redwoods are still standing; they're pretty resistant. So we chose that as our backdrop, a place I'd be today if I could. Not that this isn't fun, but if I could just be out hiking in the redwoods, that'd be pretty awesome.

Michael Coates:
Agreed. Agreed. So thinking about the CISO role as you have it, there are so many different risks that we encounter. We think about changes in technology. We think about different teams that are making decisions on their own, many of which we don't have full authority over to say, hey, you can't do that, per se; they're in different parts of the organization. So we have to lead by influence to some degree. But there are a lot of different ways you can tackle that. What have you found is a successful approach for you? And do you think it's unique to the company or the industry that you're in, or could it work for all CISOs?

Andy Steingruebl:
Yeah, that's what I'm struck by: the more I learn, the more I see how every company and situation differs. A lot of skills are transferable, but the culture at each organization is different, and how you solve problems is different. Pinterest is really great from a collaboration perspective. You know, I had a former boss, Michael Barrett, who once said something like, all I am is a teacher; half my job is just educating people on things. It's not using authority to go do stuff. It's sort of the authority to convene a meeting and educate people about a risk, in a place that takes it seriously. And that was one of my criteria for taking the job. So I'm pretty lucky that way, that I work at a place that really cares about security. And so my job is often to see a problem, whether it gets pointed out to me or I discover it myself, ask questions, and then convene the right people to solve the problem. Sometimes my job is just explaining to people that this thing is a problem, that something could happen here, and Pinterest is quite receptive. I can't remember a time, and it's coming up on two years, where I've said, hey, this is a problem, and somebody said, oh, I disagree. It's more like, OK, tell me more, and then let's figure out what we should do about it and on what sort of time frame.

Andy Steingruebl:
And so I view part of my job as being the convener of the right set of people to solve a problem, and helping vet the solution. But the solution doesn't always come from security. You pull in the right people who are smart on the various things, whether it's a product thing or an engineering thing or you don't even know; you pull a group of people in and you help them work through solving a problem. And very frequently the best solve doesn't come from the security person, or sometimes even the engineer. We had a great problem come up, a product issue, and the best solution came from the product manager, who came up with a really elegant solution that both solved the security problem and preserved a great user experience, which is key. I wouldn't have thought of that, because my focus was mostly on the technical side of it and not the user experience piece, and product managers have a different lens that they look at things with. So pulling together the right, diverse set of people who look at problems in different ways, that's really how you solve problems. Sometimes that's just engineers, but other times you need a lawyer or a product person or a design person or something like that to help look at one of those problems.

Michael Coates:
It's fascinating because, you know, at the beginning of our careers in security, many people think of excellence in security as building the most bulletproof system possible, period, full stop. And as you mentioned, as you rise up in the roles, you're not actually trying to build this completely impenetrable box. Of course we want to be secure, but just like you said, it's more about that education element, and even bringing the people to the table. And to some degree, what I've found is it's even giving them enough knowledge that they can take reasonable risks, because if we get to a spot that's so secure that nothing can go wrong, it's also probably not very usable, as you mentioned, from a UX perspective. Yeah.

Andy Steingruebl:
I mean, it's important to understand which security problems should really be solved by the security team, and which security problems are better solved not as a technical problem but as a sort of balancing of two competing goods or interests. Think account security: on the one hand, your users and the company want to stop attackers from getting into accounts, but you also want your users to get into their own accounts. And if you make it too hard, then nobody gets into any accounts, and that's also not good; the regular users don't find the product usable, don't use it or don't want to use it or whatever. So those balancing problems are where you need not just security folks, but product managers, people who can do user testing and experimentation. It turns out a lot of these problems don't have a priori answers; you can only figure out what actually works through testing and through experimentation. You can't come up with an engineered solution ahead of time; you have to try it out and see how people react to it.

Michael Coates:
Yeah, account security reminds me of a kind of classic realization, where as a security person you may say, oh, well, two-factor over SMS is academically insecure, therefore it cannot be used, period; we must force everyone to use an app on their phone for their token. And then you have a product manager or someone say, well, we have a global audience, and only 50 percent of our global audience even has smartphones. And then you go, all right, this just all went out the window. Those kinds of views and realizations you have to have in the room to build any sort of usable and secure system.

Andy Steingruebl:
Yeah, the world's a much bigger place. I mean, it's why you try to get people from various backgrounds, from different parts of the world, who've had different experiences and so on, because my own experience sitting here in San Jose, California, is not indicative of the whole world.

Michael Coates:
Yes, despite the continued funding of food delivery services and dry cleaning on demand, you know, building startups for Silicon Valley, we are not representative of the world, for sure. So, you know, another interesting element of security is how it spans technology, policy, and process, almost how you engage with the company itself and the employees. How do you think about balancing those elements? You can imagine the stereotypical West Coast company that's very open: everybody do whatever, just trust people. You've got the stereotypical bank where it's all rigidly locked down. And of course those are stereotypes. How do you think about that in building a company that can be both a successful company and a secure and safe company?

Andy Steingruebl:
Yeah, you know, I tend to think of it, I mean, a minute ago we talked about freedom and other things, but this is a case where I think what you do is you solve the things with automation and with rigid process where that makes sense, and you leave degrees of freedom for people to then innovate with a few things fixed in place. Certain problems really are fixable. Take phishing in the internal context: I could go and train everybody on how to spot it, or I can go and deploy FIDO U2F. I mean, I was part of the group that championed FIDO in the first place, though I wasn't the one who did the work; much smarter people working for me worked on the protocols. Jeff Ruber, folks like that, they're probably listening. But, you know, you invent or deploy certain of these technologies to solve problems in a deterministic way, because you only have limited attention span and budget. And I don't just mean that I don't have infinite funding; nobody does. There are only so many things you can go after. So you figure out what you can automate, what you can make into fixed things and solve comprehensively so you don't have to worry about them anymore, so that you can spend that same brainpower and time on other things that defy easy solutions.

Andy Steingruebl:
So, you know, things like, hey, come up with a standard set of web frameworks and so on that solve a bunch of common web security problems to the extent possible, cross-site scripting, CSRF, et cetera; make those standard and get everybody to use them. Now they can innovate around what products they build without having to touch some of the underlying componentry, and you don't have to train them on it; it just works by default and so on. To the extent you can, use a managed language, use something that's memory safe, so that there's just a certain set of things you don't have to worry about. And then it frees you up to spend that same time on the other problems that still exist, the ones you can't automate away in your environment. So it's not that there aren't some things you can definitively solve, but not all problems can be definitively solved, and that's where you want to go spend your time.
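
The escape-by-default idea he describes can be made concrete with a minimal sketch. This is an illustration only, not any framework's actual API; the helper names `render` and `Raw` are hypothetical. The point is that forgetting to think about escaping yields safe output, and the unsafe path requires an explicit, visible opt-out:

```python
import html

class Raw:
    """Explicit opt-out marker for values that are already trusted HTML."""
    def __init__(self, text):
        self.text = text

def render(template, **values):
    """Substitute values into a template, HTML-escaping each one by default.

    Escaping is the default, so a developer who forgets gets safe output
    rather than cross-site scripting; trusted HTML must be wrapped in Raw.
    """
    safe = {
        key: val.text if isinstance(val, Raw) else html.escape(str(val))
        for key, val in values.items()
    }
    return template.format(**safe)
```

For example, `render("<p>Hi, {name}!</p>", name="<script>")` produces `<p>Hi, &lt;script&gt;!</p>`, with no action required from the developer.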

Michael Coates:
Yeah, I love that approach: leverage computers to automate what can be automated, for sure. I found the same benefits, I guess it was years ago now, back in my time at Mozilla, where we worked to invert the defaults in the web framework, because we found ourselves doing kind of what you said. We'd go into assessments and say, look, you don't have a secure cookie flag, or you don't have output encoding enabled in this area. And then I'd go, this is silly to spend human time on. Why don't we just change the framework to turn these on by default? Then people inherit that, and if it breaks their code, at least it breaks while they're writing it and they can fix it instantly, versus later.
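
As a sketch of what inverting a framework default might look like (a hypothetical helper, not Mozilla's actual code), here is a Set-Cookie builder where the safe attributes are on unless the caller explicitly turns one off, making every deviation visible and greppable at the call site:

```python
def set_cookie_header(name, value, *, secure=True, httponly=True,
                      samesite="Lax", max_age=None):
    """Build a Set-Cookie header value with safe attributes on by default.

    A developer must pass secure=False (or httponly=False, samesite=None)
    explicitly to opt out, so the exception stands out in code review.
    """
    parts = [f"{name}={value}"]
    if max_age is not None:
        parts.append(f"Max-Age={max_age}")
    if secure:
        parts.append("Secure")
    if httponly:
        parts.append("HttpOnly")
    if samesite:
        parts.append(f"SameSite={samesite}")
    return "; ".join(parts)
```

A plain `set_cookie_header("sid", "abc123")` yields `sid=abc123; Secure; HttpOnly; SameSite=Lax`; the insecure variant only exists if someone writes the opt-out by hand.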

Andy Steingruebl:
Yeah. And you can detect deviations from normal, right? So you have a standard configuration, and if everybody's doing the thing you think is safe, then you only have to monitor for people who've turned it off or tweaked those defaults, rather than checking that everybody actually turned them on. Just make sure that if anybody has to turn them off, that's where you go spend your time. And it's automation, right? I've learned too many times, the hard way, in my career: you put something in place, you train a bunch of people to do a thing manually, repeatedly, and then over the course of years those people change jobs and roles, and now that thing stops happening, because it wasn't automated, it wasn't part of the framework, it wasn't just part of the automation that happens. It's sort of like if you didn't build a regression or unit test for it, it's not going to stay that way. It's the same sort of thing: automate the things you can to cut down on how many things you have to worry about. Then there's a set of known good things, and you can spend your time on the other stuff, the more challenging problems.
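
The monitor-only-the-opt-outs idea can be sketched in a few lines. The baseline settings below are invented for illustration; the technique is just comparing each service's configuration against a secure baseline and reporting only the overrides, which is far cheaper than re-verifying that every service turned every control on:

```python
# Hypothetical baseline of secure defaults; the setting names are
# illustrative, not from any real framework.
BASELINE = {
    "cookie_secure": True,
    "cookie_httponly": True,
    "output_encoding": True,
    "tls_min_version": "1.2",
}

def find_deviations(services):
    """Return (service, setting, value) for every override of the baseline.

    Services is a dict of {service_name: config_dict}. A setting that is
    unset simply inherits the default, so only explicit opt-outs surface.
    """
    deviations = []
    for service, config in sorted(services.items()):
        for setting, expected in BASELINE.items():
            actual = config.get(setting, expected)
            if actual != expected:
                deviations.append((service, setting, actual))
    return deviations
```

Run nightly over real configs, a report like this gives the security team a short, actionable list instead of a full audit.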

Michael Coates:
Now, what's fascinating about setting defaults secure for developers and the tech stack is applying that logic to the user base, where it almost doesn't work. When you think about how humans and computers interact, and that product experience, things get a little bit tricky, because, to your point on authentication, we could set it to be very secure by default, like, give us your 50-character password, and chaos would break loose. You've probably encountered a lot of these human-computer security conundrums at both Pinterest and PayPal.

Andy Steingruebl:
Yeah, I mean, it's funny. It's a hard challenge: on the one hand, you want to enlist users' help in staying secure. On the other hand, I think the Chrome team has published some stats that only a few percent of people ever even go and look at the settings inside the web browser and try to adjust them. So you can't really expect users to go in and make meaningful choices about what those defaults should be. It's entirely valid to have the settings there and make them adjustable; I do think users need to own their computer and be able to tweak and configure and so on. But you need to make sure some of those defaults are set right. We took that approach when we did HTTP Strict Transport Security. It was because we couldn't rely on people making good security choices, so we made the computer make them, and let a website set the policy it wanted, so that we weren't actually relying on users to make really smart security decisions.
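
The HSTS mechanism he describes boils down to a single response header the site sends over HTTPS; once a browser sees it, the browser upgrades future requests itself, taking the choice away from the user. A small illustrative helper (the function name is mine, not from any real server framework) that builds the header might look like:

```python
def hsts_header(max_age_days=365, include_subdomains=True, preload=False):
    """Build a Strict-Transport-Security header as a (name, value) pair.

    max_age is expressed in seconds on the wire; includeSubDomains and
    preload are the standard optional directives.
    """
    value = f"max-age={max_age_days * 86400}"
    if include_subdomains:
        value += "; includeSubDomains"
    if preload:
        value += "; preload"
    return "Strict-Transport-Security", value
```

With the defaults, this emits `Strict-Transport-Security: max-age=31536000; includeSubDomains`, i.e. a one-year policy covering subdomains.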

Andy Steingruebl:
And that's about setting secure defaults. But then, what should the warning be when something goes wrong? The only way to answer that is by doing actual research. I think Adrienne over at Google Chrome has published a bunch of great stuff on redesigning browser security warnings and so on. You only find out what works through actual experimentation on people and testing how they behave. You can't figure it out by just designing it and saying, now we've got it right. And you think you've done that, you know? I used to believe in EV certificates until some people proved how silly they are. They don't actually work; nobody pays attention to them, and we can prove that through actual study. And so we stopped believing in that. But it took research. Maybe it should have been obvious, but it took some research, and then you realized that it doesn't actually work.

Michael Coates:
We've sure come a long way. I think back to some of our browser warnings before, about mixed content on the page. Of course it made perfect sense to us: mixed content, I've got to watch out, the images might have been replaced. But everybody else in the world was like, I don't know, I just want to see the sports scores, or I just want to log in to my bank. You tell me, should I continue or not? And we're making good progress on that. I always give this analogy: you get in your car and you want to drive. You don't want the car to ask you, well, the left brake is such and such, are you sure you want to go? Like, I don't know, just work, or don't go.

Andy Steingruebl:
Yeah, yeah. I mean, there are interesting lessons to be learned from other domains. In the US anyway, I think it was the late 80s, early 90s, when there were some mandatory seatbelt standards. And there were a few years of cars that had the automated seatbelt, where the shoulder belt was on a rail: you got in the car, you closed the door and turned it on, and the belt came up, and so on. And people found it so annoying. Yes, it was on by default, and you couldn't help but put on your shoulder belt, but the problem was that some number of people found it so annoying that they actually disconnected it altogether rather than put up with that security control. So, you know, I think we're always in the business of trying something, but relying on some real evidence when we can. Sometimes you can't. We don't design crypto systems by trial and error and just figure out if they break; we try to use math and science to figure out that one of them is actually robust. But there are a lot of security problems that don't have that sort of solution, and the only way to figure them out is by testing and seeing what happens.

Michael Coates:
Now, if you were thinking about the security field, or if you're at a company and you're thinking, what should I care about, you might watch the headlines, because that's something you could care about, or maybe you'd even watch a few movies. And I think if you did, you would walk away and hire a bunch of people to sit in a dark basement and pound the keyboards loudly, because that's how you hack. I would posit that you'd be largely misguided. But how do you approach this in some sort of repeatable, verifiable way? By that I mean, how do you even know what to focus on and what your threats are? They definitely vary by company and industry and other factors. How do you think about it?

Andy Steingruebl:
It's funny, just on that note, by the way: for movies, I hope Eli Sugarman is successful in coming up with new imagery. I think he's got a program out there willing to fund people to come up with, you know, instead of hackers in hoodies and so on, some new iconography for what computer security is. My best wishes to that project. I don't really wear hoodies. So, you know, there are frameworks; the NIST Cybersecurity Framework is a decent start for walking through and trying to do some sort of assessment for yourself. People talk about risk assessment, but I tend to think about it in terms of capabilities more often, because figuring out underlying risks is really hard. But then, in a given business, I think it comes back to some sort of education and dialogue with your executives, if you're a CISO, and with your partners across whatever business unit you're in. I'm in our engineering organization at Pinterest, but, you know, talk to your legal team and to the finance team and others. And you walk through some scenarios and say, if this happened, what would our reaction be? Would we say, hey, that's absolutely not allowable, under no circumstances could we ever have that happen? Or is this a situation where, if that happened, we'd say, hey, we tried to make the product more usable for our users? Or pick a nonsecurity domain.

Andy Steingruebl:
It's like, you're going to move fast and create all sorts of new features, and that comes with some risk that you're going to cause a bug, some sort of outage. You're balancing between those two goods: hey, we want to move fast and create new features that we like and our users like, but that also comes with the risk of causing some sort of outage. Users would like cool new features, and they'd like the service to be up, and maybe you don't get it perfect every time; you try to balance those two competing goods. I think security is similar. You get some hard must-do things, and you can look at common attacks: if you're an Internet business that has IPs and other stuff exposed to the Internet, there's some amount of attacks that are going to hit you every single day from, sort of, the background radiation of the Internet, if you will. If you're not secure against those, if you're running old, unpatched software, people are going to find out; there's lots of that. And so there's a certain set of basic practices that I think you can assume you have to do, because they're practically guaranteed to result in you getting owned if you don't do them.

Andy Steingruebl:
And then beyond that, it's: who do we think our attackers are? What do they want? You know, look at my last job. We had billions and billions and billions of dollars, lots of people's money and our own money, and moving money around attracts a certain kind of attacker. That's not my current role; we're not moving money around and doing payments, so we're not attracting those same sorts of attackers. So there's some generic Internet adversary, and then there's who's actually interested in us: what do they want, how would they monetize it? You think about all of those and run through scenarios. Like I said, the Cybersecurity Framework makes a good start, to think comprehensively about all the things you should be doing, all the different domains of security that may not be top of mind. Maybe you don't think that much about internal access control, or maybe you don't think about things that are easy to forget, like device destruction, or whether you're doing full-disk encryption on all your laptops. Those are pretty common scenarios that you can easily forget about. So having a good comprehensive guide like that is a good practice to walk through.

Michael Coates:
Yeah, the frameworks are a really nice starting spot, I agree completely. One interesting addition to them I've found is an executive tour, where we discuss with them, what would be the disaster scenario in your mind? What would happen if? It draws some really interesting links between the technical controls we know we need to have and the business processes, and how they map together. I've always found it to be really good firepower: you know, I really wanted to upgrade the security of this system, design, et cetera, and maybe you don't have as much firepower for that on its own. But then you say, and by the way, it ties to these core critical systems that we've determined are must-not-happen events.

Andy Steingruebl:
Yeah, yeah. I mean, that was literally my first month on the job, and on an ongoing basis: doing those interviews and saying, what keeps you up at night? What are the things you're worried about, either scenarios, or things where you don't know enough about how well we're doing? And sometimes, like I said, my job is just education, and you pick your spots. You mentioned things in the news; I use those as an opportunity to have a conversation with my colleagues. When something happens in the news, you write up a summary on, could that have impacted us? Are we exposed to that, or are we not? Either of those is actually OK. Either, here's an opportunity for us to go and improve, or, alternately, that wouldn't have impacted us, and here's why; it doesn't happen by accident, we do the following things, and now that's not an issue for us. And that's great news: you may have been wondering if this was an issue for us, and it turns out it's not, which is also a great story to be able to tell.

Michael Coates:
So, yeah, as security people we often think about the negatives and the potential bad things. Give yourself credit when you do things well. Just like you said: all the work we did made us not vulnerable to that headline, so our name is not going to be up there.

Andy Steingruebl:
And you don't get the chance to talk about that very often. We had an incident recently where somebody made a mistake and accidentally tried to shut down a whole bunch of systems while they were doing some maintenance. And we had put in place security controls so that they didn't have the permission to shut those things down. So it was a nice opportunity to say, hey, look, that's a big win: because we put in some security controls, we avoided an incident. It was just a mistake, an accident, but it could have been really bad. Because we did the right thing and put in some good controls there, we didn't have that incident. You don't get that chance very often, so tell it when you can.

Michael Coates:
Fantastic. So, Andy, one of the things I always like to end on in these discussions is a little bit of feedback to the next potential generation of security practitioners, maybe even CISOs. What would you say to a student who is thinking about entering the field of security, or someone who's in it and maybe wants to be a CISO one day?

Andy Steingruebl:
I get this question a lot. I would say, for the more junior person: be excellent at something. Have a really strong foundation in something. Be a really good software engineer, be a really good system administrator. Understand something really deeply and really well in one technical domain, not even purely security, because those are some of the foundational skills that are useful to bootstrap on top of. They give you a conceptual underpinning for understanding things. Of course, I say that as someone with a liberal arts degree, proudly from the University of Chicago, and that's all about this common core foundation that everybody then builds on top of. So maybe they trained me too well. But have one or two things you feel you're really an expert in, that you're really, really solid in, because that's a point of strength, something you can leverage to learn other things and conceptualize them. That's my first bit of advice for how to get started and how to build yourself up. And I think, as every CISO and senior security person I talk to relates: at some point in your career, you start pivoting from many things being a technical problem to being an organizational and people problem, or at least something with more of those elements to it.

Andy Steingruebl:
It becomes coordinating multiple people to work on something. The problem is bigger than you can solve all by yourself. It's not a technical thing where you just need this one person to go do this one thing. It's about understanding organizations and dynamics and systems and how people work together. How do you motivate people, and so on. If you find that you're in the purely technical space, some of the best advice I ever got (it was actually given to me to pass on to somebody else) was: as engineers, we're often really concerned about being right. And the advice was, you have to decide whether you want to be right or whether you want to be effective, because they're not always the same thing. Having your idea be the right one is not the same as solving the problem and getting stuff done. And that involves working with other people and collaborating. So at some point, if you want to be a leader and advance in that way, it becomes about learning how to work with other people effectively to solve problems.

Michael Coates:
Very well said, very well said. Well, Andy, thank you so much for your time today. We'll go ahead and leave it there. I know we could talk hours upon hours on security and all these things, but this is great. Really appreciate your time and sharing your insights with the world. And, you know, good luck out there, everyone. It's a wild world. It's a wild place.

Andy Steingruebl:
Next time we'll all actually meet at Big Basin instead of on video or something, you know, when we can get back to that.

Michael Coates:
That would be great. Thanks everyone!


Subscribe for More

Get notified of future CISO webcasts and other exciting security content.


Ready to get your Cloud Security in Check?

Fill in some contact info below or schedule a meeting so we can reach out to provide more details on how Altitude Networks can protect you from data loss in the cloud.
