
Exploring AI in RevOps: From Parlor Tricks to Real Application

Mark Lerner:

Okay, Nico, welcome to the RevAmp podcast. Glad to have you.

Nico Lafakis:

Glad to be here.

Mark Lerner:

My name is Mark Lerner, I’m the host of the RevAmp podcast and today I’m really excited to have Nico join us. Nico, why don’t you tell us a little bit about yourself and your background?

Nico Lafakis:

Well, oddly enough, my background is a long time in graphic design, and then I shifted gears about four and a half, five years ago over to doing CRM work and being an admin on the client side. And then last year, almost exactly this time last year (what am I, seven days away from my one-year anniversary with New Breed?), I got picked up by New Breed. I was very, very lucky about that, and I’ve been working B2B since then. All the while I’ve been very much a techie guy and very much an art guy. And behind that, a sort of Sherlock Holmes thing is kind of mine too: I really love to take all the odds and ends and try to find the strings that pull them all together.

Mark Lerner:

Yeah, I think that’s why we get along very well, because you articulated some of my idiosyncrasies in a very interesting way; I’m very similar. And one of the reasons I wanted to have you on the pod was because last week you were a guest on one of the other podcasts that I co-host, and we talked about a topic that, obviously, especially in the last year, almost exactly, has been on everybody’s mind, and everybody has an opinion on it: AI. I wanted to do a deep dive with you on AI from the perspective of someone in RevOps, someone whose job is tying all those things together. I’m sure there are people out there watching and listening who are pro-AI and already using it, and others who may be very against it and not using it at all. I want to explore all of that with you. So maybe we can take a step back and talk a little bit about how you first came across AI and what you use it for today.

Nico Lafakis:

First came across it? Well, that’s difficult, because most everybody I know associates AI with GPT and with other generative pre-trained models. Realistically, AI has been around for a very, very long time; the first serious rendition of it was Siri. I’ve been into it since Blade Runner, The Lawnmower Man, you name it, old-school eighties stuff; I’ve seen Dune, all of that. Then seeing it come to fruition through Siri, that first little semblance of an AI assistant, is really what sparked things going forward: okay, where are we going to go from here? And then came the Google Home and the Amazon Alexa and that kind of voice-activated stuff, and then finally the smart plugs and being able to do that kind of thing.

And one of the coolest scenes, I don’t know why I keep coming back to it, but one of the coolest scenes in my mind was the scene from The Running Man when the woman Arnold is working with first comes into her apartment, and when she walks in she’s like, lights, coffee, toast, TV, channel 12, and everything just goes on its own and does what it’s supposed to do. It blew my mind back then. I was like, oh my God, it would be so awesome to just walk into your house and talk to your house. And then we see Iron Man and of course you get to see Jarvis and what that’s like, and comic books taking that to a whole other level. And then last year happened, and now we’re really starting to explore and expand this thing, being able to apply it to what I do every day, being able to apply it to RevOps. It’s just one more step further, really. I can’t wait to see what’s going to happen next year.

Mark Lerner:

And I think when you’re referring to last year, you’re referring to the nuclear bomb drop of ChatGPT onto the world, which to me is one of those moments where there’s a before and an after; everything changed. I think popular culture and science fiction were kind of the leading indicator of where these things go. People were watching Star Trek however many decades ago, and some of that stuff is happening: the holodeck, and you look at the Apple goggles they’re talking about. There’s a lot of crossover there. I think a lot of the apprehension that exists is that, along with all those very positive images, there are a lot of dystopian concerns and stories that have been told about the potential downside, and not only the fears of it becoming sentient. I’m sure there’s considerable concern among people, RevOps people as well, with regard to their jobs and their livelihoods and becoming redundant.

I’d love to talk through some of the actionable and real ways that people can be using these tools, because I think the first people that really jumped onto it, and kind of beat it to death, which we tend to do, were marketers, right? Marketers saw it as a way to 10x their writing, even if it’s generic. A lot of us, me in particular, can now tell within a millisecond when something was written by ChatGPT, and I think there are still use cases there. But I’m interested to hear, whether from your interaction with clients and customers or from the folks behind the scenes on your team who are using it, what are some of the interesting ways you see these kinds of technologies being used?

Nico Lafakis:

I would say probably the most interesting way that I’m seeing it used is drawing cross-comparisons: being able to look at a cross-comparison across a huge dataset that you otherwise wouldn’t be able to. It’s not necessarily the same as reporting. It’s very similar to what ChatSpot is working to accomplish, and that is getting there. I will say the AI features within HubSpot are getting there too; I knew this was going to be the case, so I’m very glad. I knew that early on it was going to be rushed, that they were just going to get in there to get in the game, and what they did wasn’t the greatest attempt at it, but it was pretty good. And now those tools are really getting refined. So take things that I used to do myself, like using GPT for workflow descriptions or for report descriptions.

Now that’s built in, and again, I still was using GPT after it was built in, because the first iteration of it was a little clunky, but now it’s actually very, very accurate. Going further than that, if you understand how to work with the models, specifically how to generalize your data and how to essentially mask it so that it doesn’t carry any identity, then you’re really leveraging some power. Because now you have absolutely anonymous data that just holds columns of values, and you can put random column headers on it. I mean, I’ll take a file and call it “jackknife,” and then the column headers are “zipper,” A, B, C, D, E, F, whatever, so nothing makes any sense. It just has the numerical values and date values and things of that nature, job titles, that kind of stuff. We can push all of that into a large language model, and out come my buyer personas: based on revenue, based on job title, based on industry. I can also pull up the top 10 job titles that earn me the most money. A lot of that you could probably do with reporting, but then you can say, okay, of those top 10 job titles that earn the most money, what is their purchase history, and what are they most likely to purchase next?

You can probably report on that as well, but think about the amount of time it takes you. That’s what we’re talking about with these tools: the time it would take you to build out the separate reports, where here you can query basically ten reports’ worth of data in one prompt. It would take you up to an hour or so to build all that out, and then you’d still have to apply it to what you’re doing. So knowing that you have that kind of data in there, you can then query it: what lifecycle stages make the most sense? How should I structure my lifecycle stages given the type of data I’ve got in here? Assuming these were all individual contacts, individual people, how would we structure this? Assuming the business was this type, how would we structure the buyer’s journey knowing all of this information already? What would lead scoring look like? Because we also have the lifecycle stages in there, which again don’t mean anything in terms of the randomized data.

And again, you randomize what things are: if you have lifecycle stages, you change “customer” to “dog,” so it’s just a bunch of dogs, and you change “lead” to “cat,” so there are a bunch of cats, right? You have to understand how to randomize your data and strip out anything that could identify it. Once you do that, mix it with whatever you want, play it against any playbook you’ve got, use it against any of your best practices, and now it spits out a template that you can use to work with your clients.
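To make the masking idea above concrete, here is a minimal sketch in Python, assuming a hypothetical CSV export; the file name, column names, and the dog/cat recoding are illustrative stand-ins, not Nico’s actual setup.

```python
# Minimal sketch of the masking approach described above: strip identifying
# headers and values from a CRM export before pasting it into an LLM prompt.
# File name, column names, and the "dog"/"cat" codes are hypothetical.
import pandas as pd

df = pd.read_csv("crm_export.csv")  # hypothetical export

# 1. Drop anything that identifies a person or company outright.
df = df.drop(columns=["email", "first_name", "last_name", "company_name"], errors="ignore")

# 2. Replace real column headers with meaningless labels ("jackknife"-style).
masked_headers = {old: f"col_{i}" for i, old in enumerate(df.columns)}
df = df.rename(columns=masked_headers)

# 3. Recode categorical values, e.g. lifecycle stages become animals.
stage_col = masked_headers.get("lifecycle_stage")
if stage_col:
    df[stage_col] = df[stage_col].map({"customer": "dog", "lead": "cat"}).fillna("other")

# Keep the mapping locally so you can translate the model's answer back.
print(masked_headers)

# 4. Build the kind of prompt described above, over the anonymized rows only.
prompt = (
    "Here is an anonymized dataset (CSV):\n"
    + df.to_csv(index=False)
    + "\nWhich ten values in col_3 are associated with the highest col_7 totals, "
      "and what buyer-persona groupings would you suggest?"
)
```

The mapping dictionary stays on your machine, so you can translate the model’s answer back into real field names without the model ever seeing them.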

Mark Lerner:

Super interesting. And I think the initial experience that some people might’ve had with ChatGPT, and still do, is to assume it’s a parlor trick, right? Ooh, you can write a bedtime story for your kids, which I do; it’s fun. With my kids, we’ve somehow gotten into a routine where, after we read a story, I’m expected to come up with an extemporaneous new one, this imaginary story. So for a while we were doing it with ChatGPT, but then I was just getting frustrated with...

Nico Lafakis:

So, you want me to build you a storyteller? Is that what you’re saying?

Mark Lerner:

Maybe. And it’s a good transition into what I’m going for. I think a lot of us had these ideas of having our own personal assistant that could do a very specific type of work, and a lot of people were trying to figure out how to get ChatGPT to do that, or they were doing fine-tuning and all these things, which is a little bit more difficult. Fast forward to the last month: custom GPTs were introduced, and I know you have a project that you’ve put together for a very specific use case, formerly known as HubSpot Harry; I can’t quite remember what we call it now. I’d love to hear a little bit about the background of that, how you came to decide to do it, and maybe some of the tips and tricks you’ve learned along the way.

Nico Lafakis:

Honestly, let’s just call it what it was: a personal sort of competition. I was very inspired and impressed when Dharmesh posted on LinkedIn, almost randomly, hey, I’m working on this AI bot and I’m going to put this thing into HubSpot. I was like, oh my God, that’s so sick; I only wish I knew enough about this stuff, that I were a computer scientist or whatever, to be able to do that. And then came a few different add-ons. One of them, I can’t remember the name off the top of my head, but it was a really sick add-on to bring GPT into workflows within HubSpot, specifically to be able to use those workflows to pull LinkedIn data into HubSpot. So it would be a workaround from having to use some third-party service like ZoomInfo or whatever. And I was like, oh man, that’s so cool.

You can actually use GPT in HubSpot. And then the next step was GPT for Sheets. Oh my gosh, that’s so cool, we can run GPT on sheet data now; I don’t even have to pull the data in, I can bring GPT into the sheet. And it’s using my API key, so it’s keeping things more and more private, as opposed to leaving it to public access. I only talk about this in terms of Plus and paid subscriptions; I don’t talk about the free stuff, I don’t know what goes on there, I don’t care. So if you have Plus, then you also should have applied for API access, and if you have API access, then you also have access to Assistants, as well as the ability, with your API extensions or wherever you’re using it, to add additional privacy, to lock things down, and to keep that history blocked so that you’re not training their model with your data.

So that kind of thing became more interesting, and then obviously the news in and of itself was more interesting, and then DevDay. When DevDay happened and I heard about custom GPTs, I was sitting there just refreshing my screen, and he’s like, oh, it’s dropping today, and I’m like, refresh, refresh, refresh. I didn’t get it until maybe four or five days later. Thankfully, I think I used GPT enough that I did get the add-ons really, really quickly. And I have just been in custom GPTs pretty much ever since; I very rarely use regular ChatGPT at this point. I use that almost like Google: if I need some generic information that I need to do a search on or something like that, then I’ll use regular ChatGPT, otherwise I’m using one of my custom ones. And that was a transition from custom instructions; there was also that in-between step where we were able to fine-tune the conversation.

So now you can fine-tune the model as well as the conversation, as well as the custom instructions, as well as the knowledge base. And on top of that, you can add external actions through Zapier or whatever third-party service you’re able to grab them from. Now we’re talking turkey. We’re really starting to build a model that, again, has those restraints to it, constraints, I should say, that allow you to prevent the data from being stored historically or shared with anything. And yeah, it’s addictive at this point. Now it’s just a matter of, okay, what can I build? See what the limitations are, push the boundaries.

Mark Lerner:

Yeah, I’m very similar. You mentioned that Sherlock Holmes thing of yours; I’ll find myself finding a new toy or whatever, and next thing I know it’s two in the morning and I’m still sitting at my desk just trying things. So one of the things that happened recently was that I found a kindred spirit, because not many people get excited about these things. The other night, as one does when they can’t sleep, I was looking at the trending repositories on GitHub, and the number one at that time was somebody who had been able to hack, I dunno if hack’s the right word, the custom instructions for a lot of these custom GPTs, including the ones created by OpenAI themselves. And I think you immediately understood the significance of this; I’m not sure other people would have. But this is reverse engineering those behind-the-scenes tricks.

Because when you’re instructing a model, whether it’s for analysis of data, or connecting to an API, or just writing a sales email, it’s not like writing code for software, which is binary: you put this function in and you get this result out. With an AI there’s so much more variation, and you have no real idea what it is that’s changing the outcome; there are just a lot of theories out there. Being able to look at what OpenAI had done in their own models and reverse-engineer that is a super interesting way of, first, understanding the inner workings, but also being able to utilize it to make these custom GPTs. I know you got super excited about it. I’m wondering if you were able to utilize it in any interesting ways.

Nico Lafakis:

It’s funny, because you were talking about staying up late working on this stuff, and yeah, I was up until 12:30 last night cleaning data. I find myself doing that. It’s kind of strange, but it’s awesome at the same time, because you find yourself doing this new tech thing that you likely would’ve never touched before. Even my wife was saying the same thing; she’s like, if I didn’t know you, if we weren’t married, I would have no idea that any of this stuff was going on, I’d be completely oblivious to it. And I was like, it’s all right, there are a lot of people that way. But what Mark is trying to severely downplay is that he found the mother of all keys, right? Because if you have been working with custom GPTs and you’ve been working with the instructions, the instructions are gold.

It’s so much more than what custom instructions were for our individual conversations. These instructions are seriously driving the model. And when I say that, I mean we’ve learned that there’s a codified method to it. Initially, when you were working with custom GPTs, you were just working in that conversation and building out a paragraph. I was playing back and forth with that, and it was like, okay, this is kind of cool, but it’s not even remotely as structured as the instructions I used to use. So let me structure them at least. After structuring them, it got a little bit better, and I thought, okay, maybe we can come up with some sort of custom format. So I was just using dashes: this is a list of stuff, dashes, then a colon for the subject matter, and then dashed sub-items under it. And then I came across a custom GPT that searches for other GPTs and used it to see if there was any sort of LLM trainer, because, don’t beat your head against the wall.

Come to terms with the fact that you are not the genius of this kind of stuff, and that’s fine, because all you have to do is search for whatever you’re looking for; some other genius has probably already done it, right? So I found this LLM trainer, and that was the first question I asked: hey, is there a codified method for adding instructions to a custom GPT? Yep, here it is. And it spat out this codified method, with hashes and asterisks and dashes. I was like, alright, that was free. I was close, only about a third of the way there, but that’s better than zero. So I implemented it, and it was a night-and-day difference; it listens to the instructions exactly the way they’re written. And with what Mark found, yeah, there are all these other custom GPT instructions that have been written out. I can’t remember the exact one that I pulled, but it was the most perfect example.

It was DALL·E, I think, something about DALL·E, yeah, I want to say it was DALL·E, because it was complete; it had everything. It didn’t just have the markdown for the initial set of instructions. It also had these written-out definitions, breakdowns of definitions, how examples are supposed to be in there, references to other information and how it’s supposed to reference that information, and the ways in which it’s supposed to understand and reference the different words that get used, which was seriously important. That was something I was going to work on for the Hub Knowledge Expert, but I hadn’t had the time, and I also didn’t necessarily know how to structure it. Then somebody was telling me, hey, I was looking for how to follow a record in HubSpot, and I asked your bot and it didn’t know. And I was like, oh man.

And I was like, are you sure? So I went and checked it myself, and it gave me the answer, but it gave me the answer as if I had searched with the correct terminology, which would’ve been “how do I follow a contact?” or “how do I follow a company?” So in other words, the Hub Knowledge Expert is amazing, but he only knows what is verbatim in his knowledge base. If you don’t use the terminology it expects in your search, he won’t know. And he’s very much constrained not to go online, because doing that would allow him to hallucinate and come up with something; maybe he’d pull some data, because a lot of the time that’s what happens with these HubSpot questions. And that was the main reason behind building this bot: I wanted something ironclad that could answer every single question anyone has ever had about HubSpot, in any way whatsoever.

I want it all, because even ChatSpot doesn’t have, and this is how far I’m going with this thing, ChatSpot doesn’t have the entire knowledge base, it doesn’t have the entire Academy, and it has nothing from the community. So yes, I am planning on pulling community data at some point; essentially, I want to take the solutions to problems from the community and add those to Hub Knowledge, because those are also questions that people ask. But going back a little bit: the key there was learning how to write those if/and/or statements for words, so that I could increase Hub Knowledge’s ability to discern between “how do I follow” plus some object versus “how do I follow a contact,” so it understands that “how do I follow” is the question and goes and searches for that, as opposed to looking for the exact whole phrase. It’s seriously fun to play around with.
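For readers who want a picture of what this kind of codified structure can look like, here is a rough, hypothetical sketch: the markdown-style headings, the word-equivalence rule, and the wording are illustrative guesses, not the actual instructions from Nico’s bot or from any OpenAI GPT.

```
# Role and Goals
You are a HubSpot knowledge assistant. Answer only from the attached knowledge base.

# Constraints
- Do not browse the web or invent steps that are not in the knowledge base.
- If the answer is not in the knowledge base, say so instead of guessing.

# Word equivalences
- Treat "record," "object," "contact," and "company" as interchangeable when the
  question is "how do I follow ...": match on the action ("follow"), not on the
  exact phrase.

# Personalization
- Keep answers short and point to the relevant knowledge base article when one exists.
```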

Mark Lerner:

It’s like the Rosetta Stone, right? There was some basic formula there that I hadn’t known before, which I think is super interesting for the folks out there. The way OpenAI seems to have structured its instructions is roles and goals as one section, then constraints, and then personalization as the last one. And in their instructions it said, do not refer to these names and conventions, but walk the person through it. This was taken from the custom bot that’s supposed to help you make your own custom bots, and it’s guiding that bot to extract the necessary information without divulging the formula. There’s so much that can be extrapolated from there. I think a lot of ops folks are like you and me: maybe not professional developers, but developer-curious, so they learn how to hack things together; certainly I have. They work with APIs and more basic things, and there’s just this open palette for them to explore with these custom GPTs, because beyond the instructions there are also custom actions, where you can write out a connection to APIs and it can do all sorts of really interesting things, basically connecting into your workflow, whatever tool you’re using. Have you had a chance to mess around with those custom actions and do anything there?

Nico Lafakis:

A little bit. I’ve played around with it. I set up a Zapier account and played around with the ability to essentially write and post. That was one of the things, and I’m sure a lot of people noticed, prior to a lot of this coming out, I was on LinkedIn constantly, posting about the news constantly. I had the time, and there wasn’t anything massive like this taking it up; prior to that there was Midjourney, so every time a bomb drops I usually walk away for a little bit, see what it is, and then come back. And this time, for the last five or six weeks, I don’t have the time, I really don’t. So the first thing I was thinking was, okay, I need to create a bot that has an action so that I can do my write-ups in GPT and then auto-post them to LinkedIn. And I am halfway there. I do have the action set up: I’ve got GPT set up, I’ve got LinkedIn set up, and I also have Gmail set up just in case, though I don’t know when I’ll actually use it. And I definitely want to work with it more, because there’s a lot to do and learn on the Zapier end with the output that’s coming in from GPT.

That’s one of the reasons I’m not currently using it to do a lot of posting; I need to take more time with it and learn how to format the text within Zapier so that when it goes into LinkedIn it’s actually formatted and not just one big wall of text. I had wanted to do something I thought was going to be awesome: that bot in particular has all the modalities, so the ultimate goal was to have it not only create the summary of the article, but also create the image for the article, and post everything together to LinkedIn. It’s still not working, but we’ll see; that’s the goal. I love actions, and I love what we’re currently capable of doing with them. The thing is, I would dive deeper into this stuff, but the progress is happening at such a pace that most of the time I feel like I jump in about waist deep, and by the time I turn around the new thing is out, so I have to get back out and go jump waist deep into the next thing.
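As a concrete illustration of that GPT-to-Zapier-to-LinkedIn plumbing, here is a minimal Python sketch, not Nico’s actual setup: it assumes an OpenAI API key in the environment, a placeholder “Webhooks by Zapier” catch-hook URL whose Zap would map the field into a LinkedIn post, and an assumed model name and field names.

```python
# Sketch of the GPT -> Zapier -> LinkedIn idea described above (illustrative only).
# The webhook URL is a placeholder for a "Catch Hook" trigger in Zapier whose later
# step would create the LinkedIn post; the model name and field names are assumptions.
import requests
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

article_text = open("article.txt").read()  # hypothetical saved article text

summary = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model
    messages=[
        {"role": "system", "content": "Summarize articles as short LinkedIn posts with line breaks."},
        {"role": "user", "content": f"Write a LinkedIn-ready summary of this article:\n\n{article_text}"},
    ],
).choices[0].message.content

# Hand the text to Zapier; the Zap maps "post_text" into the LinkedIn action.
requests.post(
    "https://hooks.zapier.com/hooks/catch/000000/placeholder/",  # placeholder hook URL
    json={"post_text": summary},
    timeout=30,
)
```

Keeping the post body in a single field with explicit newlines, and letting the Zap insert it unchanged, is one way to avoid the wall-of-text formatting problem mentioned above.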

Mark Lerner:

I’m right there with you; I know exactly what you’re talking about. And then there’s already someone else out there who’s figured it out, and you feel like there’s no time to catch up, because it’s always something new.

Nico Lafakis:

In the beginning it was all about prompt engineering and who could create the best prompts. And I feel bad now for so many people who threw away so much money on classes and add-ons and all that other stuff. Initially even I looked at an add-on, but it was free, so I was like, look, this add-on is free, so if you don’t know how to write prompts, use it. But then GPT got the ability to go online. With that, it’s like, alright, I’m just going to give you the URLs: OpenAI’s page on how to write prompts, this page on how to write prompts, this other page on how to write prompts, and then you’re going to start writing prompts for me. That was months ago, so I haven’t written a prompt in months.

I don’t know what that is. If I really needed to, I would just give my idea to GPT and ask it to format it in the most efficient way. That’s all you need to know: take this query and turn it into the most efficient prompt possible. That’s it. That is the only prompt you need, because that’s all you’re trying to do; you’re trying to take your question and turn it into a really efficient prompt so that GPT gives you a really efficient output. And now that it’s connected to the web and its sources are updated to, I think, April of 2023, I’m sure they’re going to upgrade it even further by the end of the year. This thing is probably going to be on six-month rotations in terms of when its information gets updated, and given that, it definitely knows how to create prompts on its own without even needing to go search for it. On top of that, now we have custom GPTs; just go on Google and search for a custom GPT that writes prompts, boom. So prompt engineering is dead within a year. I don’t understand why that job title should even exist anymore.
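A minimal sketch of that “one prompt you need” pattern, assuming the current OpenAI Python client and an assumed model name: hand the model your raw idea, ask it to rewrite it as an efficient prompt, then run the rewritten prompt.

```python
# Sketch of the "take this query and turn it into the most efficient prompt" pattern.
# The model name is an assumption; any chat-capable model would work the same way.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

raw_idea = "figure out which lifecycle stages make sense for a 30-person B2B SaaS team"

# Step 1: let the model rewrite the rough idea into an efficient prompt.
rewritten_prompt = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": f"Take this query and turn it into the most efficient prompt possible:\n{raw_idea}",
    }],
).choices[0].message.content

# Step 2: run the rewritten prompt to get the actual answer.
answer = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": rewritten_prompt}],
).choices[0].message.content

print(answer)
```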

Mark Lerner:

It was one of those sexy terms that people thought was really interesting, and a lot of the course industrial complex, as I’ll call it, glommed onto it; before this they were selling crypto courses, and then they were selling these courses. There are a lot of fads, it’s moving very fast, and you just have to take a step back. So as we wrap things up here, I want to get your view on where this is going. 2024 is coming: really quickly, how do you think the average RevOps person is going to be using these models in their day-to-day workflows?

Nico Lafakis:

Okay, so here’s the thing, and this is what makes it a little strange: everybody thinks that GPT is coming for them. It’s not GPT; you’re looking in the wrong direction. And this is a weird one, but stay with me, folks. Back in the day, when I was first starting to get into serious online gaming, the game I picked, of course, was World of Warcraft. Playing that gave me a sort of roadmap to real life, really, because that game echoes a lot of what happens out here. So, realizing where we’re going in terms of what we’re doing: we’re heading into this sort of isolated space where our assistants are becoming close to best friends for a lot of people already, and I mean that in a nuanced way, but they’re also helping out in terms of our programming.

Well, back in the day, when people were programming add-ons for World of Warcraft, whenever something got really good, World of Warcraft just kind of absorbed it and made it part of the game. If you play the game today, compared to what it was like 15 years ago, everything people worked really, really hard to build as modifications is now a standard part of the game. You almost don’t have to add any modifications, which means the people who used to program that stuff are essentially out of a job. That is really the way you should start looking at HubSpot.

The larger programs always absorb the smaller guys, right? So, you’re watching what GPT is doing. A lot of people talk to me about Jasper.

And I get this question probably once a week: should I get Jasper or should I get GPT? The answer comes down to, well, how good are you at writing? Basically, if you are not good at writing at all, then get Jasper for a little bit, and you’re going to learn what you need to learn.

And then you’ll probably switch over to GPT. If you are good at writing, don’t even think about Jasper, because you’re just buying templated stuff that you already know how to do, and it’s really not going to help you that much. And the sort of CRM-ish, marketing-ish direction they’re trying to go in, I don’t think it’s going to help them stay afloat, but it is what it is.

Mark Lerner:

Yeah, they had a leg up, I think, before ChatGPT; they were one of the early API partners. But once ChatGPT came out, I’ve struggled to understand their competitive advantage and their value proposition. I don’t know, not to cast aspersions on anyone.

I can’t see myself utilizing any other AI besides ChatGPT, except when I reach the maximum, which I do constantly. Then I either use my sidekick on Microsoft Edge, use Anthropic, or just go back to the API.

Nico Lafakis:

I fall back to my API version immediately if I actually max out tokens. And then, looking at how that reflects in HubSpot, right?

That’s why I was saying, don’t look externally. Pay attention to these betas, guys. Pay attention to what’s rolling out. If you didn’t pay attention this past week, guess what rolled out?

The Auto Report Creator. Right? So, you know, now that aspect of what you do is out, right? And the only reason I say watch out is mainly because a lot of people spend too much time on tools and too much time on building, and not enough time on theory and not enough time on best practices.

Right. And execution. So you can be the greatest at knocking out assets all day long, all year long.

That’s who’s going to get replaced, because creating assets is what these bots are going to do. It’s what this AI is going to do. It’s what it’s meant to do. It’s meant to save you time.

And in the world of RevOps, what eats up all of our time is building out the assets that we’ve spent all that time mapping out with our clients. We know what we’re going to have to do; now we just have to do the crazy legwork of building out the same workflows again, the same forms again, the same lists again, tying it all together and all that kind of stuff.

Right. So, no differently than native plugins essentially removing the need to eat up hours and hours of time doing an integration for somebody, so too are you going to have this tool slowly but surely taking over. I mean, there’s already an email assistant, a subject line assistant, a blog assistant, a blog title assistant, right?

Like everywhere you look, that little writing assistant is pretty much everywhere in HubSpot.  

Even the AI metrics are now starting to spread everywhere: over your forecasting, over your goals, pretty much everywhere within your portal. It’s only going to get more and more. So the focus needs to come back to what I was saying earlier this year.

The technical aspect of what you do on an everyday basis, and this is everybody, not just marketing or web dev, that does not matter: since that technical aspect is about to go away, start thinking creatively.

Creativity is the new job skill, because all the hard work is going to go away.

It’s going to be gone in, like, the next five years, easily.