The tech update for the muni industry

Electronic trading and AI are expected to play a more prominent role in the muni market in the next five years and beyond. The municipal market is ripe for growth in these areas, and our panelists will discuss how participants can integrate technology to help create a more efficient marketplace. Additionally, we will spend some time examining electronic financial reporting and the Financial Data Transparency Act, which requires all state and local governments to file electronically.

Transcription:

Abhishek Lodha (00:09):

All right, we're going to jump right into it given time constraints, but firstly, I want to say thank you to all of you for joining us on this tech update. This is my second in a row, and it's good to see there is a cadence at The Bond Buyer and the many conferences I've been to in the last few years where technology is spoken about a lot more frequently than it used to be. I also want to thank my panelists here, and I'm just going to jump into introductions. I'm Abhishek Lodha; my title actually has changed, but I'm now the director of FinTech strategy at Assured Guaranty. A lot of my focus is now on building new FinTech ventures for our company and thinking about how we can enable and empower our analysts as well as all market participants with better technology and data. On the panel with me today, and I'm going to go in the order we're sitting, we have Jeffrey Previdi from GASB. We've got William Kim from MuniPro. Then we have Bill Buzaid from Tradeweb Direct and Gregg Bienstock from Solve Fixed Income. I'm going to call you the DIVER guy. Everyone knows Gregg, but why don't we just quickly start with introductions and really start to unpack what aspects and technologies in the market you're working with. So Jeffrey, why don't you kick it off?

Jeffrey Previdi (01:44):

Sure. So I'm the vice chair of the Governmental Accounting Standards Board, or GASB. We are the non-profit entity that sets generally accepted accounting principles for state and local governments in the US. Everybody is probably familiar with our sister board, the Financial Accounting Standards Board, or FASB, which does the same thing for public companies. We do it for state and local governments. You might be wondering why there is a GASB guy on the tech panel, but we do have an interest in technology in a number of ways. One of the ways we'll talk about today is something that we've been monitoring for a while, what we call electronic financial reporting. So we set GAAP standards for governments. We are interested in how GAAP information gets consumed, regardless of how it gets consumed. Traditionally that would've been on paper, then it becomes PDF or something like that. And now we're contemplating various technical ways to consume that information. Our interest is that the information remains relevant and valuable for the users of those financial statements.

William Kim (02:54):

Nice. Thank you for having me. Will Kim, founder and CEO of MuniPro. I've been in the business for 17 years, but we started MuniPro five years ago. We're an AI-enabled data and productivity platform. So we take all the PDFs and stuff that Jeff was talking about, extract all that information and have it completely usable for market participants, both on the sell side, investment banks, as well as FAs and issuers. So thanks for having me.

Bill Buzaid (03:21):

Hi, I'm Bill Buzaid. I run buy-side sales for Tradeweb Direct, which is focused on the retail and municipal fixed income markets, specifically on the trading side. And in addition to that, we've moved into the data space, getting into evaluated pricing. So that's my focus.

Gregg Bienstock (03:40):

Hi, and I'm Gregg Bienstock. As mentioned, our company was, or still is, Lumesis, known for the DIVER platform. We were acquired by Solve Fixed Income. Solve is a great data provider in the broader fixed income space. Our DIVER platform really serves the muni continuum, starting with debt analysis, debt maps, debt service calculations, our new issue pricing platform, the 15c2-12 analysis and related work that we're doing in that space, our secondary pricing tool, which is different than evaluated pricing, not at all in that category, our muni trade ticker, and then retail time-of-trade disclosure. So we're serving a broad base of constituents across the muni spectrum, as we have over the years, and we continue to do that. And then we'll be augmenting a lot of what we do with Solve quotes, and they'll actually be leveraging our technology as well in other asset classes.

Abhishek Lodha (04:38):

Great, thank you. So the way, at least, I think about technology is you've got the data component and the analytics and workflow component, and as you can see, we have a really diverse panel focused on very different aspects of our market. So let's start with this discussion and topic around data. It's an age-old problem. We keep talking about how fragmented our market is in terms of data needs, and any sort of analytics that you guys work on needs to draw on data which is systemized. And this is a National Outlook conference, so let's talk about what your outlook is. Where is data headed? What are you seeing in terms of availability and accessibility, and how do you think about it? So Gregg, maybe we start with you again.

Gregg Bienstock (05:23):

Sure, thanks. So I guess the first point, the good news, is that compared to when we started our company 13 years ago, there's a heck of a lot more data out there. The challenge is making data into information, and this is something we all struggle with. The more data that you have, it's supposed to make us more efficient, but if it's not usable, actionable information, it's not going to. So when we've looked at this problem over the years, we looked at the idea of starting with making sure that for our sector of the universe, which is important, there's an obligor database, and the obligor database is key because you have the ability then to map every issue, every issuer, every bond to an obligor, and then you also have the ability to geolocate, which allows you to attach economic and demographic data to that as well. So the idea, and my partner in crime Tim Stevens hates when I say this, is essentially that you create a bathtub, but you've got to keep the bathtub clean so that the data can all be used and talk to each other. So I think that's really critical, especially as we have more and more data available to us. And then it's basically taking that information and being able to utilize it either through your own proprietary software or in partnership with other organizations. So for example, we have an integration into Tradeweb. We talk to other providers about how we can integrate and create for all of you, ultimately, a single type of interface where you're able to access this data in the most useful fashion to each of you. We all have too many screens, so it's about limiting screens there.
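To make the obligor-database idea concrete, here is a minimal Python sketch of the kind of mapping Gregg describes, where every bond ties back to an obligor and the obligor carries a geolocation key for joining economic and demographic data. The classes, field names and sample data are hypothetical illustrations, not Solve's or Lumesis's actual schema.

```python
# A minimal sketch (not Solve's schema) of an obligor-keyed data model:
# every bond maps back to an obligor, and the obligor carries a geolocation
# key that lets you join in economic and demographic data.
from dataclasses import dataclass

@dataclass
class Obligor:
    obligor_id: str
    name: str
    state: str
    county_fips: str          # hypothetical geolocation key for census/economic joins

@dataclass
class Bond:
    cusip: str
    issuer: str
    obligor_id: str           # the critical link back to the obligor

def enrich(bond: Bond, obligors: dict, econ_by_county: dict) -> dict:
    """Attach obligor identity and county-level economic data to one bond."""
    ob = obligors[bond.obligor_id]
    return {
        "cusip": bond.cusip,
        "issuer": bond.issuer,
        "obligor": ob.name,
        "state": ob.state,
        **econ_by_county.get(ob.county_fips, {}),   # e.g. unemployment, median income
    }

obligors = {"OB-1": Obligor("OB-1", "Anytown Water Authority", "TX", "48001")}
econ = {"48001": {"unemployment_pct": 3.9, "median_income": 61000}}
print(enrich(Bond("123456AB7", "Anytown WA Series 2021A", "OB-1"), obligors, econ))
```

The point of the sketch is the join path: bond to obligor to geography, which is what lets economic data "talk to" bond-level data in the bathtub analogy.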

Bill Buzaid (07:00):

I think that's a great point. And fragmentation has historically been a huge problem, so right now it's all about aggregation. In the secondary markets, participants across the spectrum, whether you're talking about broker dealers, asset managers or the retail wealth management community, are looking to aggregate liquidity and aggregate pricing. And the challenge there is that not every counterparty is open with one another. So one of the interesting things that we're seeing, with interest across the board on both the buy and sell side of the equation, is being able to deliver aggregated market data, live actionable market data, to all of the various constituents that are out there on both an anonymous basis and a disclosed, attributed basis. And that's really the challenge at the end of the day: since not every counterparty is open with one another, you want them to have access to that information. And so we're able to serve as an intermediary and provide that data to them in an anonymous fashion, but they also want to receive that live market liquidity and quotations from their established counterparty relationships in an attributed fashion. They want to be able to look at that in basically a centralized location. And to Gregg's point, there are many different ways to accomplish that. And so that's one of the things that we're working to solve and that a lot of industry participants are looking to solve right now: delivering data aggregation to folks, whether it's via an API into proprietary systems that they have or via a user interface that is delivered by Tradeweb or by any of the other third-party vendors that are out there and available. Will, what do you think?
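As an illustration of the anonymous-versus-attributed aggregation Bill describes, here is a minimal Python sketch that consolidates quotes from many counterparties and keeps dealer attribution only where the viewer has a disclosed relationship. The names, fields and relationship logic are assumptions for illustration, not Tradeweb's implementation.

```python
# A minimal sketch (not Tradeweb's implementation) of aggregating quotes from
# many counterparties into one view, keeping dealer attribution only where the
# viewer has a disclosed relationship and anonymizing everything else.
from dataclasses import dataclass

@dataclass
class Quote:
    cusip: str
    side: str        # "bid" or "offer"
    price: float
    dealer: str      # hypothetical counterparty identifier

def aggregate_for_viewer(quotes: list, disclosed_dealers: set) -> list:
    """Return one consolidated quote list tailored to a single viewer."""
    view = []
    for q in quotes:
        view.append({
            "cusip": q.cusip,
            "side": q.side,
            "price": q.price,
            "source": q.dealer if q.dealer in disclosed_dealers else "ANONYMOUS",
        })
    # group by CUSIP and side, with the best (highest) price first in each bucket
    return sorted(view, key=lambda r: (r["cusip"], r["side"], -r["price"]))

quotes = [Quote("123456AB7", "bid", 99.25, "Dealer A"),
          Quote("123456AB7", "bid", 99.40, "Dealer B")]
print(aggregate_for_viewer(quotes, disclosed_dealers={"Dealer A"}))
```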

William Kim (08:50):

So speaking from the primary market and continuing disclosure side, when people talk about data fragmentation, what they're really talking about is, hey, it's stuck in a PDF, I can't really use it, right? I've got to have analysts read it and maybe copy it down. The state of the art of the technology today is that you can use AI-enabled platforms at every step of the process to extract that data, clean it and verify it, so that you don't have this problem anymore of garbage in, garbage out. You can have completely vetted data. And the way that happens is using AI, and I use AI as a broad term; we can talk about the technology in detail later. You can take what was a fragmented market of unknown sources and have a fully detailed, verified set of data, whether it's a debt profile or trading data on a per-credit basis. So you can even go deeper than an obligor, you can go deeper than a single CUSIP, you can do PPUs. We've certainly done that for all of our clients. And so I think there's a new wealth of technologies coming out there that we can use to address this problem.
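The extract-clean-verify workflow Will describes can be pictured with a minimal Python sketch: an upstream AI or OCR step (not shown) is assumed to have produced candidate figures from a PDF, and a deterministic tie-out check flags anything a human must review. The field names and the tie-out rule are hypothetical.

```python
# A minimal sketch of the extract-then-verify step: an upstream AI/OCR engine
# (not shown) produces candidate figures from a PDF; a deterministic check
# then flags anything for human review instead of trusting the extraction.
def verify_debt_profile(extracted: dict) -> list:
    """Return verification failures; an empty list means the profile ties out."""
    issues = []
    by_series = extracted.get("debt_by_series", {})
    total = extracted.get("total_debt_outstanding")
    if total is None:
        issues.append("missing total debt outstanding")
    elif abs(sum(by_series.values()) - total) > 1:      # allow $1 of rounding
        issues.append(
            f"series sum {sum(by_series.values()):,} != reported total {total:,}"
        )
    return issues

extracted = {
    "debt_by_series": {"Series 2019B": 40_000_000, "Series 2021A": 125_000_000},
    "total_debt_outstanding": 165_000_000,
}
print(verify_debt_profile(extracted) or "verified")     # -> verified
```

The verification layer is what turns an 88% or 92% extraction rate into data that can be trusted at scale, which is the point both Will and Gregg make below.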

Jeffrey Previdi (10:02):

So from our perspective, when we think about electronic financial reporting and we talk to various market participants, one reality is that we have a fragmented market. That's why we have fragmented data. We have states with different rules, we have sectors with different norms, we have issuers with different levels of sophistication. That makes the data obviously hard to put together, which is why it keeps guys like William in business. But we also find when we talk to people that the data gathering aspect of it, the collecting of it, particularly as it relates to financial statements, for example, is still pretty manual, because technology-based solutions have been tried, continue to be tried, and are maybe getting more successful, but because of that fragmentation and because of those differences, it's been hard to put together data in a comprehensive way. So that's where our interest is, in seeing incremental gains in that. But it's been slow in our market.

Gregg Bienstock (11:00):

So I think William just brought up a good point. When we start to talk about AI, and I know we're supposed to talk about that a little bit later, but you just blew up my agenda. I think we had agreed we're not going to get too deep in here and bore you guys to death, but AI is thrown around, machine learning is thrown around, as though it's this panacea and everything's going to be wonderful. At the end of the day, it's technology that's still learning. And some of the challenges that we face as an industry, and we'll get into the Financial Data Transparency Act, are locked PDFs and trying to pull information from PDFs, and then what tools are out there to be able to extract the data from PDFs. I'll go back a few years ago when we started looking at that: it was like, okay, it's 88% effective, it's 92% effective, it's 95% effective, but you have to do this verification. You still have to make sure it's right at the end of the day, because if you miss a zero, it's a big freaking deal. And so when I think about what AI is, I think about how we use it in our broader organization now. It's training models, and the amount of data that's in the marketplace, and the volatility in the marketplace, is actually really beneficial to training models. And so just as I think about AI and the different uses for it and how it gets thrown around, there are plenty of really great aspects to it. But Will, I think your point was spot on: you can use it, but then there's a verification component that's highly critical.

William Kim (12:34):

I think that's the case. And the benefit of technology is getting from that 88 or 92% to 99%. And so what's the benefit of AI to the end user? No one really cares how it comes about. They care that it's right. And the benefit is scale. Instead of spending analyst time going through all these PDFs, you can do the entire universe of all PDFs in a day. And that's what AI affords us. It gives us that scale that we can leverage to address the whole market. And that just wasn't possible before; you could have hired a million analysts, and it would never have happened. And I don't want to blow up the AI agenda.

Gregg Bienstock (13:14):

No, no, that's fine, I'm sorry.

William Kim (13:15):

I'll segue into maybe the legislation that has happened, which has forced a certain group of muni market participants to do a ton of work. We talked about the FDTA, or Jeff mentioned the FDTA. I think all of that can in theory be done with technology. However, it'll take careful consideration when putting that into place, because I think technology is great, but it cannot eliminate all of the liability questions around regulatory requirements. And so that's, I think, something we need to address.

Abhishek Lodha (13:49):

And before jumping on to FDTA, because it's obviously the topic of the month, coming back to the role of technology, and you guys brought that up, I've spent six years educating people on the role of technology. It's not a replacer, it's an enabler. And that's how we as an industry should be thinking about it, where we have 50,000 issuers versus 5,000 in the corporate world. And just forget about fund accounting and the number of funds you have as an issuer from a credit analysis standpoint. There are a million CUSIPs out there; by the way, 1% of that is liquid, maybe more now, 11% really traded electronically in the market. So a lot of the role of technology is how we can enhance and enable our analysts and our market participants to go to that last 20% and focus out there, instead of doing all those mechanical steps in our world. And I have my wonderful underwriters here at Assured; we look at 2,500 new deals every year. And out of that, a huge chunk of the revenue and premium, as in financial guarantee, comes from a very small number of deals. So we are providing market access to a huge number of these small issuers. And we don't want our analysts to be data wranglers. They should be analysts. That's what their job is. So how can we do a better job of that? And this is something that Gregg brought up, right? It's an availability question when it comes to data, but there's also an accessibility question. Data is now a lot more available than it was five, 10 years ago. And now with the right tools in place that have democratized that data, aggregating and stitching all these different types of data sets has become a lot easier for people. So again, it's incremental improvement as we do it, but it's about enabling analysts, or any sort of market participant, to just be able to do their job better or look at a broader spectrum of the market than what they're currently used to. And I appreciate demystifying AI as well, because I feel like that's another place where, with ChatGPT making a big splash, I ran a report saying, give me a credit report on the City of Chicago. And it gave me a nice fancy report, which I was showing a few people around, so it has some capabilities. But to that point, again, with the fragmentation that we have, AI is not there yet to be able to replace humans. I think it's a great enabler. Again, it'll help source the right pieces of information, because you're talking about 300-page CAFRs and OSs that an analyst has to go through. How can we make that process a lot better than it is today? So that's very helpful. So jumping on to the Financial Data Transparency Act, it's making a big splash. So again, I'm going to go with that same word, demystification. So Jeffrey, I'd love to hear your thoughts on what it means. What are the implications for the market? What does the timeline look like?

Jeffrey Previdi (16:50):

That's a tall order. So the Financial Data Transparency Act seems to be the federal government's response to this transparency issue. I've been charged with giving the two-minute overview for those not terribly familiar. This is legislation passed by Congress and signed by the president late last year. It applies to eight financial regulators altogether, among them the FDIC, the Comptroller of the Currency, the Federal Reserve and the SEC. And these regulators are to create data standards for what are termed collections of information that are submitted to them under their jurisdiction. So it's not about new information, it's about how the information is to be submitted to them. These agencies then are, within two years, to promulgate joint rules, so joint rulemaking among all the agencies, about meeting the specifications outlined in the act. Among those specifications are rendering data fully searchable and machine readable to the extent possible. That joint rulemaking will be followed by regulator-specific rulemaking, so each of these regulators will go off and do their own rulemaking. The regulator for the municipal securities market is the SEC. Those joint rules are supposed to be adopted two years after the act was passed, so December 2024. Regulator-specific rules are to be effective two years after that, except for the muni market, where the regulator-specific rules only have to be issued two years after the joint rules. That allows the SEC more flexibility for implementation in the muni market. And interestingly, the SEC is required under the law to consult with market participants in establishing these data standards. It's the only section of the act where an entity is required to consult with market participants. So when we think about this, I have to step back and say, okay, what is GASB's interest? Financial statements. Remember, collections of information can be a whole lot broader than that. That's anything that's submitted to the MSRB, which of course includes financial statements, but a whole lot else. But in thinking about financial statements, our interest is making sure that there's nothing that's going to dilute the relevance of GAAP. Whatever data standards get created, GAAP as we see it, as it is done today, is not going to be weakened by this exercise of creating data standards. And additionally, we stand prepared to work on a constructive basis to assist in the implementation of the act. It's worth noting that our sister board, FASB, is the actual maintainer of the data standards on the corporate securities side. They oversee the taxonomies, they put out an update of the taxonomies each year, and they are the maintainer of that. So GASB stands ready to contribute however it's seen fit in our market.

Gregg Bienstock (19:47):

I was going to say, we're going to be hearing about this for a long time. So as Jeff just said, it's four years at least, as it's written right now, before the rules will be completed, and then there'll be an implementation period, as always. And for those that aren't aware how this made its way through Congress, it was added onto the defense spending authorization act. Really fits right there, but Congress got it through. I do think conceptually this has been talked about in our marketplace for a long time, the inability for folks to access information, the compatibility or lack thereof of data. Jeff, you just raised a really good point around the taxonomy, right? Whatever's done in the first two years, right before the actual rules have to be made, the fact of the matter is that absent a taxonomy that market participants agree on, and this is where the participation of everyone becomes critical, it's going to be incredibly challenging to do this. Putting aside the ability or inability to extract data, I think that's actually less complicated. At the end of the day, it can be done. So I worry a little bit about that: is it a taxonomy overall, is it a taxonomy on a sectoral basis? And then we all have to agree on what the sectors are. We map to 27 sectors. One of our competitors has 42. It's how deep do we go, and how does this information really become that much more useful to us? I guess the other concern that I have here is, on the good side, they want it fully searchable, they want it machine readable. Awesome. They want to create data standards. Great. That's what we need. And then to do this, it's also referenced in what was put out there that there has to be use consistent with accounting or reporting principles. And so we now start to get into the challenge here. As we know, in our world there's something called the Tower Amendment, where you have no underlying regulatory requirements for data submission, which is what we have in the muni world, and there's no requirement for adherence to a specific set of accounting or reporting principles. We have this issue of reconciliation with the Tower Amendment, and that's real. And so there is a huge amount to unpack here. I am excited. This is weird, right? I'm excited about this, but I'm excited for the next few years to see how this unfolds. You are too, right? Yeah. So I think the other thing that I'll just throw out, and then I'm going to be quiet and pass this back to my colleagues here, is something that's a little concerning. And this is not touting any organizations in particular, but I think the most innovative things that have come to our space over the past 15 years have been from the private sector. I think EMMA is good. I think there are issues with it. I have always said I wish we had the budget that the MSRB has for technology; we could do a lot with that, but push that aside. I'm really concerned because what's written here basically says that whatever is developed has to be non-proprietary and it has to be open source. And I think that creates limitations on organizations that have funding constraints in their ability to engage in this space. And we're a relatively small firm. Will, you're a relatively small firm. You guys are a little bigger, and well, you're different. But I think those are some of the challenges that we face.
So I'm excited about it, but at the same time, I think there are lots of challenges, and I think everyone's participation around this is going to be really critical, you all as market participants especially.

Abhishek Lodha (23:42):

Yeah, and I think that's important from a technology standpoint and from a data schema perspective, as you said, right? There's this challenge of taxonomy. And a question for the panelists out here is: we talk about standardization, but then we also want every little detail and flexibility as we're building out these taxonomies, right? That's what I'm reading, at least, when I think about FDTA and what kind of comments are coming out there. It's, oh, we're going to standardize it, but we are going to add flexibility and customization. Doesn't that defeat the purpose? Then how do we swivel between too rigid a methodology and too flexible, and find that middle ground? Is it by certain sectors that you see, or is it just going to be based on certain states?

William Kim (24:34):

I don't think the corporate side has even figured that out, right? Jeff was talking about how FASB has the whole corporate side built out on XBRL and interactive XBRL, and even there, they're conducting studies on how effective this legislation and implementation has been. They can't differentiate between Microsoft and Macy's, right? It just doesn't fit, even in XBRL, when you're comparing against all their different data elements. So for them to expect the muni market, which is tenfold or a hundredfold more fragmented in terms of the small cities and states that make up this country, to be perfect when they can't do it for a Fortune 500 company just seems like a tough task.

Jeffrey Previdi (25:19):

Yeah, I mean, I think it's going to be something that the consumers of this information, the ultimate consumers as market participants, are going to have to think about. And I know when we do financial standard setting, comparability across governments is an important consideration for us. And so when you're thinking about that comparability, are you thinking about it in a world with one taxonomy with a whole lot of lines? Or are you thinking about it in terms of 50 different taxonomies, where there's one for Texas counties and one for Utah school districts? Is that the kind of comparability you want, or do you want something wider than that? Then you run into that issue of customization and allowing customization. The more customization you allow, the harder it is to compare. And again, what's the objective of this information? What do users of it want? I think that's going to be very important to understand before deciding what sort of path to go down.

Abhishek Lodha (26:15):

And in your experience, what have you heard market participants say? What is data when it comes to FDTA?

Gregg Bienstock (26:23):

Jeff, I think you hit on a good point. It's, what's the purpose? And if you boil it down to the simplest thing, if it's just giving us all access, going back to the point that it's fully searchable and it's machine readable, and if that's the end-all, then perhaps that kind of addresses the question of taxonomy. But I don't think that's the end-all, and I don't know where it ends up. And again, this is where, you know, we don't have the taxonomies yet. How do we get the market to accept it? And this is where I just go back to the point I made earlier about participation. Mark Kim was up here earlier and he talked about comment letters. And so whether it's comment letters or participation on committees, it's getting perspectives out there so that, with the people who are ultimately responsible for creating the rules that we're all going to bitch about a few years from now, we've had a chance to put our input in there and express our opinions and thoughts as to what can and can't work and why.

Jeffrey Previdi (27:28):

And I think what we're hearing, I think the reality, is that consumers of this information are having to do it a certain way now. Either they're making the choice to do it themselves or they're going to a vendor to get it, but they're having to make that choice. And the comparisons that they're making are the comparisons they're making. But now that we're all thinking about a uniform, standardized way of doing it, and you want that to have value for you, you're going to have to be part of the participation here so that the decisions can be made: what kind of system are we going to create that will benefit the greatest number of users?

Abhishek Lodha (28:04):

Yeah, it's going to be creating that common language, as you said, right? And I've worked with enough analysts to know that no two analysts look at the same credit the same way, which means how deep do they dig in terms of data? We're talking about structured, financial tabular data, but what about the notes to the financials? What about other information stuck in MD&As? There's so much out there, and we obviously have a lot more data being put on the credit analyst's plate in the form of ESG, which again is a whole different discussion we'll have this afternoon. So I'm not going to go too much into it, but how do we as a market decide on what that common language is and come to agree on it?

Jeffrey Previdi (28:49):

Right, and you raise a good point. When we think about financial statements, the financial report has a lot of pieces to it: there are the financial statements, there are the notes, there's required supplementary information. So when we're thinking about GAAP information, accuracy is number one and completeness is number two. We think GAAP is valuable for the market. And so what is going to be created here? How much of GAAP is going to be pulled in? How much of that is part of this collection of information? We think there should be data standards for all of it, but that's going to make it more complex. So it's these trade-offs that we don't have a whole lot of time as a market to decide, but I think we're going to have to decide and hopefully come to something that will prove valuable.

Abhishek Lodha (29:33):

So switching gears and moving towards other aspects of the market. We spoke a lot about data; I'd love to hear more about workflows and analytics in this space. So I'm going to change around the sequence, but I'd love to start with trading and the secondary market. And Bill, I'd love to get your opinion on how trading and online execution have evolved over the last few years and where they're headed. The last study I remember reading mentioned that in 2022, 11% of our volume was electronically traded. You've probably read the same study. Versus corporates, where it's 41%, up from 19% over five years, while we've kind of stayed static. So where do you see it moving in the next few years, and how has it progressed so far?

Bill Buzaid (30:22):

No, really good question, and a good thing: I'm just as excited about trading as Gregg and Jeff are about financial reporting. So this is a fun topic for me. Listen, the pandemic served as a huge catalyst for the electronification of markets, broadly speaking. It was definitely an inflection point. And even for the secondary muni trading industry, which had lagged rates and credit substantially, we saw it as a way for firms to all of a sudden jump in with both feet and adopt technology. And how do they go about doing it? There are a few different ways. You can build it yourself. You can go and find third-party vendors that provide it to you. But ultimately, regardless of how you do it, and you can do either of those, what you're looking to do is tap into the network effect. Historically, in a voice business, which secondary muni trading was, you're not getting that network effect. And so now firms, whether you're talking about retail wealth management firms, the largest asset managers in the world, or broker dealers, are leveraging technology, whether it's ATSs like Tradeweb or building out direct APIs between various counterparties. We're seeing those proliferate. Now the other thing that's important in all of this is that the liquidity providers in the space are starting to deploy algorithms. And if you look at the top liquidity providers in secondary muni trading, just within the Tradeweb Direct ATS, of the top 10 of them, I'd say 75% are deploying algorithms in some way, shape or form. And again, it may not be entirely algorithmic trading, but they are leveraging algorithms to enhance the outcomes of the human traders that are there. And it's had a marked effect on liquidity in the market. There's an increase in the number of bids that customers are receiving on bids wanted that are going out there. You're seeing spread compression start to reappear; obviously in '21, in that ultra-low interest rate environment, it was particularly acute. But you see that even amidst the volatility, these algorithmic players are able to stay in the market and keep up with fast-moving markets much more so than a human will. So to your point about technology and algorithms and artificial intelligence being enablers, not replacers, I think we've certainly seen that, and they've made for a much more efficient marketplace. I'll talk about two trends that we're seeing. The first is people are dying for data. They're voracious consumers of data, whether you're talking about the sell side or the buy side. And so that's an acute problem that people have. We are certainly trying to service them as best we can in that space by delivering the data to them in any way that they need to consume it, and also by leveraging AI ourselves. Obviously, given the robust marketplace that we have, about 20% of all trades reported to the MSRB are flowing through our platform, so we're able to deliver them key insights in virtually real time that other folks can't. So the consumption of data and being able to leverage it is huge. The second thing that we're seeing, on the asset management side, is sort of a divergence in outcomes between the firms that are adopting technology and the firms that aren't. The firms that have put their best foot forward and are investing heavily in technology are able to automate a lot of the processes that they have, whether it's portfolio optimization or trading of odd lots.
And as a result of that, they're able to start not only delivering better returns to their clients, because they're able to deploy capital faster than their counterparts, but they're able to start charging less. So we talk about spread compression in the marketplace itself. There's also fee compression when you look at separate accounts. And someone on an earlier panel was talking about fund outflows and some of that money moving into ETFs. A lot of it is going into separately managed accounts as well. And those SMA providers, given the sheer number of accounts that they're managing, need to deploy technology to have success. And you're starting to see this divergence, where firms that are deploying technology are having better outcomes than firms that are not.

Abhishek Lodha (35:08):

Yeah, and that's an interesting point, the fee compression point. Right now you have large asset managers charging single-digit basis point fees for managing this money, in a market where you have so many mid-tier and small asset managers. If you don't think of technology to help enable you or democratize that access and workflow, you're going to get killed. Definitely. Sorry, I want to make sure, Gregg, you get in your comment.

Gregg Bienstock (35:36):

Yeah, just on the secondary side. So one of the things that we're doing now as part of Solve, and one of the capabilities that they have, in fact it's how it's been built, is their ability to take unstructured data, i.e., messages that are out in the market, parse them and create structured data out of that. And it's across asset classes. And so one of the things that we'll be adding in relatively short order to our secondary pricing platform and our muni trade ticker is not only the ability to have real-time trade activity flowing through there, but also the ability to access bids and bids wanted and get a sense of depth of market. So I think, to your point, there's this insatiable desire for information and data, and there are just different ways to meet it. On the other point you mentioned, when you think about the buy side and portfolio management, I couldn't agree with you more, right? The demand for technology tools, from certain firms and from others who have, I guess, a different view of how the world can operate, is pretty important. The one piece that I'll say, and I think you kind of danced around it a little bit, is this whole idea of relative value as these firms are looking for where to deploy capital. Having the ability to make that assessment based on real-time market information is highly critical.
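The unstructured-to-structured idea Gregg mentions can be pictured with a minimal Python sketch that parses a bid-wanted style message into fields. The message format, regex and field names are hypothetical; real dealer messages vary widely, and this is not Solve's parser.

```python
# A minimal sketch (not Solve's parser) of turning an unstructured bid-wanted
# style message into structured data. The message format and fields are made up;
# real messages need far more handling and a review queue for misses.
import re

PATTERN = re.compile(
    r"(?P<size>\d+)M\s+(?P<cusip>[0-9A-Z]{9})\s+(?P<desc>.+?)\s+"
    r"(?P<coupon>\d+\.\d+)\s+due\s+(?P<maturity>\S+)\s+bids by\s+(?P<deadline>\S+)"
)

def parse_bid_wanted(message: str):
    """Return a structured record, or None so unparsed messages go to review."""
    m = PATTERN.search(message)
    if not m:
        return None
    rec = m.groupdict()
    rec["size"] = int(rec["size"]) * 1_000   # "M" is bond shorthand for thousands of face value
    rec["coupon"] = float(rec["coupon"])
    return rec

print(parse_bid_wanted("BWIC 250M 123456AB7 Anytown GO 5.00 due 8/1/2035 bids by 11:00"))
```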

Abhishek Lodha (37:02):

And especially at scale, to your point around SMAs. Now there's the shift of retail, which has turned into this SMA paradigm in the market, which is growing, and you have the ETF market, which is significantly growing as well. How do you deploy capital at scale? The way we think about it, it's a very complex mathematical problem at the end of the day. How are you aggregating different compliance rules and requirements across different accounts that are part of the same strategy, each with its own rules, which all flows into what's available in the market? So, one question I do have for you: you mentioned liquidity providers earlier. Can you shed more light on what broker dealers and liquidity providers are doing? Because electronification of offerings, like the FIX connections that we've spoken about for years, was not as active. So what's changed, for smaller entities especially?

Bill Buzaid (37:54):

Yeah, no, I think it's a great question. Obviously, as I was saying, there's been a proliferation of algorithmic players who have provided meaningful liquidity, and they are ingesting a ton of data. So I think that is the biggest trend change. One of the challenges that those firms face is that, yes, a lot of them are looking at the same data. Most people are using some version of the MSRB tape supplemented by their own decision making to decide on things. So there are a lot of different data products out there. One of the things that we've been doing is working on an AI price that is going to do a better job of approximating a real-time valuation for a bond. You made this point, Abhishek, earlier about how big the universe is and how little of it trades. It's about 2% of the universe that's trading on a given day. So you don't have a lot of real-time data points there. And so we're trying to solve that problem and deliver an AI price that can feed these algorithms, and can feed even folks who are using out-of-the-box solutions from other providers, such that they can respond and, to your point, Gregg, about relative value, participate and democratize. The other key point is that even though the top players in the space that are providing liquidity are deploying algorithms, one of the things that technology has done is democratize liquidity provision. You have small registered investment advisors and retail financial advisors that now are able to provide liquidity, meaning they're able to respond to requests or bids wanted that come out on securities, and helping them price those so they can make those intelligent, relative value decisions is allowing them to buy bonds on the bid side, whereas they used to have to buy them on the offer side. So it's really exciting to just see that transition.
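To illustrate the underlying problem Bill describes, roughly 2% of CUSIPs trading on a given day, here is a naive Python sketch that infers a yield for an untraded bond from same-day trades in similar bonds. It is a toy comparables average under made-up fields and weights, not Tradeweb's AI price.

```python
# A naive sketch of pricing a bond that didn't trade today: infer a yield from
# today's trades in similar bonds, weighting closer maturities more heavily.
# Toy comparables average only; not Tradeweb's model.
from dataclasses import dataclass

@dataclass
class Trade:
    cusip: str
    yield_pct: float
    maturity_year: int
    rating: str
    sector: str

def estimate_yield(maturity_year: int, rating: str, sector: str, todays_trades: list):
    """Weight same-bucket trades by maturity proximity; None means no comps found."""
    num = den = 0.0
    for t in todays_trades:
        if t.rating != rating or t.sector != sector:
            continue
        weight = 1.0 / (1 + abs(t.maturity_year - maturity_year))
        num += weight * t.yield_pct
        den += weight
    return num / den if den else None    # caller falls back to evaluated pricing

trades = [Trade("AAA111AA1", 3.05, 2033, "AA", "GO"),
          Trade("BBB222BB2", 3.20, 2036, "AA", "GO")]
print(round(estimate_yield(2034, "AA", "GO", trades), 3))   # -> 3.11
```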

Abhishek Lodha (39:52):

Before jumping into the primary market, I'd love to also get an opinion, if anyone has one, on credit analysis. We spoke about portfolio management, we spoke about trading; this is open to everyone. Has there been any change in technology around credit analysis, and how should analysts be thinking about it beyond what's happening with FDTA now? So Jeff, if you want to add?

Jeffrey Previdi (40:14):

Yeah, I mean, from my time on the board and prior to that as a credit analyst, I think it did come down, from a workflow perspective, to the issue of getting the right data onto the screens of the analysts, and "right data" I put in quotes, right? Because it comes back to the data issues we talked about before and the challenges of those. And from the conversations I have, it still is a little bit more of a manual process and a feel process than people wish it was, right? And so I think the interesting thing is, will FDTA make that easier? Will what results from FDTA make a difference? Or are we still going to be in our analytic box, making our determinations the way we made our determinations before, without the benefits of efficiencies that could be there?

Abhishek Lodha (41:05):

What we call internally the swiveling, right? As an analyst, I'm looking at six or seven different sources of information. How do I reduce that to two or three so that I'm making my decisions not just better, but also faster? Because again, I have a lot more credits to cover than any of my counterparts in the corporate space. So how do we do that? And as I keep saying, I made this joke last year: I want to build Iron Man suits for my credit analysts. They should be empowered to quickly look through credits instead of spending time aggregating data. So jumping onto the primary market, I think that's another aspect where we're seeing a lot of movement. So let's start off with you on what you are seeing in terms of trends around new issue pricing, as well as technologies that can help the sell side in general, maybe issuers and bankers.

William Kim (41:57):

Sure. I think this goes back to the point on data and information. What clients or market participants want is data that they're used to, data that they can use. So instead of poring through 50 different OSs and putting together a debt profile, it's having that information structured in the way that issuers want to see it, in the way that investors want to see it, as well as in every data format that they might use. So whether that's in Excel, or in a DVC format, or some other open data standard, you want to have that be available to participants. So we do a lot of technology work in terms of getting all that information and parsing it. And for the end user, what we're seeing in terms of adoption is that their workflow is the same: whether they're asking their analyst or associate or a research team to do it, or whether they're getting it instantly from us, they're getting the same product. If anything, they're getting a higher quality product, because there's no variation between people; there are experts looking at this, vetting it and checking it. And so I think the best technology is kind of hidden, behind the scenes. You don't have to look through all the details of how it's done. It's just there and ready to use. And so that's kind of where we see the primary market in terms of pricing. We've done some work on analyzing secondary trading. We've looked at break pricing, where you have a new issue and then you have secondary trading the same day off of it, and how those spreads change; obviously those are different for high yield versus higher quality credits. All of that can be done with technology. But again, I would take this back to the recent settlement that just happened between a different vendor and the SEC. We can offer these tech tools, but one of the questions that we have is, will we be liable for what we're doing? When I build an AI platform and I produce a price and somebody loses some money on a trade, is that my fault? Do I have to pay the SEC millions of dollars or not? And the same goes for the FDTA, right? We would love to offer these free tools to pull out, in an open and collaborative way, all the financial information from issuers' financials. But I can't do that unless I work with regulators to make sure that I'm not held liable if there's any issue. And so it's not just technology, it's not just coding, it's also working with industry players on the sell side and buy side and with regulators.
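The break pricing analysis Will mentions, comparing where a new issue priced to where the same bonds trade in the secondary market that day, can be sketched in a few lines of Python. The function and example numbers below are purely illustrative assumptions.

```python
# A minimal sketch of the break pricing comparison: where a maturity priced on
# the new issue versus where it trades in the secondary market the same day.
def break_spread_bps(new_issue_yield: float, same_day_trade_yields: list):
    """Average same-day secondary yield minus the pricing yield, in basis points.
    Negative means the bonds broke tighter (richer) than they were priced."""
    if not same_day_trade_yields:
        return None
    avg_secondary = sum(same_day_trade_yields) / len(same_day_trade_yields)
    return (avg_secondary - new_issue_yield) * 100

# priced at 3.25%, traded the same afternoon at 3.18% and 3.20%
print(break_spread_bps(3.25, [3.18, 3.20]))   # -> roughly -6.0 bps
```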

Abhishek Lodha (44:26):

Gregg, you?

Gregg Bienstock (44:28):

Yeah, sure. So when it comes to the new issue side of the market, initially our kind of help to the market was around the 15c2-12 piece: the idea of making sure that we're supporting underwriters, bankers, issuers and MAs as they're serving their clients or themselves with regard to the reasonable diligence required under 15c2-12. Where we've tacked to more recently, and that is still part of our business, is the idea of creating greater efficiencies for folks in the banking community, and I use the broader rubric there of municipal advisors, bankers, analysts and associates, and the idea of creating tools to help with new issue pricing. And essentially what we've done is a platform, so we're not suggesting pricing to anyone, but what we're doing is giving a user the ability to input either a market segment that they care about or a specific deal that they want to structure, where they want to see the comps in the marketplace, and the ability to put in information and within seconds generate a scale and then have full transparency into all the underlying data. And then also the ability to control variables. What is it you want to see? What is it you want to compare to? Do you want to tighten a credit rating? Do you want to tighten a call schedule? Do you want to change a call schedule? Do you want to compare these? The idea of what we did, and I will say it was not ours, it was a client of ours who said, I have too many damn deals, my bankers are driving me crazy asking for scales, help. And so it was the idea of doing that. Where we've stayed away from is the idea of saying, this is where the deal has to price. At the end of the day, the underwriter, she or he, is going to be the one who has that final feel of the market. No matter how current our data is, they're going to know that touch and feel just a little bit better. But we've done that piece. And then the other piece on the new issue side that's been important, and again, it's all about efficiency, when a client says to you, just save 90% of my time, is this whole idea of analytics around debt: the idea of being able to understand all the outstanding debt for any obligor in the marketplace at the click of a mouse, and being able to look at debt service as well. So what we look at, and it's all the feedback that we hear from our clients, is how can we make you all more efficient? And someone mentioned this earlier, right? Getting away from the data aggregation piece and being able to do the deep analytical work that you want to pay people to do.
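As a rough picture of the comps-driven scale generation Gregg describes, here is a minimal Python sketch in which the user chooses the filters (rating, callability), and the tool returns a yield per maturity from the matching prints, with the underlying comps available for transparency. The field names, the median statistic and the sample data are assumptions for illustration, not DIVER's methodology.

```python
# A minimal sketch (not DIVER's methodology) of a comps-driven scale: the user
# picks the filters, the tool buckets matching prints by maturity and returns a
# yield per maturity, with the underlying comps kept visible for transparency.
from collections import defaultdict
from statistics import median

def build_scale(comps: list, rating: str, callable_only=None) -> dict:
    """Return {maturity_year: median_yield} from the comps the user selects."""
    buckets = defaultdict(list)
    for c in comps:
        if c["rating"] != rating:
            continue
        if callable_only is not None and c["callable"] != callable_only:
            continue
        buckets[c["maturity_year"]].append(c["yield_pct"])
    return {year: round(median(ys), 3) for year, ys in sorted(buckets.items())}

comps = [
    {"rating": "AA", "callable": True, "maturity_year": 2030, "yield_pct": 2.91},
    {"rating": "AA", "callable": True, "maturity_year": 2030, "yield_pct": 2.95},
    {"rating": "AA", "callable": True, "maturity_year": 2035, "yield_pct": 3.22},
]
print(build_scale(comps, rating="AA", callable_only=True))   # -> {2030: 2.93, 2035: 3.22}
```

Loosening or tightening the filters, as in the rating and call schedule controls Gregg mentions, simply changes which prints land in each maturity bucket.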

Abhishek Lodha (46:59):

And those were great comments, thank you. I think one of the things that I also see in terms of change in technology is really this mindset around meeting the client where they are. So data or information or technology is being delivered where they need it. That was not the case historically, when applications and systems were more monolithic. It also means that applications have become more integrable and open, and I think that's a huge change that benefits everyone in the industry. And it's not just our industry, it's across the board. Whereas earlier, if you were a Microsoft shop, you would use Skype, you would use Word, and you were stuck in their environment. That's not the case now, right? With APIs, everything is integrating. So that's a huge change and benefit to the market. So just in the interest of time, 30 seconds each, I'm going to ask you one final question. We've spoken a lot about how technology has changed. How should market participants think about onboarding technology? What should they be mindful of as they're onboarding technology? Bill, let's start with you.

Bill Buzaid (48:12):

I would just say, because I did talk about the divergence before of firms that are investing in it versus those that aren't: don't let it be too daunting. There are lots of different ways to solve the problem. You don't need to build something yourself. You can buy it. And honestly, given a lot of the tools that you have already, people can help you cobble together a solution that works. So don't be intimidated by the challenge. There are a lot of different ways to do it.

Abhishek Lodha (48:37):

William?

William Kim (48:39):

You can save time, you can save money and you can make money. It's just an enabler, not a detractor. And I think you'd be surprised how far we've come.

Abhishek Lodha (48:49):

It's true. Jeff?

Jeffrey Previdi (48:51):

Well, as the non-tech person, I'm going to say I'm not going to give you a tech answer, but I will say that even as a member of GASB, what I'm finding is that I'm an accounting standard setter and I'm having to think about technology. I'm having to think about the effects that my decisions are having based on the technology that's being used to consume them. It's becoming so pervasive. Whereas before I could say that's my tech department's issue, let them deal with it, I think it's becoming all of our issue as we think about the pervasiveness of tech, how it affects how we do our jobs, and how I have to visualize in my head how those numbers are going to look based on the different ways they may be consumed.

Abhishek Lodha (49:34):

And I think that's a great point. Every company is a tech company now, and I think we should be thinking that way. Gregg?

Gregg Bienstock (49:40):

Technology is not replacing people; it's making us more efficient. Will used the word enabler. I know in some corners there's this fear that your application is going to take my job, but it's going to make your job easier and it's going to allow you to do more value-added work. So it's just a reality, and it's not just in our space. It's in every aspect of our life.

Abhishek Lodha (50:04):

Excellent. Great. Thank you. So we're going to open it up for a few questions if there are any.

Gregg Bienstock (50:18):

So we have five minutes to talk about AI now, right? Yeah.

Abhishek Lodha (50:20):

We can talk about AI or blockchain.

Bill Buzaid (50:22):

We've got a question over here, a couple of questions.

Audience Member 1 (50:25):

So you generated your scales through your software, your algorithms, whatever you want to call it. How far off from final pricing, how close, were you with your scales to the actual pricing?

Gregg Bienstock (50:38):

So because it's not set in terms of all the parameters, we'll do an initial set based on what you input, but because the user has control, that's really going to be up to the user. They have the ability to take control. So that's one aspect of it. It's not like we generate it and therefore that's it. You have variables to control. So if you decide, for example, to only include primary market prints, or to include secondary market prints greater than a million, those are going to hit your numbers. But you have the ability to control those data points based on your expertise and your feel of the market. And the other aspect, just related to that, is that clients will use the system also to do what we call a secondary market analysis, which is where it priced and then where it is trading two weeks later, for all the obvious reasons. Yeah.

Abhishek Lodha (51:31):

Yeah. I hate to cut us off. We're over time. So I'd like to thank all my panelists. Thank you for joining and making this a vibrant discussion. Thank you for participating and we're here to chat later on.