
GT - Behavioral Analysis from DNS and Network Traffic - Josh Pyorre

BSides Las Vegas · 42:02 · Published 2017-08
About this talk
Ground Truth track, BSidesLV 2017 - Tuscany Hotel - July 25, 2017
Transcript [en]

Now I'll pass over to Josh. This is going to be an excellent talk about behavioral analysis using DNS and network traffic. Thank you. Alright, so I'm going to talk about this. It's a work in progress; I'm still doing stuff, and I have code that I've released that you can play with at the end as well. I'll walk through my thought process and what I'm working on. A little bit about my history: I worked at NASA, helping build their security operations center. I worked there for five years, took a break and went over to Mandiant, then ended up going back to NASA for a little while, and then

joined OpenDNS, and then they were acquired by Cisco, so I do DNS stuff and security research at Cisco Umbrella, formerly OpenDNS. I've been doing security for maybe 20 years or so. So when you think about behavioral analysis, you might think about this kind of thing, where you're tracking how a user browses a website: where they go, where they might click, where they might look, things like that. But there's also much more exciting stuff: you might look for weird behaviors in the movement of people through traffic, things like that. Oh, it's kind of cut off. Well, hopefully it's okay; I'm using a different slide mechanism because I'm using a Linux laptop, and that's exciting,

I guess. So first I'll talk about some detection methods that we currently have, some basic stuff: we've got IDS, antivirus, and people. IDS, of course, is based on signatures. This is a signature looking for TCP traffic from the external network to your internal network, with a message; it's looking for PHP content and some regular expression stuff. But it's a rule that has been written to find stuff you know exists already, and it requires human intervention: you need to see something that has already happened, or know about something that could potentially happen, and try to guess what's going to show up on the wire. So it catches known threats; it's not exactly predictive. That's what we're

trying to do: be predictive, or at least be faster. Antivirus, of course, is rule-based and usually late to the game, because new malware variants are always being written, and they have to be caught and classified, and the samples have to be downloaded, things like that. It's host-based, and it looks for known threats. We're working with things like CSV files, and I'll be showing you some examples of that; pcap files, I do that a little bit and we're still working on that; Active Directory logs, I tackle those just a little bit; proxy logs, DNS logs, any kind of logs I can get. Typically I get a log file; sometimes I don't have the wire,

I'm not on the wire, but I get a log file from a company. I might get it in CSV, or they might give me something like an Excel spreadsheet, which is terrible. And there are some things you can do to get an idea of things: I can get a count of how many visits there were to a certain domain, or a bunch of domains, and I can graph that to get an idea of what's happening inside that environment. It's not so great, as I'll show in a little bit, when it's huge. So one of the things you want to do,

when you're looking through that CSV or that log file, is to find normal. Define normal so you can clear it out of there. So how do you find it? Sorry this text is small, but it's just a bunch of domains; they're all suspicious, they're DNS tunneling domains, and you could just watch it and scroll through. But you may have something more like this, where it's somewhat normal-looking traffic going over your network, or in that log file or whatever, and then occasionally there's a weird domain, like that long DNS tunneling domain. How do you find that in this kind of thing? And I spend a lot of time

dealing with this kind of thing. I pull log files from the resolvers of OpenDNS, and, you know, it's terrible; you've probably all done it. How do you find it in that CSV file? And they're all different, too. So one of the first things I tried was counting the number of times a domain is seen. I'm sorry that it's cut off; if this bugs you I can always adjust it, and it'll be a little weird for me, but I could probably do it. We'll see. So on this one particular log file, I don't remember where I got it, I'm counting the number of domains. For example, I'm looking at Facebook, and

maybe I'm including all the subdomains of Facebook, and I'm just counting those. Maybe I'll graph that just to get an idea of stuff; that's not really that useful. I might count other domains just to see what's in there. There are a couple of mistakes in there, but anyway, I'll remove what I think is normal: maybe it's above a certain threshold of visits. Looking at the domains visited just once is what I want to try. So I'm left with this slightly smaller list. It's still difficult, and I'm going to show the manual process that we all do. So I grab a random selection from that list, and I've got this, and I just look at it, and my gut says, you know,
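The counting step described here is easy to sketch in Python with a `Counter`; the function names and the threshold below are illustrative, not from the talk's released code:

```python
from collections import Counter

def rare_domains(domains, normal_threshold=10):
    """Count how often each domain appears in a query log and keep
    only the ones seen fewer times than the threshold -- the idea
    being that very popular domains are probably 'normal'."""
    counts = Counter(domains)
    return {d: c for d, c in counts.items() if c < normal_threshold}

def seen_once(domains):
    """Domains visited exactly once -- the slice the talk starts with."""
    counts = Counter(domains)
    return sorted(d for d, c in counts.items() if c == 1)
```

Feeding this the domain column of a parsed log gives the "slightly smaller list" to triage by hand.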

these highlighted ones probably aren't bad, so I'm going to remove those. These highlighted ones are really bad; I'm going to remove those too. So I'm left with these. I look at this one; maybe I visit it in Tor, or I check it out in one of the many ways you can, and I see it's some kind of healthcare site, so I'm just going to remove that. I look at this one and I can see there's a default Apache page; I do some searching, and someone in VirusTotal thought it was SonicWall something, so, cool. And this one, mom.me, it's a parenting

blog. That doesn't mean it's not bad; it could be compromised, but I'm just clearing domains out. So I've got this one: I'm not able to connect, there's no HTTP content. Maybe I scan it for other things, but just looking it up, it's Microsoft. I look this one up, same thing; it's some kind of Verizon spam thing. Maybe that's bad to you; I don't know, I don't really care. So I've got this one: no page, no content. I look it up, there's a little bit of something at ThreatCrowd, but I'm not getting much else, so I'm sort of getting a feeling it's not bad. And this one is forbidden; that doesn't mean

it's bad or good or anything; there's just no content there. But looking at it with just a Google search, I can get some hits, and all of a sudden, on a malware analysis site, there are some files associated with it, at Hybrid Analysis and things like that. So you can see it's been identified, using another source, by various AV vendors as malicious. VirusTotal sees all these URLs that look just like what Locky would do. And, using Investigate, which is a Cisco thing, previously an OpenDNS thing, you can see there's a spike in traffic at one point, and before that there was nothing. So we got one; it's cool.

It's a lot of work; the process is really tedious, and sometimes you can spend all day doing that. It's mostly manual and requires expertise: you need that gut feeling, and that's just a pain. So part of the solution is, hopefully, automation. I dream of programming myself out of a job. I started as an analyst a long time ago, and if one day my assignment is just maintaining the machines that do whatever it is I do, I'll be kind of sad; I'll probably quit and go to the forest. Anyway, okay, so what do we automate? I'm going to be auto-cleaning logs and network streams, auto-processing that stuff. I'm going to find ways to remove normal,

like I said, without me having to do it manually like I did. I'm going to categorize things; that's a big part of analyzing the behavior of what's happening inside an environment. Save it to a workable whatever; I'm using MongoDB as my whatever. And I'll do some visualization using D3 and some JavaScript libraries, with a Flask framework. All this stuff, like I said, is on GitHub. So, the files are all different; we're looking at different kinds of files. For DNS we've got different timestamp formats, items in multiple or different locations, and we're trying to pull them out of this raw text, so you have to do some manual work initially.

I'm working on the next step, which will be some natural language processing and some machine learning stuff, but I'm still new to that, so at the moment we still have to do some manual work. So if you want to try out my code at some point, you might have to move it around a little bit. You'll see with Active Directory you get stuff like this, and you can pull it out and then process it to get what you need, but you're going to have to do some manual work there. System logs, if you want to mess with those; I actually look at some auth logs coming up in a little bit. HTTP logs, if you want to analyze that

kind of thing; there are plenty of services that will do this, but if you want to find behavior, maybe there's additional work there. So the manual process, as I've explained, needs to go away. You want to use what makes sense to you, and like I said, this is Excel: you pull in a huge file, this one is probably only 300 megabytes, and you wait, and you can see, well, actually you can't see it, but the progress counter is over a hundred percent. It wants to import it into multiple worksheets. So then you force-quit Excel at that point and move on to the next thing. So please don't

give people log files in Excel. I mean, sales will always do it. Then there's R, which I would have liked to play around with a lot, but I have not been using it in this; it's a lot faster than what I have been using, which is Python. So I'm doing everything with Python and some shell scripting, but I also have some tools: I use matplotlib to do some initial graphing, and then I use pandas to move my data around. Plotly I used initially; I've moved away from that, but I am using the offline version of Plotly. D3 and C3. And one library I use heavily, which I just started

using a couple of weeks ago, because I'm constantly working on this thing, is MetricsGraphics, for some timeline visualization. And then some open-source tools. This is one thing that bothers me: I work for a big company, and I've worked for big companies, and they want to sell their product, but I believe you should be able to do all this stuff for free; I believe this should be open source. Unfortunately I'm still using some closed-source stuff, but I'll get to that, and I'm working on trying to find sources for my data that are open. I'm using pyasn, which is like an

offline ASN lookup. I'd like to make it so that this is pretty much all offline once you've done an initial processing of your logs. This is terribly small, but it just gives you a view: it looks up a domain, the IP it's sitting on, the ASN, and who owns the IP; handy for a quick lookup. Python-whois, and NetworkX as well, to graph things. So in that exact same example, whatever I type in there, it's going to pop up a little graph of that same kind of ASN and IP information, so you can see those. We're starting to get some visualization of what's going on with our stuff.

So if you have a whole bunch of this, maybe you can find that there are some weird anomalies in there, and there are ways to make it fancier. I get fancier, but not as fancy as I could get. Then there are some paid tools. I'm using OpenDNS Investigate; I would say that, I work there, but it's a valuable resource. It looks like this; this is not a vendor talk, I'm just giving you an idea of how it works. You can see all the traffic going through the resolvers; it's like 3% of the internet, maybe a little more than that now. You can get some other data, like the email address that registered a domain. I don't know

why I picked this domain, but you can get the IP it's on, and the IPs it was on before, some passive-DNS kind of data. You can get the geolocation, who's been visiting it, and some information on them too, so it's cool. And you get co-occurring domains: things that are contacted at the exact same time, or very close in time, which can help you find some interesting things about that domain and the domains it's connected to. And this is just the web interface; there's a whole API, so I use that heavily. Then VirusTotal I'm using as well. If you do it free, you can

send four requests a minute, which is good for testing, but if you have a paid one you can send as many as you want. It has cool stuff: for example, if you use their web interface and you look for that same domain, I don't know why I did, I should have picked an interesting domain, maybe it was interesting to me, you can go look at samples that are associated with the domain. Maybe they were found there, maybe they're calling out to it, I don't know, but you can pull all this data, including other stuff like categorization, from VirusTotal. You can

also get a categorization from Investigate. I want to find categorization in an open-source form, but I can't seem to find a reliable source right now, because it's pretty much human-based; people have to categorize things. So, I'll find ways to remove normal. I'll create a baseline; like I said before, I'm counting things and removing popular. Initially I'll start with the data from OpenDNS, which, like I said, is approximately three percent of the internet. So how do you find normal? Can you find it? We'll try. So what's this? This is some text from one of the resolvers; I just pulled ten minutes down,

or something like that; it's a bunch of different calls to different domains. We'll try it anyway: if a domain is queried over ten times, we'll write it to normal traffic; under three, we'll write it to suspicious traffic. A bit suspect? Yeah, but it's going to look like this: nine minutes of data, with the timestamp, the domain, and some other stuff like the record type; those are useful. This is what it looks like: 664,000 lines, which is a lot of lines in nine minutes. When I do that, normal traffic ends up being about 250 K, and suspicious is still a huge six point

four megabytes. I look at the normal traffic, and it's kind of what I expect: there's a lot of Google in this little screenshot I have here. Suspicious, well, maybe it's suspicious; there's some stuff in there. There's a Russian website, "one style" something, which I would think is suspicious, because I'm suspicious of Russian websites, but I look at it through Investigate and it looks kind of like a normal traffic pattern, no big weird spikes. I go visit it; it's just the one site. That doesn't mean it's not compromised, but anyway. Then there's another site like this one, which, I'm sorry, is cut off,

but take a good look at this one: it was blocked by someone at my work, probably, for being a dropper for Locky. But there was something in there, and I could look at it: a takedown page, I think that's what it says. That was a Russian site, or actually that was Kazakhstan. Okay, so I thought maybe there would be something better, since that's kind of labor-intensive (not me laboring, but the computer laboring) to remove the popular domains and stuff, so I thought maybe I'll use a top domains list. Alexa used to provide a top 1 million list of the most popular domains at this very moment, to people,

for free, which is awesome, but then I think they started charging for it. So then we at OpenDNS/Cisco were like, we have this data, we can do this, so we released it to people. There are some other ways you can do it besides that, too: if you search for the most popular websites, you'll find a Wikipedia entry, and you could probably scrape that too, which I wouldn't really want to do. But you can go and get the top 1 million at

any time from this place, and it's just a CSV file, and you can use that to compare to your data. So doing that, with just a simple little bit of Python code, just comparing, and this is cut off, but using that I get two files, not-in-top-domains and in-top-domains, and it changes the nature of things a little bit. Looking at the two files, I can see there's a bunch of google.com stuff on one side, and on the other side there are some interesting things; the right side looks a little more
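A minimal sketch of that top-1-million comparison, assuming a `rank,domain` CSV like the Umbrella list; the naive last-two-labels matching is my simplification (real code should use the Public Suffix List):

```python
import csv

def load_top_domains(path):
    """Load a top-1-million CSV (rank,domain per row) into a set."""
    with open(path, newline="") as f:
        return {row[1].strip().lower() for row in csv.reader(f) if len(row) > 1}

def registered_domain(fqdn):
    """Naive 'last two labels' reduction so www.google.com matches
    google.com; a real tool would use the Public Suffix List."""
    parts = fqdn.lower().rstrip(".").split(".")
    return ".".join(parts[-2:])

def split_by_popularity(domains, top):
    """Split observed domains into the two files the talk describes:
    in the top list and not in the top list."""
    in_top, not_top = [], []
    for d in domains:
        (in_top if registered_domain(d) in top else not_top).append(d)
    return in_top, not_top
```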

suspicious: things that aren't as popular. So at least you can cut away some of that noise. You may still be missing out on something, but our data has been reduced. So, security categorization. This isn't categorization like "this person is visiting blogs"; it's "this person visited malware sites" or "this connection is out to botnet stuff". Unfortunately, in most cases you have to use a third-party service at this time, but you can take that data, this is still using that OpenDNS data, and do some security categorization. So out of all that nine minutes, you know, there's some botnet traffic and

dynamic DNS. There are no counts here, but you can see there's a huge amount of blank, no categorization, which you could process another way; you may need to use multiple third parties. You can also do a timeline. This one is terrible, but you can kind of look at the bandwidth of it to see what's going on throughout the day, or in the nine minutes. It's random data, though, from multiple sources, and I'm sorry to put you through that; I just showed you all the data from OpenDNS for nine minutes, so it's random, and it's kind of hard to do behavioral analysis without context. So we're going to narrow the focus.
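The timeline idea, bucketing query timestamps per hour, might look roughly like this (the timestamp format is an assumption about the log in question):

```python
from collections import Counter
from datetime import datetime

def queries_per_hour(timestamps, fmt="%Y-%m-%d %H:%M:%S"):
    """Bucket query timestamps into per-hour counts -- the raw
    material for the bandwidth/timeline graphs shown in the talk."""
    buckets = Counter()
    for ts in timestamps:
        dt = datetime.strptime(ts, fmt)
        buckets[dt.strftime("%Y-%m-%d %H:00")] += 1
    return dict(buckets)
```

The resulting dict plots directly with matplotlib or MetricsGraphics.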

So this is from one organization. I don't remember where I got these logs, but it's a lot of lines, like 12 hours or 44 hours. You can see it's really small, I'm sorry, but it's a whole bunch of domains, and there are a lot of DirecTV visits, so I could go in there and remove that stuff. I clean it up a little bit; some of it's manual, but you can do this automatically, and I eventually get it down to about 320 lines of domains that are suspicious. I can look at that, and I can see this one has some mistakes in it, like there's a .com and a dot there.

I can see there's a 666-business, yeah, creepy, but there are some other ones, like Def Con on there on the very far right. It takes a little while; initially it was a two-gigabyte file. But then I decided I wanted to try running this on streaming data, instead of taking log files and doing static stuff, and to have it automatic. I didn't have a DNS server running in my house yet, but I had access to the traffic at my router, so I ran tcpdump, dumped it to a file, and would look at that. So I'm just on a tap,

and it looks like this: a terrible mess, but you can see there are domains there, and there are timestamps, and that's what I want. In this case I can remove some things using the top-domains list, or remove normal in other ways, and I'm left with a bunch of one-time visits that I can scroll through. This one is kind of weird: it's just an IP, and since it was a tcpdump capture, I got an IP; it's not DNS. I can look at the number of times I visited something, and I can see there's a lot of access to Private Internet Access, which is a VPN client, or

VPN service, so that might be suspicious to me if I'm not expecting it. I was expecting it, because that's what I do, and I'm watching myself, and you're watching me watch myself. So we can compare activity. This is kind of rudimentary, but I'm looking at the bandwidth of my traffic throughout the day. This is from earlier in September; it's 24 hours, and you can see I'm at a concert right there. I know this, you don't, but you can see I'm probably not home; maybe there's less traffic, though I have someone else at the house. And you can see when I'm at work, and you can

kind of make some assumptions about when I'm home, and so on; there are some patterns, and if you get enough of this you could probably find something interesting. Sorry about the color change; I moved to a different server when I was processing this. It's not a lot of traffic when I'm out of town; still out of town. This is a much better graph than those other crappy graphs: I'm home right here, I'm at work right there; the time is weird, I guess I was doing nothing before that; and then I'm at a concert, I like to go to concerts, and then I was kind of pumped from the concert. There's a lot of Netflix. You

know, you know that because I'm telling you, but maybe you're like: why, when he came home, was there no traffic, and then a little bit of traffic, and then all of a sudden all that Netflix? So you can look at it at a bigger scale, and if you were to play this much faster, you'd get a good idea of what kind of things I'm into; I mean, what kind of activity, my day-to-day internet activity, and what machines I might have on my network, if I have a lot of noisy machines, things like that. So then I was like, I need to get better at this. I installed Pi-hole,

which is a DNS server for Raspberry Pi. I installed it on a server and started sending all my data through it, so I could get the log file in a much cleaner way. It rotates the log every day, so I just have a cron job running, saving it to this log file; it's been doing that for about nine months or so, and I just process it through all my stuff. It looks like this: in the beginning, at the top, it's cut off, but it makes sense when you can see the whole slide. It started in January, or maybe September, I can't remember when it started, but I have data, in my demo
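Parsing that Pi-hole log could look roughly like this. Pi-hole writes dnsmasq-format lines, and the pattern below is an assumption about that shape that may need adjusting for your version:

```python
import re

# Assumed dnsmasq-style query line, e.g.:
#   Jul 25 13:01:02 dnsmasq[1234]: query[A] example.com from 192.168.1.50
QUERY_RE = re.compile(
    r"^(?P<ts>\w{3}\s+\d+\s[\d:]{8}).*query\[(?P<rtype>\w+)\]\s+"
    r"(?P<domain>\S+)\s+from\s+(?P<client>\S+)"
)

def parse_query_line(line):
    """Return a dict of timestamp, record type, domain, and client,
    or None for non-query lines (cache hits, forwards, etc.)."""
    m = QUERY_RE.search(line)
    return m.groupdict() if m else None
```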

that I'll show you, up to yesterday. So here's the point where we save to a workable whatever. I'm doing MongoDB, and it looks like this when you manually process it, when you just take a log file; I can do this streaming, but I chose the manual thing as an example. It starts sending things to different collections in the database, like domains in the top 1 million and domains not in the top 1 million, so I can break things out and separate them. So collection one is all the domains going through, two is in the top 1 million, and then three, and it goes a little

slower, is not in the top 1 million. And that's what it kind of looks like when I'm inserting into the thing. I tried some other visualization, like Grafana and InfluxDB; InfluxDB is a time-series database, and I thought it was cool to look at. I don't have that in the code at the moment; actually, I have a function you can uncomment to send data to it. And then Logstash, Elasticsearch, Kibana: I need to add that to my code, but I've got Logstash files and stuff that will take your data and present it in cool ways, which I'll show you soon. I used Kibana to display some things, and
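The three-collection split into MongoDB described above can be sketched as a small routing function; the collection names are placeholders, not the ones in the talk's code:

```python
def route_domain(domain, top_domains):
    """Decide which MongoDB collections a queried domain belongs in,
    mirroring the three collections described: everything, in the
    top 1 million, and not in the top 1 million."""
    collections = ["all_domains"]
    if domain.lower() in top_domains:
        collections.append("top_1m")
    else:
        collections.append("not_top_1m")
    return collections

# With pymongo, this routing would drive the inserts, roughly:
#   for coll in route_domain(d, top):
#       db[coll].insert_one({"domain": d, "ts": ts})
```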

then I decided I like to do custom stuff, which is actually a little slower to create, but it's a fun process. In InfluxDB I have it going into these different databases, and it kind of looks like that: you can see the top is all the data, the middle is in the top 1 million, the bottom is not in the top 1 million, and it's over a timeline. So I can get an idea, I mean it's still rough, but I'm starting to get a picture of maybe what my bandwidth looks like, and looking at the not-top-1-million list, it kind of helps:

I can see that maybe there's something going on, if I want to dig a little deeper. Then I did Elasticsearch; you can do a timeline kind of thing, but you can also do good dashboards, so you can get maps of where you're visiting, and the kinds of categories you're going to, which is probably the most useful part of this: what categories do I visit, and when? And then for the custom one I'm doing, I'm auto-sending streaming data into MongoDB, and then I have Flask processing out of MongoDB. I pull it, push it into MongoDB, so it's very fast; I don't have to keep processing the domains over and over

again with the third parties; the sources I have are there, and then Flask serves the content. So I can do things like this: I can look at a number of things, like requests over time, and you can do a ton of technical things with D3. You can look at Investigate security categories: these are blacklisted versus whitelisted according to OpenDNS, blacklisted being blocked domains, and then content categories, podcasts, search engines, software, technology, etc., and you can have these automatically update with the data. You can also get the VirusTotal categories; VirusTotal has a much better category engine or something, so I

ended up combining both of them: if it's not found in one, it gets it from the other. And I thought this is much more interesting if you do a time series not of the domains you visit but of the categories you visit, so I'm continuing to work on that. It looks kind of like this; small screen, sorry. This is kind of a weird way to do it, but I can zoom in and look at the categories based on the time, which gives me a better indication of what kind of behavior is going on inside my network, or the network I'm watching.
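That fall-back merge of two category sources can be sketched as follows, with plain dicts standing in for the Investigate and VirusTotal lookups:

```python
def combined_category(domain, primary, fallback, default="uncategorized"):
    """Take a category from the primary source and fall back to the
    second source when the first has nothing -- the merge described
    in the talk. `primary` and `fallback` are plain dicts standing
    in for the real API lookups."""
    return primary.get(domain) or fallback.get(domain) or default
```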

Initially, I started with things like this: counting things and making a basic web page, just a dashboard sort of thing; that file, I found, means there are no bad domains in this specific group that I looked at. And then I started making this tabular thing of data, and I've improved upon it, and I can do things like, well, I'll just show you this. Let's see here. I'll show you, like, time to domain; you can make a force graph, which is kind of cool. I put a grumpy cat in there; see, that's a little grumpy cat up there, but
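Building the edge list that feeds such a force graph might look like this; the record shape (`domain`/`ip`/`asn` keys) is an assumption:

```python
def domain_graph_edges(records):
    """Build the edge list for a domain -> IP -> ASN graph from lookup
    records shaped like {"domain": ..., "ip": ..., "asn": ...}.
    The edges can be fed straight to networkx.Graph().add_edges_from(),
    which is the library the talk uses for graphing."""
    edges = set()
    for r in records:
        edges.add((r["domain"], r["ip"]))
        edges.add((r["ip"], r["asn"]))
    return sorted(edges)
```

Domains sharing an IP or ASN end up connected through common nodes, which is what makes clusters visible in the rendered graph.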

anyway, you can use NetworkX to make a force graph. It's cool; with barely any data it's not that cool, I guess, but with a lot of data you get stuff like this. It's really busy, but you can start to see some patterns; it's just an interesting way to look at the data. This is another way of looking at it, kind of a tree map: you visited this, and it's connected to all these other things. It looks like this sometimes, terrible, so I wouldn't recommend doing it exactly like that; I'm finding better ways to do it. You can also do mapping: you

can, if you're using third-party stuff like Investigate and I think even VirusTotal, look at, like I mentioned earlier, the locations that are visiting the domains you were visiting as well, which is interesting and can give you more context. The most recent stuff, actually, I'm not sure this is the most recent stuff, but I'll just start a Mongo database here, and then start up the Flask stuff and see. So it's not super great yet, but I've started to make it, and it's available to play with. It's a little slow at the moment, but you can do certain things: I can

have this combined graph over here with the different categories of things. I can also take a timeline of it; sorry, it's cut off. Oh, I guess it scrolls over, that's good. Okay, so I can hover over things and see there's a bad domain visited on this date, a bad domain on this date, good domains all this time, and neutral domains that aren't classified as anything. Then I can break it down to not-in-a-top-domain list, or a certain count of domains, etc., and I can look at a better kind of mapping thing, where,

if it was bad, you would see a little red dot. You know, maybe you expect that in your network people are only going to visit domains inside a certain geolocation, and then they're also visiting a lot of them in China for some reason, or IPs, or anything; something that might indicate some kind of strange behavior. Let's see what this is. So you can go through categorization, and this is kind of a mess, I need to fix it, and sorry, it's all squished together, but it shows you the majority category, the most of what's been visited, separated into those different groups,

and things like that. This is that terrible graphic I showed you before, but you can do it a different way, which is maybe cleaner, like this: you can drill down into the categories. That's all squished together too, so I'm going to fix that. And then you can also just search things, or look at, in this case, the Investigate status: things that are considered bad. Or you can look it up using various tools; you can change these URLs to go to other places. I have it looking up things inside Investigate; you can get information on the IP or the ASN; you can get a location from Google Maps if you

want the specific location of that thing, and its category. And I'm going to be adding the content category; this is a security category, but I need to add the actual kind of thing, like "this is a blog" or something. Let's see, there's some other stuff that's kind of out of order with the slides, but I'll show you this: using network traffic captures. Maybe you want to pull some stuff from malware-traffic-analysis.net and play with it. You can do that with the Python I've got out there: grab the HTTP traffic and generate statistics.
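Extracting the GET/POST request lines from reassembled HTTP text can be sketched with a regex; this assumes plain HTTP/1.x request lines, as in Wireshark's Follow TCP Stream output:

```python
import re

# One match per request line: ("GET", "/path") or ("POST", "/path").
REQUEST_RE = re.compile(r"^(GET|POST)\s+(\S+)\s+HTTP/1\.[01]", re.M)

def http_requests(stream_text):
    """Pull the GET/POST request lines out of reassembled HTTP stream
    text, the list the talk builds to watch for specific activity."""
    return REQUEST_RE.findall(stream_text)
```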

So maybe you have a packet capture; you look at it, looking at all the HTTP in Wireshark, and you can Follow a Stream. This is a kind of slow way to do things, the way that analysts often work; you can see someone posted an email username and password in clear text. But you could also just process through these pcaps and create some graphs, and create a list of all the GETs and the POSTs, and you can start watching for a certain specific kind of activity in that stuff; in this case the GETs or the POSTs, maybe I just want to highlight those. Then, for Active Directory

logs, I initially started by just doing successful versus failed logins. I started with this, but I actually have a better-looking demo, which I'll show you. I'll show you this first; it's kind of small text, but here's what the audit lines look like. I just took Active Directory logs from a source I got, and pulled out "account logged off", "logged on", "account failed to log on", things like that, and I processed it with this Python script. It doesn't give me any output directly, but when I look at it, I can do things like this: I can see a kind of pattern

throughout that time, which is useful, because maybe you get a lot of failed logins at a time you shouldn't get failed logins; you can customize it however you need, and you can get some idea of what's going on quickly. Additionally, I tried it with auth logs, so this is just session opened/closed, you know, logins to a server I have out there. Let's see.
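Tallying successful versus failed SSH logins from auth-log lines could look like this; the patterns match common OpenSSH messages and may need tweaking for your syslog format:

```python
import re
from collections import Counter

FAILED_RE = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
ACCEPTED_RE = re.compile(r"Accepted \w+ for (\S+) from (\S+)")

def login_summary(lines):
    """Tally failed logins by source IP and accepted logins by user,
    the successful-vs-failed split the talk graphs."""
    summary = {"failed": Counter(), "accepted": Counter()}
    for line in lines:
        m = FAILED_RE.search(line)
        if m:
            summary["failed"][m.group(2)] += 1
            continue
        m = ACCEPTED_RE.search(line)
        if m:
            summary["accepted"][m.group(1)] += 1
    return summary
```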

I do the same thing there and can see invalid SSH login attempts, which could be useful too. I'm doing this manually, but of course you could have it streaming. Then I'll show you one more. So, client connection analysis: I was thinking, if you have a bunch of data, you might want to separate the traffic from the clients into their own time series, which could be useful. So say I have a file here, like one I got from Infoblox, and it looks like this. I modified the data, and this sample is actually up on the GitHub so you can play with it if you want.
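The invalid-SSH-attempt counting could be sketched like this against a standard OpenSSH auth log. Again a sketch of the approach rather than his code; the regex assumes the usual "Failed password for [invalid user] NAME from IP" message that sshd writes to syslog.

```python
import re
from collections import Counter

# Standard OpenSSH failure line, with or without "invalid user".
SSH_FAIL = re.compile(
    r"sshd\[\d+\]: Failed password for (?:invalid user )?(\S+) from (\S+)"
)

def failed_ssh_by_source(lines):
    """Count failed sshd password attempts per (username, source IP)."""
    counts = Counter()
    for line in lines:
        m = SSH_FAIL.search(line)
        if m:
            counts[(m.group(1), m.group(2))] += 1
    return counts
```

Run over a tailed auth.log, the top entries of that counter are your brute-forcers, and the same loop works in a streaming setup instead of a manual one.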

I have four or five clients, which you can see, like here's a client right here, and I have the places they visited, you get the idea, and some other stuff, like time-series data. If I want to separate those, I can do it like this: it processes the data and gives me these JSON files, and when you open those up you can get a kind of display of what they look like. There are a lot of similarities in the traffic patterns, you can see that looking at this, but there are differences from one person to another.
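The per-client split that produces one JSON time series per client might be sketched like this. The row shape (client IP, timestamp, domain) and the file naming are assumptions for illustration, not his exact format.

```python
import json
import os
from collections import defaultdict

def split_by_client(rows):
    """Group (client_ip, timestamp, domain) rows into per-client time series."""
    series = defaultdict(list)
    for client, ts, domain in rows:
        series[client].append({"time": ts, "domain": domain})
    return dict(series)

def write_client_files(rows, outdir="."):
    """Write one <client>.json file per client, like the demo output."""
    for client, events in split_by_client(rows).items():
        with open(os.path.join(outdir, client + ".json"), "w") as fh:
            json.dump(events, fh, indent=2)
```

Each resulting file is one client's browsing timeline, which is what makes the side-by-side pattern comparison possible.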

So that's good; that can be pretty interesting to combine with a dashboard. I think I showed all that stuff, that's what I just did. I wrote a lot of code for this. It was a mess, it's still a mess, but I've cleaned it up quite a bit and I've commented it heavily. This is how you do it, right? Everyone brute-forces their way through it, and so do I, but it's better than that now. If you want to mess with it, if you want to play with it, it shouldn't be too hard. It's not a perfect solution, I mean, my stuff

is certainly not a perfect solution, but it's an attempt. I've been analyzing all sorts of things for so long, and I just wanted to stop, but I don't want to stop my job, I want to keep working, you know? So this is just part of the battle. We're all supposed to build this thing out: intrusion detection like IDS, and AV, and especially user awareness, not that that really helps all the time, but they still have their place. Analyzing behavior can decrease the amount of work you have to do, which is great, because the work keeps coming, so

there's always something to do. It provides more visibility, or at least gives you quick visibility into what looks like a jumbled mess, and hopefully it alerts, though that's also not always the case. There are companies out there doing this; it's a big thing. But if you don't want to be watched, just a little tip: use a VPN. People will see that you're using a VPN, which might be suspicious, so use your own DNS server and send it through that VPN, things like that. You can learn a lot about who you're watching. I think I took it out, but I had an interesting display of the interesting sites that may or may not

have been visited from my house. But yeah, so anyway, you can get all the code here: it's github.com/jpyorre (my last name), and it's the behavioral analysis repo at the bottom. Do you have any questions? You can also contact me through email, through whatever, and if you want help or if you want to program together, I don't know, I actually love looking at log files and looking at streams and stuff, so I love finding ways not to look.

[Audience question about Splunk] All right, I have done stuff with Splunk. I kind of said I was trying to do free things, but I've done lots of stuff with it. Splunk looks great, but I currently don't have access to it. So, I think it's a 500 meg per day limit for the free tier, and VPN traffic is probably going to blow that out. Oh yeah, and really quick, maybe you can pre-filter. Yeah, ELK is great too, I mean, if you know your logs well you can do some really good stuff with Kibana. [Question:] Have you done any work with NetFlow analysis or anything similar? Yeah, so I've used commercial products like QRadar for NetFlow analysis, and

you know, I need to do more with that, not with QRadar but with my own stuff. I've sorted some of it out, I barely touched on it here, but I know it's really important, and I need to work on that because it's awesome stuff. Okay, if there are no more questions, thank you very much. [Applause]