
Monday, December 20, 2021

A bit more Java here and there

 Some days, the cost of code is crazy.  Not everyone has the ability to build and code solutions to problems.  But when you see how little code it can take to solve a problem, you realize just how expensive code can be.  


With a bit of messing around, I figured out how to write an event messenger in Python to handle a built-in system integration.  It took me about 300 lines of code.  


Funny thing: a few weeks after I figured it out, a vendor offered me their solution.  For $15 per month, per site.  I kindly told them no.  At 20 sites, that's $300 per month, or about $1 per month, per line of code.  Which is pretty nuts to say the least.

So I moved on, and decided I wanted something a bit more permanent.  My code runs as a nohup process on an old Ubuntu box that is acting as a syslog receiver.  Which made me think: why can't I write my own syslog receiver?  


Nothing really prevented me from doing it, but I didn't have a decent IDE to do what I wanted.  I've used syslog collectors from SolarWinds and Graylog.  Both were functional for what they do, but they don't do what I want them to do.  Alerting is part of it.  Data collection is another part.  But there's so much more that I think should be and could be there.  Somehow that blend of SIEM and syslog collector always appeals to me.  That gathering of "EVEN MORE DATA".  And I do understand that more data doesn't necessarily mean better decisions.  It just means more data to sort through, and a need for more algorithms to handle that data, so the alerts generated from it can be reduced from thousands down to a handful.  


But like I said, no IDE.  Microsoft makes a good one, but it's not free.  I like Visual Studio.  I just wish they gave it away for free until you get to a certain level.  Like the SQL Express model.  So I quit using their stuff.  And I moved to Eclipse.  


Eclipse is pretty decent.  The autocomplete makes me mad here and there.  But it's functional, and it lets me build complex, multi-hundred-line applications.


Like my syslog collector.   That I finally wrote in Java.  
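
Stripped down, the core of a syslog collector is just a UDP receive loop.  Here's a rough sketch of that loop in Python (the actual collector is Java; the port and output file below are just placeholders):

# bare-bones UDP syslog receiver sketch -- not the actual collector
import socket

SYSLOG_PORT = 514           # standard syslog UDP port (binding to it usually needs root)
LOG_FILE = "collected.log"  # placeholder output file

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", SYSLOG_PORT))

with open(LOG_FILE, "a") as out:
    while True:
        data, addr = sock.recvfrom(8192)
        # one datagram per syslog message; tag it with the sender's IP
        out.write(addr[0] + " " + data.decode("utf-8", errors="replace") + "\n")
        out.flush()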

And collecting syslogs is about all it does right now.  But I've got plans for that sucker.  Eventually I'm going to take those 300 lines of Python code and turn them into Java.  And I'm going to attach a SQL Express database to it, along with an IIS front end for a web application view of the mess I'm creating in the background.  Why?  Because I like web front ends.  

And because I want to learn more about SQL Express and IIS.  And that's generally a good enough reason for most of these projects.  Or at least I think so.


Thursday, December 5, 2019

PRTG: Making Custom Sensors to monitor strange things with Python


#version 1.0
#last modified 12/2/19
#
#v 1.0   initial revision / prtg integration
#
import sys
import json
import urllib.request

from paepy.ChannelDefinition import CustomSensorResult


if __name__ == "__main__":

    # PRTG hands the sensor parameters to the script as a JSON string in argv[1]
    location = json.loads(sys.argv[1])

    parsed = "http://" + str(location['host']) + "whatever else in the url"

    page = urllib.request.urlopen(parsed).read()

    # data comes out as binary type.
    # convert from binary to normal string
    np = page.decode('utf-8')

    # split the page into lines; how you parse from here depends on the page
    scrip = np.split('\n')

    # status values the channels will report; set these while parsing the page
    status_value1 = 1
    status_value2 = 1

    for line in scrip:
        # parse each line and update status_value1 / status_value2 here
        pass

    result = CustomSensorResult("OK")

    result.add_channel(channel_name="channel1", unit="Custom", value=status_value1, is_float=True, primary_channel=True, warning=0, is_limit_mode=True, limit_min_error=0.5, limit_max_error=1.5, limit_error_msg="channel1 failed")
    result.add_channel(channel_name="channel2", unit="Custom", value=status_value2, is_float=True, is_limit_mode=True, warning=0, limit_min_error=0.5, limit_max_error=1.5, limit_error_msg="channel2 failed")

    print(result.get_json_result())


Base code for a Python script sensor for PRTG.   

Using this method, you write a script to scrape a web page, and then present the information to PRTG as JSON output.

The limits are used to define up/down/warning status.

In this case, I have binary 1/0 outputs for working/not working.

So how does that work?   

Notice this bit.

result.add_channel(channel_name="channel1", unit="Custom", value=status_value1, is_float=True, primary_channel=True, warning=0, is_limit_mode=True, limit_min_error=0.5, limit_max_error=1.5, limit_error_msg="channel1 failed")

And breaking that apart, the interesting section is here:
is_limit_mode=True, limit_min_error=0.5, limit_max_error=1.5, limit_error_msg="channel1 failed"

Let's break these down.

Note: these names aren't the same as the ones expected or presented for EXE/Advanced sensors on the PRTG custom sensor page.

is_limit_mode = Tells PRTG that the output has an acceptable range of values.   
limit_min_error = Sets the lower limit that defines an error.  Depending on your output, there may not be one.  I'm using binary 0/1 outputs in this case, so I set it to 0.5.  Therefore, an output of 0 is defined as an error state.
limit_max_error = Sets the upper limit that defines an error.  Depending on your output, there may not be one.  In my case, there is never an upper error, so I set it to 1.5.   
limit_error_msg = The message you want PRTG to show for any device that isn't working.
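
In other words, with binary 0/1 outputs and those limits, the check PRTG effectively applies works out like this (my paraphrase, not PRTG's internals):

# paraphrase of the limit check, not PRTG internals
limit_min_error = 0.5
limit_max_error = 1.5

for value in (1, 0):
    in_error = value < limit_min_error or value > limit_max_error
    print(value, "error" if in_error else "ok")
# prints: 1 ok, then 0 error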


So, with these settings in place, PRTG will report an individual sensor as down based on the values you assign to status_value1 and status_value2.  From there, you can alert on them using normal PRTG alerting.


What this base script doesn't currently do:
  1. Parse anything.  Parsing the web page is based entirely on what you are looking for.  The page I was looking at was all table based, so splitting the data into tables made sense.  You will have to handle that portion.
  2. Deal gracefully with urllib.request.urlopen() errors.  You will get a JSON error in PRTG when you try to pull a web page you can't reach.  That's a simple try/except statement (see the sketch below).  Use this call in your except block to report the failure gracefully:

            result.add_error("Your Error Message Here")
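
As a minimal sketch, reusing the names from the base script above (the exception handling itself is my addition, not part of the original script):

try:
    page = urllib.request.urlopen(parsed).read()
except Exception as e:
    # report the failure to PRTG instead of letting the sensor die with a JSON error
    result = CustomSensorResult("Error")
    result.add_error("Could not retrieve " + parsed + ": " + str(e))
    print(result.get_json_result())
    sys.exit()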



Second important thing...   probably the most important.  

See this block? 

location = json.loads(sys.argv[1])

parsed = "http://" + str(location['host']) + "whatever else in the url"

This block accepts JSON data as input to the script.  
The second part, location['host'], pulls the IP address set up on the sensor and feeds it into the script.  So this script can be written once and run against multiple devices.  That's what makes it extendable.
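
That also makes it easy to test outside of PRTG: hand the script the same kind of JSON blob yourself.  A rough sketch (the script name and IP address here are made up):

import json
import subprocess
import sys

# pretend to be PRTG: pass a JSON parameter blob as the first argument
# (script name and host IP are hypothetical)
params = json.dumps({"host": "10.0.0.50"})
subprocess.run([sys.executable, "my_sensor.py", params])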

Now...   

So, you've got the initial script working.

How in the world do I troubleshoot this thing when I suddenly get a bunch of JSON errors when I deploy it?   

That's the subject of another discussion.

Friday, May 25, 2018

1,000 lines of Python


Did I ever think I'd intentionally write 1,000 lines of Python code?   Not really.   But I'm getting up there.

Python is pretty good for parsing through XML files and gathering the data.  From there, it can be used to compare that data to expected results.  Auditing.  
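
As a rough illustration of the idea (not the actual audit code; the file name, setting names, and expected values below are all made up):

# sketch: pull settings out of a config backup and compare them to expected values
import xml.etree.ElementTree as ET

EXPECTED = {"carWashEnabled": "false", "receiptHeaderLine1": "My Store"}

tree = ET.parse("site_backup.xml")
root = tree.getroot()

for setting, expected in EXPECTED.items():
    node = root.find(".//" + setting)
    actual = node.text if node is not None else None
    if actual != expected:
        print("AUDIT FAIL: {} is {!r}, expected {!r}".format(setting, actual, expected))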

When I first thought of the idea of auditing Verifone Commander configurations, I never contemplated what it would take in time, code, and labor.   It's been a lot of all three.  But now I'm almost up to 1,000 lines of code to audit a Verifone Commander system.  

I wish Verifone would make their equipment scale better.  Enterprise level management would be awesome.  Then I wouldn't have to cobble tools together using Python, Powershell, and AutoIt.  

So how does all this work?   AutoIt is used to automatically back up every single site.   Once the backups are complete, the audit script runs over the files and compares the settings to what they should be.   


So at the point of originally writing this, the code was just barely reaching 1,000 lines.   It has since been broken into numerous modules and is closer to 4,000 lines.  And I've still got about a dozen files to go.

Maybe I need to spend more time researching better Python coding.  Or a way to organize libraries better.  But going through one file that's more than 1,000 lines of code is a pain.  So it's easier to break things into separate modules. 


I guess the other part of this....  is it worth spending probably 40 hours writing an estimated 6,000 lines of code to audit a system? 

Yes, yes it is.


Sunday, September 18, 2016

The Boring Details

I've been spending a lot of time contemplating automation recently.  Automating things is rather great.  But I think there is an unwritten side to automation.  I'm going to write it down.

In order to automate anything, you must first document the entire process.

After reading that sentence, you are probably thinking of a lot of sarcastic comments.  I'd like to agree with you, but the stupidly simple stuff is what most people miss in the first place.  How often have business classes shown case study after case study of ridiculous levels of bureaucracy that could be removed, and processes that could be streamlined, just by knowing the process? 

But then that involves a lot of boring drudgery.  That's the part that no one does.  It's a simple thing, but doing that simple thing is all that really needs to be done.   By the end of the documentation process, you've got an in-depth understanding of the events that take place.  Along the way you start asking why certain things are done at all, and you realize just how much time you can save by automating.

I looked at the same idea when I was fighting the Windows Automated Installation Kit.  It sounded like a great idea, but I could never get the network drivers to work on my builds.  So I basically burned through a lot of crap and none of it worked. 

So after that, I went back to partly automated, partly manual.  If part of the process is copying files and creating directories, why not automate that?  A batch file is perfectly acceptable for that, and it makes the step automatic and the same everywhere. 

I want to do the same thing with network discovery, but Python is giving me hell.  Something I haven't pinned down yet is causing me problems.  I can't get the data file to be created.  

Anyway, I guess this is the call to do boring but important things.  Documentation is boring, but it solves a world of problems.  It also gives you the ability to solve all sorts of problems in the future.  And it gives you the best ability of all: delegation.  If something is well documented, you can hand the task to someone else.