Google GSuite. GSuite (formerly Google Apps) is a set of cloud computing, productivity and collaboration tools, software and products developed by Google. It includes Google’s popular web applications, including Gmail, Drive, Hangouts, Calendar, and Docs. According to Google, more than 5 million organizations use the service worldwide, including 60% of Fortune 500 companies.


Google has an extensive framework of APIs for interacting with G Suite, but our focus here is on the Reports API. There's nothing technically preventing the script from pulling any sort of data from the API, but there likely won't be a parsing rule for the event. The script can be cloned/downloaded from: GitHub - andywalden/gsuite2mfe: Send events from G Suite to McAfee SIEM, or there is a zip archive of the latest release available here as well.


I write my scripts in Python3, which is not nearly as ubiquitous as Python2 and is not installed by default on Windows, so some assembly is required. In fact, on some Linux distributions, changes to the default Python2.7 installation can break the OS, so it's even more important that a separate environment is created for the script to run in. I go through the details of how to install Python3 and virtualenv for Linux and Windows in this post. This doc assumes that you have the packages installed on your platform of choice.



Setting up the script

1. If you're using Git to clone the repository, do this first as Git requires an empty directory.

    $ git clone https://github.com/andywalden/gsuite2mfe.git


2. Otherwise create a directory and unzip the repository zip file into it.


3. Use virtualenv to create a new environment for the script to run in. Specifying the Python version is important.

     $ virtualenv -p /usr/bin/python3.5 gsuite2mfe

For Windows, Python3.5 is assumed to be the only Python installed, so there's no need to specify the interpreter.

     C:\Users\User1\Nitro> virtualenv gsuite2mfe


Virtualenv works by setting some environment variables to provide a different set of paths and libraries for the script to access while it is running.
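The effect is easy to see for yourself. This sketch uses the stdlib venv module, which behaves the same way as virtualenv for this purpose; `demo_env` is just a throwaway name:

```shell
# Create a throwaway environment (--without-pip keeps it minimal).
python3 -m venv --without-pip demo_env

# Activation prepends the environment's bin/ directory to PATH,
# so "python" now resolves inside the environment.
. demo_env/bin/activate
echo "$VIRTUAL_ENV"    # path of the active environment
command -v python      # points at demo_env/bin/python
```

Deactivating (with the `deactivate` command) simply restores the original PATH.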


4. Once the script is in place and the virtual environment has been established, cd to the script directory and enable the virtual environment with:

  $ cd gsuite2mfe
  $ source bin/activate

Or on Windows:

  C:\Users\User1\Nitro> cd gsuite2mfe

  C:\Users\User1\Nitro\gsuite2mfe> Scripts\activate.bat


5. The prompt will change to indicate that the virtual environment is activated. Now the additional modules need to be installed into the environment. This is easily done in one step using pip. The command is the same on either OS:

  (gsuite2mfe) C:\Users\User1\Nitro\gsuite2mfe> pip install -r requirements.txt


6. The next step is to edit the config.ini file in the script directory. The primary setting is the IP address that events are to be forwarded to, i.e. your Receiver IP. There is also a setting that controls which 'activities' the script queries events for. By default, only login is enabled. To enable other activities, add the name of each activity to the line, comma separated and without any spaces, and make sure each activity is only listed once. Save the file.
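As a sketch, the result is a config.ini along these lines. The key names shown here are illustrative assumptions; keep the key names already present in the file shipped with the repository:

```ini
[DEFAULT]
; IP address of the Receiver the events are forwarded to
sysloghost=10.10.1.26
syslogport=514
; Activities to query: comma separated, no spaces, each listed once
activities=login,admin,token
```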


That should provide you with a functioning environment; now authentication needs to be put in place.



Setup Google API OAuth Credentials

  1. Go to the Google API Manager.
  2. Create a new project called 'gsuite2mfe'. Creation might take a few minutes, after which the page will refresh to the API Library.
  3. Enter 'Admin SDK' into the search box and select the link.
  4. Click the Enable button at the top of the screen.
  5. Click Credentials on the left menu bar.
  6. Click Create Credentials then OAuth client ID.
  7. Click the Configure consent screen button.
  8. Enter 'gsuite2mfe' as the Product name shown to users. Click Save.
  9. Select Other as the Application type and enter 'gsuite2mfe' as the Name. Click Create.
  10. You will be shown the client ID and client secret. Click OK.
  11. The credentials will be listed under OAuth 2.0 client IDs. Click the Download button at the far right to download the json file. Save it as client_secret.json and put it into the script directory.
  12. Run the command: python gsuite2mfe.py --noauth_local_webserver
  13. Paste the link into your browser, click Allow and copy the supplied code.
  14. Paste the code back into the terminal with the script.
  15. The script will return the last 10 logins to show that it is working.


That takes care of the last of the setup tasks, now the script can be used to poll events continuously, ad hoc or both.



Continuous Polling

Continuous polling of events is the primary function of the script. It is designed to be called at intervals to query the GSuite API and forward new events. Separate bookmark files are maintained for each of the activities configured in config.ini. The first time the script is run, it authenticates to the API, validates the configured activities and creates the bookmark files using the current time. Any newer events will be forwarded to the configured host. On Linux, cron can provide interval-based execution of the script, and Task Scheduler can be used on Windows.
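To illustrate the bookmark mechanism, here is a simplified sketch. This is not the script's actual code; the file naming and function names are hypothetical:

```python
# Simplified sketch of per-activity bookmark files: each file stores the
# timestamp of the newest event already forwarded, so each run only
# queries events newer than the bookmark.
from datetime import datetime, timezone
from pathlib import Path

def read_bookmark(activity, directory="."):
    """Return the stored timestamp, or create one at 'now' on first run."""
    path = Path(directory) / ".{}_bookmark".format(activity)
    if path.exists():
        return path.read_text().strip()
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    path.write_text(now)
    return now

def update_bookmark(activity, timestamp, directory="."):
    """Record the newest event time so the next run resumes after it."""
    (Path(directory) / ".{}_bookmark".format(activity)).write_text(timestamp)
```

Because the bookmark starts at the current time, the first run forwards nothing historical; each subsequent run picks up where the last one left off.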


Since the script runs in a virtual environment, a shell (.sh) or batch (.bat) file is what is actually called by cron or Task Scheduler, respectively. Examples of these files are in the repository. The path to the script will need to be updated before use.
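For reference, a minimal sketch of what the shell wrapper might contain (the repository's own example may differ, and the path here is a placeholder to be replaced with your install location):

```shell
#!/bin/sh
# Enter the script directory, activate its virtualenv, run the script.
cd /home/user/gsuite2mfe || exit 1
. bin/activate
python gsuite2mfe.py
```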



Linux - Crontab Configuration

To configure cron, use the command: crontab -e


You may be prompted to choose an editor. Usually the default is preferred.


Scroll to the bottom of the file and insert a line similar to the example below. The path will need to be updated. You will also need to determine how often you would like the GSuite API to be queried. For what it's worth, I've seen events show up after delays ranging from 3 to 20 minutes after occurrence. The interval could be as low as every minute, but every 5 minutes is probably just as effective while generating one fifth of the traffic. The line to enter is:


*/5 * * * * /home/user/gsuite2mfe/gsuite2mfe.sh

Save the file (Ctrl-X then Y in nano, or ESC :x in vi).


Note the .sh extension. To run every 5 minutes, the first field should be */5 (a bare 5 would run the script only at minute 5 of each hour); the remaining fields represent hours, days of the month, months and days of the week. Most folks would query every few minutes, but there's no reason it couldn't be less frequent in low-activity scenarios. The goal is to avoid retrieving so many events at a single time that aggregation would be triggered unnecessarily.


The script does not produce any console output by default, so cron won't email you results every time it runs. If you do want the results emailed, make sure an MTA (e.g. postfix) is installed and use the -l debug option to generate console output.
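If you would rather capture any output in a log file than have cron email it, redirection in the crontab entry works too. A sketch, assuming the wrapper is named gsuite2mfe.sh and lives in the paths shown:

```
*/5 * * * * /home/user/gsuite2mfe/gsuite2mfe.sh >> /home/user/gsuite2mfe/cron.log 2>&1
```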



Windows Task Scheduler Configuration

I created and exported a working Windows Task Scheduler task and included it in the repository. You can skip this part of the process by editing the file in a text editor to customize the user and path, then importing it into Task Scheduler. Search for CONFIG in the file to find the three lines to change. Otherwise, via the GUI:


1. Go to Start | Run  and type Task Scheduler.

2. Go to Action | Create Task.

3. Set the Name and Run whether user is logged on or not.



4. Configure the Trigger similar to the screenshot below. Adjust the repetition as needed.



5. Specify the path to the .bat file that calls the script.


After the task is created, you can highlight it and click Enable All Tasks History on the right so you can monitor the execution of the script.



Parsing the Events

Included in the repository is a rules.xml-type file containing the parsing rules that need to be uploaded to the ESM. The rules cover most of the admin API and some other combinations of events I was able to generate, but there will still be some unparsed events. Also, some of the events have quite a few fields, but only the core fields have been mapped. I will be improving these over time, and it's helpful if you send or post any unparsed or poorly parsed events to me to include or fix. It's always possible to edit the rule in the Policy Manager to drag and drop the parsed data into the desired fields. I wrote a separate post that describes the procedure to upload and enable the rules; please refer there for the detailed instructions. If all of your events are Unknown after the script is in place, you might have skipped this step.


Script Options

Now for a closer look at the script options. The -h option shows the details:


     (gsuite2mfe) aleister@tool:~/gsuite2mfe$ python gsuite2mfe.py -h


     usage: gsuite2mfe.py [-h] [-s] [-e] [-t] [-w] [-f] [-v] [-l] [-c]


     Send Google GSuite events to McAfee ESM


     optional arguments:

       -h, --help      show this help message and exit

       -s , --start    Set start time to retrieve events. Format: 2016-11-19T14:53:38.000Z


       -e , --end      Set end time to retrieve events. Format: 2016-11-19T14:53:38.000Z


       -t, --test      Disable syslog forwarding. Combine with -l debug for console output.


       -w, --write     Write events to log: gsuite2mfe_events.log. Use -f to change the path/filename.


       -f , --file     Specify alternate path/filename for -w option.

       -v, --version   Show version

       -l , --level    Logging output level. Default: warning

       -c , --config   Path to config file. Default: config.ini


If you run the script without any arguments, it will query the GSuite API for the activities specified in config.ini and the events will be forwarded to the configured host.


-t | --test

If the script is run without a config.ini, the -t | --test mode is automatically enabled. Enabling testmode prevents any events queried from being sent to syslog. Even without the config.ini, testmode will enable querying for 'login' activity logs. This is useful for testing or querying for historical events to avoid having duplicate events sent to the SIEM.


-s | --start

Events will be queried starting from the specified start time. The format must be RFC 3339: 2016-11-19T14:53:38.000Z, and is always in GMT. If the end time (-e) is not specified, the current time will be used. This option can be safely used while the script is simultaneously called at intervals in the background, which allows historical queries to be executed without impacting normal event collection. This should be combined with the -t flag to prevent queried events from being sent to the configured host.


-e | --end

Events will be queried up until the specified end time. The format must be RFC 3339: 2016-11-19T14:53:38.000Z, and is always in GMT. If the start time (-s) is not specified, the API's default time frame, the past 180 days, will be used. This option can be safely used while the script is simultaneously called at intervals in the background, which allows historical queries to be executed without impacting normal event collection. This should be combined with the -t flag to prevent queried events from being sent to the configured host.
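Hand-writing these timestamps is error-prone, so here is a small helper, a sketch that is not part of the script, that generates them, e.g. for a one-day historical window:

```python
# Build the RFC 3339 GMT timestamps that -s/--start and -e/--end expect.
from datetime import datetime, timedelta, timezone

def rfc3339(dt):
    """Format an aware datetime as e.g. 2016-11-19T14:53:38.000Z (GMT)."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")

end = datetime(2016, 11, 30, tzinfo=timezone.utc)
start = end - timedelta(days=1)
print(rfc3339(start))  # 2016-11-29T00:00:00.000Z
print(rfc3339(end))    # 2016-11-30T00:00:00.000Z
```

Because the API expects GMT, the helper converts any aware datetime to UTC before formatting, which avoids off-by-timezone windows.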


-w | --write

Events queried will be written to a file. By default the filename is gsuite2mfe_events.json. The path and filename can be changed with the -f | --file option. This can be used with the -t option to only write queried events to the file or without the option to have the events written to the file and sent to the configured syslog host.


-f | --file

This will change the path/filename for the events file. Must be used with the -w | --write option.


-l | --level debug

Enables debug logging to the console. Console output is otherwise disabled during script operation.


-c | --config

Changes the filename and/or path of the config.ini.


-v | --version

Prints the version.



Normal operation is for the script wrapper (.bat/.sh) to be called via a task scheduler/cron-type tool without any arguments. If there is a fatal error, the script will exit with a non-zero status.


Basic troubleshooting uses debug mode to enable console output. This can be combined with any of the other flags.

$ python3 gsuite2mfe.py -l debug


Retrieve all of the events starting 11/29/2016; don't send them to syslog but write them to file: gsuite2mfe_events.json

$ python3 gsuite2mfe.py -t -s 2016-11-29T00:00:00.000Z -w


Retrieve all of the events for a 24-hour window; don't send them to syslog but write them to file: gsuite2mfe_events.json

$ python3 gsuite2mfe.py -t -s 2016-11-29T00:00:00.000Z -e 2016-11-30T00:00:00.000Z -w


Populate SIEM with historical events from provided date

  $ python3 gsuite2mfe.py -s 2016-11-29T00:00:00.000Z