Create a React app on Cloudflare Pages in 10 minutes

Create the React app: npx create-react-app imamba

cd imamba

npm start

You can now view the app in your browser at the URLs/IPs shown in the terminal output, on port 3000

Edit ./imamba/src/App.js

You should see the app update in your browser as you make changes

Build the app:

npm run build

Upload the build folder to Cloudflare Pages:

In the Cloudflare dashboard, click on Pages, choose a name for your project, and upload the build folder.

Done, the site is live at https://imamba.pages.dev . Thank you Cloudflare! The next step is to integrate with Cloudflare Functions.

Working with CSVs in SQLite

I needed to manipulate a largish CSV, but Excel’s performance was slowing me down. SQLite is a powerful and portable tool that saved the day and made my life a lot easier.

Download SQLite here, then run sqlite3 from a terminal / command line:
sqlite3 mytest.db

Import the CSV file:
sqlite> .mode csv
sqlite> .import users-sql.csv users

Check it has been imported OK:
sqlite> .schema
CREATE TABLE IF NOT EXISTS "users"(
  "samaccountname" TEXT,
  "DistiguishedName" TEXT,
  "whenCreated" TEXT,
  "lastLogonDate" TEXT,
  "pwdLastSet" TEXT,
  "accountExpires" TEXT,
  "userAccountControl" TEXT,
  "Lookup " TEXT,
  "Enabled" TEXT
);

Example query:
sqlite> select samaccountname from users;

Now you can go wild and do left joins against other data, as well as fast SQL searching and reporting.

To create a new table:
CREATE TABLE filtered_users AS
SELECT * FROM users
WHERE DistiguishedName NOT LIKE '%OU=Disabled%'
AND DistiguishedName NOT LIKE '%OU=Groups%';

This will create a new table called filtered_users from the users table.


To export your SQL query results to a CSV file:
sqlite> .headers on
sqlite> .mode csv
sqlite> .output export_data.csv
sqlite> SELECT *
   ...>   FROM filtered_users;
sqlite> .quit
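The same import / filter / export workflow can be sketched with Python's built-in sqlite3 module. The file name and column names follow the steps above; the two sample rows written at the top are made-up demo data so the sketch is self-contained.

```python
import csv
import sqlite3

# Made-up demo data standing in for the real users-sql.csv export.
with open("users-sql.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["samaccountname", "DistiguishedName"])
    w.writerow(["alice", "CN=alice,OU=Staff,DC=example,DC=com"])
    w.writerow(["bob", "CN=bob,OU=Disabled,DC=example,DC=com"])

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Import the CSV (the equivalent of .mode csv / .import)
with open("users-sql.csv", newline="") as f:
    rows = list(csv.reader(f))
header, data = rows[0], rows[1:]
cur.execute("CREATE TABLE users (%s)" % ", ".join('"%s" TEXT' % c for c in header))
cur.executemany("INSERT INTO users VALUES (%s)" % ", ".join("?" * len(header)), data)

# Filter into a new table, as with filtered_users above
cur.execute("""CREATE TABLE filtered_users AS
               SELECT * FROM users
               WHERE DistiguishedName NOT LIKE '%OU=Disabled%'
                 AND DistiguishedName NOT LIKE '%OU=Groups%'""")

# Export the result with headers (the equivalent of .headers on / .output)
cur.execute("SELECT * FROM filtered_users")
with open("export_data.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow([d[0] for d in cur.description])
    w.writerows(cur.fetchall())
```
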

Send a message to a Telegram group with Node.js

requirements:

telegram_msg.js script

Browse to https://web.telegram.org and start a chat with BotFather. Type / and create a new bot, then take note of the bot's API token.

Add the API token to line 8 of the telegram_msg.js script

Add your new bot to the group you want to message

Send a message to the group in the web interface.

Now we need to find the group's chat ID.

Browse to: https://api.telegram.org/botXXX:YYYY/getUpdates

Replace XXX:YYYY with the API token you received above.

You should see the chat ID in the JSON output:

now you can run

node telegram_msg.js <chatid> "testing"

and the message "testing" will be sent to the group
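For reference, here is a rough sketch of what the script does, written against the Bot API's sendMessage method using only the Python standard library. The token and chat ID below are placeholders to substitute with the values you collected above.

```python
import json
import sys
import urllib.parse
import urllib.request

# Placeholder: substitute the token you got from BotFather.
BOT_TOKEN = "XXX:YYYY"

def build_request(chat_id, text):
    # sendMessage takes chat_id and text as POST parameters.
    url = "https://api.telegram.org/bot%s/sendMessage" % BOT_TOKEN
    data = urllib.parse.urlencode({"chat_id": chat_id, "text": text}).encode()
    return urllib.request.Request(url, data=data)

def send(chat_id, text):
    # Perform the HTTP call and return the Bot API's JSON response.
    with urllib.request.urlopen(build_request(chat_id, text)) as resp:
        return json.load(resp)

if __name__ == "__main__" and len(sys.argv) == 3:
    send(sys.argv[1], sys.argv[2])  # e.g. python telegram_msg.py <chatid> testing
```
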

Nagios – Sending SMS notifications through Twilio

I really like Node.js, so I am going to use it to send alerts from Nagios to users' mobile phones via SMS. I normally use Telegram for this, but there was a requirement for SMS.

Get the twilio_sms.js script here:

https://raw.githubusercontent.com/zs1rcm/twilio_sms/main/twilio_sms.js

This will allow you to send an SMS by typing:

node twilio_sms.js <number> <text>
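If you prefer to avoid the Node dependency, the same idea can be sketched in Python against Twilio's REST API (the 2010-04-01 Messages endpoint, which takes To, From, and Body parameters with HTTP Basic auth). ACCOUNT_SID, AUTH_TOKEN, and FROM_NUMBER are placeholders for your own Twilio credentials.

```python
import base64
import sys
import urllib.parse
import urllib.request

# Placeholders: fill in your own Twilio account values.
ACCOUNT_SID = "ACxxxxxxxx"
AUTH_TOKEN = "your_auth_token"
FROM_NUMBER = "+15005550006"

def build_request(to_number, body):
    # Twilio's Messages endpoint: POST To/From/Body, authenticated with
    # Basic auth using the account SID and auth token.
    url = ("https://api.twilio.com/2010-04-01/Accounts/%s/Messages.json"
           % ACCOUNT_SID)
    data = urllib.parse.urlencode(
        {"To": to_number, "From": FROM_NUMBER, "Body": body}).encode()
    auth = base64.b64encode(("%s:%s" % (ACCOUNT_SID, AUTH_TOKEN)).encode())
    req = urllib.request.Request(url, data=data)
    req.add_header("Authorization", "Basic " + auth.decode())
    return req

def send_sms(to_number, body):
    with urllib.request.urlopen(build_request(to_number, body)) as resp:
        return resp.read()

if __name__ == "__main__" and len(sys.argv) == 3:
    send_sms(sys.argv[1], sys.argv[2])  # python twilio_sms.py <number> <text>
```
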

In Nagios, edit /etc/nagios/misc_commands.cfg and add the following two command definitions:

define command {
    command_name notify-by-twilio-sms
    command_line /etc/nagios/twilio/twilio_sms.js $CONTACTPAGER$ "[Nagios] $NOTIFICATIONTYPE$ $HOSTALIAS$/$SERVICEDESC$ is $SERVICESTATE$"
}

define command {
    command_name host-notify-by-twilio-sms
    command_line /etc/nagios/twilio/twilio_sms.js $CONTACTPAGER$ "[Nagios] $HOSTSTATE$ alert for $HOSTNAME$"
}

Next, edit /etc/nagios/contacts.cfg:

define contact {
    contact_name                  rm-sms2
    alias                         Rich Mobile
    service_notification_period   24x7
    host_notification_period      24x7
    service_notification_options  c,r
    host_notification_options     d,r
    service_notification_commands notify-by-twilio-sms
    host_notification_commands    host-notify-by-twilio-sms
    pager                         +xxxxxxxxxx  ; put the user's mobile number here
}

Quick and Easy Kubernetes Cluster setup


These are my k3s notes. This post needs more work, but I'm noting down what I have done so far.

What is k3s?

K3s is a lightweight Kubernetes distribution created by Rancher Labs, and it is fully certified by the Cloud Native Computing Foundation (CNCF). K3s is highly available and production-ready. It has a very small binary size and very low resource requirements.

Requirements

  • I tested this on Oracle Linux 7.9 as well as k3os, but it should run on most distributions
  • A couple of VMs

Open up the Firewall for k3s

Firewall Rules
firewall-cmd --permanent --add-port=22/tcp
firewall-cmd --permanent --add-port=80/tcp
firewall-cmd --permanent --add-port=443/tcp
firewall-cmd --permanent --add-port=2376/tcp
firewall-cmd --permanent --add-port=2379/tcp
firewall-cmd --permanent --add-port=2380/tcp
firewall-cmd --permanent --add-port=6443/tcp
firewall-cmd --permanent --add-port=8472/udp
firewall-cmd --permanent --add-port=9099/tcp
firewall-cmd --permanent --add-port=10250/tcp
firewall-cmd --permanent --add-port=10254/tcp
firewall-cmd --permanent --add-port=30000-32767/tcp
firewall-cmd --permanent --add-port=30000-32767/udp

firewall-cmd --reload

export INSTALL_K3S_SKIP_SELINUX_RPM=true
export INSTALL_K3S_SELINUX_WARN=true

Create the master node

curl -sfL https://get.k3s.io | sh -

Get the node token of the master node

cat /var/lib/rancher/k3s/server/node-token

Join the Worker to the master node

curl -sfL https://get.k3s.io | K3S_URL=https://ipofmasternode:6443 K3S_TOKEN=<insert token here> sh -

Run kubectl get nodes; if both nodes report Ready, your cluster is up.

Install Portainer

Portainer seems to be quite a good way to manage this infrastructure, as well as to get to grips with the backend.

To install it run:

kubectl apply -n portainer -f https://raw.githubusercontent.com/portainer/k8s/master/deploy/manifests/portainer/portainer.yaml

It should now be available on the cluster IP on the NodePort assigned by the manifest (NodePorts fall in the 30000-32767 range; check the exact port with kubectl get svc -n portainer).

Auditing Active Directory Passwords

What you will need:

  1. Admin access to your Active Directory
  2. A Linux server with secretsdump from Impacket and hashcat; in this example I used a Kali VM
  3. A password list; on Kali there should be some in /usr/share/wordlists/ . I used rockyou.txt

Step 1: Dump NTDS Database

On a domain controller run the following:

powershell "ntdsutil.exe 'ac i ntds' 'ifm' 'create full c:\temp' q q"

This command will generate two folders in c:\temp : Active Directory and Registry.

output of ntdsutil command

Step 2: Extract the hashes from the ntds.dit file

Copy/scp the files over to a Linux machine and run them through secretsdump to extract the hashes.

On Kali this is what we ran:

impacket-secretsdump -ntds /root/ntds_cracking/ActiveDirectory/ntds.dit -system /root/ntds_cracking/registry/system LOCAL -outputfile ntdshashes.txt

Step 3: Clean up the hashes. secretsdump writes lines in user:rid:lmhash:nthash::: format, so cutting out field 4 leaves a file of unique NT hashes for cracking:

cat ntdshashes.txt | cut -d : -f 4 |sort|uniq > cleanhashes.txt
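What that pipeline does can be sanity-checked with a short Python sketch. The sample lines below are made up (reusing a hash value from the potfile later in this post); splitting on ':' and taking the fourth field is exactly what cut -d : -f 4 does.

```python
def nt_hashes(lines):
    """Extract the unique NT hashes (field 4) from secretsdump output."""
    hashes = set()
    for line in lines:
        parts = line.strip().split(":")
        if len(parts) >= 4:
            hashes.add(parts[3])
    return sorted(hashes)  # set + sorted matches the sort | uniq step

# Made-up sample lines in secretsdump's user:rid:lmhash:nthash::: format.
sample = [
    "EXAMPLE\\alice:1104:aad3b435b51404eeaad3b435b51404ee:140e2a025b0a93dc13720d19e935a918:::",
    "EXAMPLE\\bob:1105:aad3b435b51404eeaad3b435b51404ee:140e2a025b0a93dc13720d19e935a918:::",
]
print(nt_hashes(sample))  # duplicate hashes collapse to one entry
```
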

Step 4: Run hashcat against hashes

hashcat -m 1000 cleanhashes.txt /home/zs1/rockyou.txt

View the cracked hashes

cat /root/.hashcat/hashcat.potfile

140e2a025b0a93dc13720d19e935a918:Password3!
7a829d816a477655abe98a8c7de84c99:Password2@
07d128430a6338f8d537f6b3ae1dc136:Password2!
43460d636f269c709b20049cee36ae7a:Password1@

Active Directory User Report

I needed a quick way to audit Active Directory accounts. The PowerShell script below dumps the following Active Directory attributes:

samaccountname	 
DistiguishedName	 
whenCreated 
lastLogonDate 
pwdLastSet 
accountExpires 
userAccountControl
Enabled

For the audit, the userAccountControl attribute is very useful: it contains a code that maps back to the user's account status and password-change requirements. These are the codes we are interested in:

    512 = "NORMAL_ACCOUNT"
    514 = "ACCOUNT_DISABLED_NORMAL_ACCOUNT"
    544 = "NORMAL_ACCOUNT_PASSWORD_NOT_REQUIRED"
    546 = "ACCOUNT_DISABLED_NORMAL_ACCOUNT_PASSWORD_NOT_REQUIRED"
    66048 = "NORMAL_ACCOUNT_DONT_EXPIRE_PASSWORD"
    66050 = "ACCOUNT_DISABLED_NORMAL_ACCOUNT_DONT_EXPIRE_PASSWORD"
    66080 = "PASSWORD_NOT_REQUIRED_NORMAL_ACCOUNT_DONT_EXPIRE_PASSWORD"
    590336 = "NORMAL_ACCOUNT_DONT_EXPIRE_PASSWORD_TRUSTED_FOR_DELEGATION"

Here is some more information on the codes from Microsoft:
https://learn.microsoft.com/en-us/troubleshoot/windows-server/identity/useraccountcontrol-manipulate-account-properties
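Since each code is just a sum of individual userAccountControl flag bits, the codes can also be decoded generically instead of looked up one by one. A short Python sketch using the flag values from Microsoft's documentation:

```python
# userAccountControl flag bits (values per Microsoft's documentation).
UAC_FLAGS = {
    0x0002: "ACCOUNTDISABLE",
    0x0020: "PASSWD_NOTREQD",
    0x0200: "NORMAL_ACCOUNT",
    0x10000: "DONT_EXPIRE_PASSWORD",
    0x80000: "TRUSTED_FOR_DELEGATION",
}

def decode_uac(value):
    # Test each flag bit against the value and collect the set flags.
    return [name for bit, name in sorted(UAC_FLAGS.items()) if value & bit]

print(decode_uac(514))    # ['ACCOUNTDISABLE', 'NORMAL_ACCOUNT']
print(decode_uac(66048))  # ['NORMAL_ACCOUNT', 'DONT_EXPIRE_PASSWORD']
```

For example, 590336 decodes to NORMAL_ACCOUNT + DONT_EXPIRE_PASSWORD + TRUSTED_FOR_DELEGATION, matching the last entry in the table above.
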


Here is the PowerShell script we use to extract this data; it creates a CSV file which we then import into Excel for analysis.
https://github.com/zs1rcm/powershell-scripts/blob/main/ExtractAccountData.ps1