bash code to waste bandwidth
so i don't know if this belongs here, but i've been trying some code with gpt to try and waste bandwidth, the best we came up with was this:
while true; do
sudo arp-scan --localnet | grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}' | xargs -I % sudo ping -f %
done
but it didn't quite do much. it slowed down the entire network a small bit but that's all. anyone got code that could help me waste bandwidth? in bash, so mostly simple like the one i put here
https://redd.it/16ydtpq
@r_bash
Seeking help understanding a request for Bash script for an interview.
I have an interview question for a sysadmin job and need clarification about what's being asked. Is it just me, or does this make sense? What are the parameters if they are not defined? For example, how can I write a script if no values are specified for each of these parameters?
## Instructions
Using a language of your choice, write a script that can be used as either a scheduled Windows task or a cron job to delete files and/or directories using the parameters listed below:
1. File Age
   1. You’re free to choose what metadata to use to determine file age
2. File Location
3. File Size
4. File Type (extension)
5. Delete folders and files
6. Delete files only
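One way to read the task: the parameters in the list above map onto find predicates. A minimal sketch of the cron-job side, where every default value is made up and presumably the interviewer wants them passed in (e.g. from the crontab line) rather than hard-coded:

```shell
#!/usr/bin/env bash
# Hypothetical cleanup function covering the six listed parameters.
cleanup() {
    local location=$1   # 2. File Location
    local age_days=$2   # 1. File Age (days since last modification)
    local size=$3       # 3. File Size, in find -size syntax (e.g. +10M)
    local ext=$4        # 4. File Type (extension)
    local mode=$5       # 5./6. "all" also deletes emptied dirs, "files" doesn't
    find "$location" -type f -name "*.${ext}" -mtime "+${age_days}" -size "$size" -delete
    if [ "$mode" = "all" ]; then
        find "$location" -mindepth 1 -type d -empty -delete
    fi
}
# e.g. in a crontab: 0 3 * * * /path/to/cleanup.sh /var/tmp/app 30 +1M log all
```

The `-mindepth 1` keeps the target directory itself; for the Windows-scheduled-task half of the question the same logic would presumably be ported to PowerShell.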
https://redd.it/16y6s5z
@r_bash
Running commands over cloud machine
We are automating a cloud-based infrastructure. There are Linux machines deployed there, and I need to implement some tasks which will run Linux commands on those machines.
1. Is there any doc or anything where I can find all possible outcomes of a Linux command? I need this so that debugging becomes somewhat easier later.
2. What is the best practice to implement this type of task? I am running a command and, if it fails, storing the log in failure_logs and continuing to the next iteration, otherwise moving ahead to the next command. In case there is no point in moving to the next iteration, I am raising an exception and catching it.
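For point 1, no doc enumerates every possible output of every Linux command; the portable "outcome" to record is the exit status plus captured stderr. A minimal sketch of point 2 in that spirit (the `failure_logs` file name follows the post; the loop structure is an assumption):

```shell
#!/usr/bin/env bash
# Run each command in turn; on failure, log the command, and its combined
# output, to failure_logs and keep going.
run_all() {
    local cmd output
    : > failure_logs
    for cmd in "$@"; do
        if ! output=$(bash -c "$cmd" 2>&1); then
            printf 'FAILED: %s: %s\n' "$cmd" "$output" >> failure_logs
            continue    # or exit here when continuing makes no sense
        fi
    done
}
run_all "true" "echo ok"
```

Checking `$?` (here implicitly via `if !`) rather than parsing output keeps the logic uniform across commands.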
https://redd.it/16xyjzr
@r_bash
Calling user bash from xargs - work around
Ques: Never seen this combination of read & find .. it works but is it common?
The problem I was trying to solve: calling a function within the same script with one parameter from find, while also giving it some additional vars. The function was exported (export -f do_stuff)
dothis=something
dothat=whatever
find "$startdir" -type d | xargs -n1 bash -c 'do_stuff "$1"' -
That will pass the directory to do_stuff, but I couldn't figure out how to pass it $dothis and $dothat at the same time, so I gave up on that, and on the -exec option as well.
Found a workaround that does the job.... something I had never seen before, looks weird
while read -r dir ; do
do_stuff "$dir" "$dothis" "$dothat"
done < <(find "$startdir" -type d -print)
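For what it's worth, the xargs route can carry the extra values too: they ride along as additional `bash -c` arguments after the dummy `$0`, so the exported function sees dir, dothis, dothat as `$1`, `$2`, `$3`. A sketch with a stand-in `do_stuff`:

```shell
#!/usr/bin/env bash
# Stand-in for the exported function from the post.
do_stuff() { printf 'dir=%s this=%s that=%s\n' "$1" "$2" "$3"; }
export -f do_stuff
dothis=something
dothat=whatever
startdir=.
# {} is replaced by each directory; the two variables are appended as fixed
# extra arguments on every invocation.
find "$startdir" -type d -print0 |
    xargs -0 -I{} bash -c 'do_stuff "$1" "$2" "$3"' _ {} "$dothis" "$dothat"
```

The `-print0`/`-0` pairing also makes it safe for directory names with spaces, which the `while read` version (without `IFS=` and `-d ''`) is not.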
https://redd.it/16xs04n
@r_bash
Some tricky regex and graphviz docs later, we have a decent script
A vimwiki graph generator using the dot language and graphviz, written in Bash.
Supports two layouts, and more can be added.
Instead of the plain white elongated chart that all other such scripts generate, this one uses the SFDP or NetworkMap layouts along with some custom coloring. Something along the lines of Obsidian's graph view.
link
Cheers.
https://redd.it/16xkrzf
@r_bash
/sbin/brltty launching randomly
Hey guys,
I’m trying to start writing some scripts for school in my Ubuntu 18.04 VM, and while executing one with ./script1.sh I suddenly get the following error:
/sbin/brltty: failed to execute /sbin/brltty
I don’t use it nor did i know of its existence up until today. Any help?
https://redd.it/16x3www
@r_bash
UI in Next Generation Shell
https://blog.ngs-lang.org/2023/09/30/ui-in-ngs/
https://redd.it/16wthsm
@r_bash
A simple bash script if all you need is a simple (really, really simple) website, Zite!
Hello y'all, just thought I'd share a cool little project I've been working on called Zite. It is a simple static website generator using bash and pandoc. The main reason I created it was the frustration of trying to use tools like Hugo and Zola for creating my personal website. While I do think they're great, I've always felt like all I needed for the website I wanted was a few lines of scripting, so that's what I did!
The GitHub link is https://github.com/rodrigueslazaro/zite if anyone wants to check it out. I'm no bash, or programming, expert, so I'd be glad if anyone wants to contribute to the project! I know there are some improvements that can be made, but the functionality is pretty much how I want it.
https://preview.redd.it/et1rg77flhrb1.png?width=963&format=png&auto=webp&s=81a9b2bbb63a9c73b0a741f4413502483b5cae62
https://redd.it/16wmw1z
@r_bash
Variable with double quotes in cURL header
Hello all,
I've been trying to figure something out here for a bit. I am pulling an Etag from a request header and then attempting to use that in the "If-Match" to patch.
export ETag=$(curl -I -X GET https://{uri} -H "Authorization: Bearer ${BEARER}" 2>/dev/null | grep Etag | head -1 | cut -d":" -f2)
The tag gets stored as expected but contains double quotes: W/"12345678910"
When passing in a patch, I am struggling with how to format it.
curl --location --request PATCH https://{uri} \
-H "Authorization: Bearer ${BEARER}" \
-H "Content-Type: application/json" \
-H "If-Match: ${ETag}" \
--data '{}'
Since the tag itself contains double quotes, I am finding it difficult to pass it to the header for the match. Does anyone have any ideas on how I can get around this? TIA.
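The embedded quotes are usually not the problem: they pass through fine inside the double-quoted `-H` argument. What often bites with `curl -I | grep | cut` is that header lines end in `\r` and the cut leaves a leading space, so the header becomes `If-Match:  W/"..."\r`. A sketch of an extraction that trims both (the sample header block here is made up; the real one comes from `curl -I`):

```shell
#!/usr/bin/env bash
# Pull the ETag value out of a raw header block: match the header name
# case-insensitively, take everything after ": ", drop the trailing \r.
extract_etag() {
    awk -F': ' 'tolower($1) == "etag" { print $2 }' | tr -d '\r'
}
# Canned CRLF headers standing in for `curl -I ...` output:
printf 'HTTP/1.1 200 OK\r\nETag: W/"12345678910"\r\n\r\n' | extract_etag
# then, as in the post: curl --request PATCH ... -H "If-Match: ${ETag}" ...
```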
https://redd.it/16vcmvh
@r_bash
I want to add a crontab for ec2-user but it keeps adding it under root, how do I fix this?
* I am running some user data inside my EC2 instance which will always run as the root user as per AWS documentation
* I supply the EC2 instance a bunch of bash commands on launch and one of them is to invoke this file run-cron.sh
In terms of code it looks like this
cd "${ROOT_PATH}" || exit
git clone -b test/cron "${UTILS_REPO_URL}"
chown -R ec2-user:ec2-user ./utils
# https://askubuntu.com/a/889348/968824
find "${ROOT_PATH}/utils" -type f -iname "*.sh" -exec chmod +x {} \;
# shellcheck source=/dev/null
bash "${ROOT_PATH}/utils/src/ec2/run-cron.sh"
My run-cron.sh file looks like this
set -o pipefail
set -u
set -x
IFS=$'\n\t'
# https://stackoverflow.com/a/52879454/5371505
crontab <<EOF
0 0,4,8,12,16,20 * * * ec2-user /home/ec2-user/utils/src/elasticache/backup-elasticache-to-s3.sh > /tmp/backup-elasticache-to-s3.log 2>&1
0 0,4,8,12,16,20 * * * ec2-user /home/ec2-user/utils/src/rds/backup-rds-to-s3.sh > /tmp/backup-rds-to-s3.log 2>&1
EOF
My problem is that after my user data script runs, I log out of the EC2 instance and log in again as ec2-user. When I run crontab -l it says no crontabs found for ec2-user. When I run sudo crontab -u root -l it shows the above jobs.
How do I add the crontab for ec2-user when running as root from user data script?
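Two things appear to be going on here, sketched below as a guess at the fix: a bare `crontab` installs the table for whoever runs it (root, in user data), and user crontabs have no user column anyway; that 6th field only exists in `/etc/crontab` and `/etc/cron.d`, so `ec2-user` in those lines would be parsed as the start of the command.

```shell
# Install the crontab for ec2-user explicitly, and drop the user field
# (paths copied from the post).
crontab -u ec2-user - <<'EOF'
0 0,4,8,12,16,20 * * * /home/ec2-user/utils/src/elasticache/backup-elasticache-to-s3.sh > /tmp/backup-elasticache-to-s3.log 2>&1
0 0,4,8,12,16,20 * * * /home/ec2-user/utils/src/rds/backup-rds-to-s3.sh > /tmp/backup-rds-to-s3.log 2>&1
EOF
```

Alternatively, keeping the user field is valid if the lines are dropped into a file under `/etc/cron.d/` instead of installed via `crontab`.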
https://redd.it/16uh7jz
@r_bash
ndarray: tools for setting up and using N-dimensional / nested arrays in bash
A recent post here inspired me to pick up an old personal project for getting bash to work with N-dimensional/nested arrays. I got it working, so I figured I'd share it.
The [CODE](https://github.com/jkool702/bashndarray/blob/main/ndarray.bash) is on github. There are 5 functions:
`nd_usage` gives a brief usage example
`nd_create` sets up the nameref framework and declares the arrays
`nd_set` writes data into the arrays at the end of the nameref chains (the `A_0` and `A_1` arrays in the methodology example below)
`nd_get` reads data out of the arrays. You can define lists/ranges of indices for any dimension and it will output all the data that falls into that n-dimensional slice of the array.
`nd_clear` unsets all the array and nameref variables
METHODOLOGY
It involves creating a framework of nameref arrays to handle all the dimensions except the last one (which is stored in the arrays themselves). The idea is to do something like
declare -n a_0='A_0'
declare -n a_1='A_1'
A=(a_0 a_1)
A_0=(1 2 3)
A_1=(4 5 6)
So to get the data at (1,2), you do `${A[1]}` which gives `a_1`, which namerefs to A_1, then `${A_1[2]}` which gives the actual data. The `a_1` and `a_0` are needed because bash doesn't directly support doing, say, `declare -n A[0]=A_0`...you have to nameref a dummy variable and then store that in an array.
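A self-contained 2x3 version of that trick, to make the lookup chain concrete (names follow the snippet above):

```shell
#!/usr/bin/env bash
# Row arrays hold the last dimension; the outer array holds nameref names.
A_0=(1 2 3)
A_1=(4 5 6)
declare -n a_0='A_0'
declare -n a_1='A_1'
A=(a_0 a_1)

# Element (1,2): ${A[1]} -> "a_1", a nameref to A_1, then index 2 into it.
declare -n r=${A[1]}
echo "${r[2]}"   # 6
```

Bash resolves the nameref chain (`r` -> `a_1` -> `A_1`) transparently, which is what lets the outer array act as the extra dimension.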
USAGE EXAMPLE
This is the example that running `nd_usage` prints:
# # # # # generate nameref framework.
# note: dont include the last dimension
source <(nd_create -a A 2 3 4)
# # # # # set array values
# pass data to be set on STDIN, and use function inputs to define basename + index ranges
source <(seq 1 $(( 2 * 3 * 4 * 5 )) | ndset A 0:1 0:2 0:3 0:4)
# # # # # extract various slices from the array
ndget A 0 \@ \@ \@
1 2 3 4 5
6 7 8 9 10
11 12 13 14 15
16 17 18 19 20
21 22 23 24 25
26 27 28 29 30
31 32 33 34 35
36 37 38 39 40
41 42 43 44 45
46 47 48 49 50
51 52 53 54 55
56 57 58 59 60
ndget A \@ 0 \@ \@
1 2 3 4 5
6 7 8 9 10
11 12 13 14 15
16 17 18 19 20
61 62 63 64 65
66 67 68 69 70
71 72 73 74 75
76 77 78 79 80
ndget A \@ \@ 0 \@
1 2 3 4 5
21 22 23 24 25
41 42 43 44 45
61 62 63 64 65
81 82 83 84 85
101 102 103 104 105
ndget A \@ \@ \@ 0
1
6
11
16
21
26
31
36
41
46
51
56
61
66
71
76
81
86
91
96
101
106
111
116
# # # # # cleanup
ndclear A
https://redd.it/16tvadk
@r_bash
Hi, I'm sharing The dotfiles manager+ written in pure bash
https://github.com/yunielrc/ydf
https://redd.it/16tr45b
@r_bash
rm function does not work
I am trying to delete some text files but I am getting this error :
rm: /path/to/files/* No such file or directory
This is the actual command :
rm -r $PATHTOFILES"*"
Used the * wildcard because my use case requires all files to be deleted regardless of file extension.
I am running an airflow DAG to call a shell script that delete all files from a local directory, then pull files from an SFTP server and place them within the same directory. Therefore it is the same airflow user creating the directory and placing the files.
Would appreciate any advice, thank you!
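A guess at the cause: quoting the `*` stops the shell from expanding it, so rm looks for a file literally named `*`. The glob has to sit outside the quotes (and the curly quotes from the post would also be taken literally). A sketch against a throwaway directory standing in for the real path:

```shell
#!/usr/bin/env bash
PATHTOFILES="$(mktemp -d)"          # stand-in for the real directory
touch "$PATHTOFILES/a.txt" "$PATHTOFILES/b.csv"
# Variable quoted, glob unquoted; :? aborts instead of expanding to /* if
# the variable is ever unset or empty.
rm -r "${PATHTOFILES:?}"/*
```

Note the glob also stays literal (producing the same "No such file or directory" error) when the directory is empty, which can happen in the DAG if the SFTP pull hasn't run yet; `shopt -s nullglob` or a pre-check avoids that.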
https://redd.it/16thf8z
@r_bash
Need help with a script - Finding out when my car is in the driveway
My car has a Dashcam that will let me download files. There is already a way to synchronize my BlackVue dashcam with a local directory over a LAN. What I want to do is place a laptop in my living room and every 5 minutes see if it can see the Wi-Fi network that my Blackvue camera creates. If it does I want it to connect to that network then call the script to start transferring files.
I am not very knowledgeable when it comes to scripting.
This is what I have written so far. I would appreciate help in cleaning it up.
#!/bin/bash
DashcamAP="" # Name of the wireless network I am looking for
DashcamPA="" # Password of the wireless network I am looking for
# <<Code to drop all wireless networks>>
echo "Looking for network called $DashcamAP"
if test -e "wifiscan.lock"
then
echo "Wifi Scan done Less than 5 minutes ago"
rm -f wifiscan.lock # to delete the lock file
rm -f wifi.list # to delete the list of found networks
rm -f AmIHome # to delete the grep results
rm -f IAmHome # to delete the flag file if the car was home
sleep 5m
touch wifiscan.lock # To show the script was recently run
echo "Preparing to scan wifi"
sh ./wifitest.sh
else echo "Wifi Scan not done. Scanning now"
touch wifiscan.lock
nmcli dev wifi rescan
nmcli -t -g SSID dev wifi > wifi.list
grep -i "$DashcamAP" wifi.list > AmIHome
if [ -s AmIHome ];
then touch IAmHome
fi
rm -f wifiscan.lock
if test -e "IAmHome"
then
echo "Network is found"
# << Code to connect to the wireless network >>
# sh ./BlackVueSyncScript -- Will call the SyncScript
else
echo "Network not found"
fi
fi
https://redd.it/16scc1q
@r_bash
Help with a script
Hi all, I've decided to write a script for a lengthy process at work. I have a main box on which I store driver packs for multiple devices, and I'd like to copy the contents of this directory to other servers.
My plan is to add the new driver files when we get a new model, then run this script to copy to the other servers. I'd also like for this to work for when I update existing driver packs. I do not want to copy content that already exists. How can I do this with the following script I have started? Hope this makes sense.
#!/bin/bash
#source directory
sourcedirectory="/images/drivers"
#remote username
remoteuser="username"
#array of remote server IPs
remoteservers=("10.xx.xx.xxx" "10.xx.xx.xxx")
#Destination directory on remote servers
destinationdirectory="/images/drivers"
#specify the user and group ownership for the copied directories
ownership="name:name"
#Loop through each remote server and copy the contents
for server in "${remoteservers[@]}"; do
rsync -avs --ignore-existing --chown="$ownership" "$sourcedirectory/" "$remoteuser@$server:$destinationdirectory/"
done
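One caveat worth flagging against the "update existing driver packs" goal: `--ignore-existing` skips any file whose name already exists on the destination, even when the pack changed. rsync's default size+mtime comparison already transfers only new or changed files, so simply dropping the flag should cover updates too. A local sketch of the difference (temp dirs stand in for the remote servers):

```shell
#!/usr/bin/env bash
src=$(mktemp -d); dst=$(mktemp -d)
echo v1 > "$src/pack.bin"
rsync -a "$src/" "$dst/"                       # initial copy
echo version2 > "$src/pack.bin"                # updated driver pack
rsync -a --ignore-existing "$src/" "$dst/"     # name exists: update skipped
skipped=$(cat "$dst/pack.bin")                 # still v1
rsync -a "$src/" "$dst/"                       # default comparison copies it
```

Add `--checksum` if the driver packs' timestamps can't be trusted, at the cost of reading every file on both sides.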
https://redd.it/16rw9kl
@r_bash
Copy all folders that start with a capital letter
What would the command be to copy all folders and their contents in the current directory that start with a capital letter to another folder?
I've looked, but haven't seen a clear example that answers this question.
For example, if the current directory contains the following folders:
Blue
greEn
Red
The folders that start with a capital would be copied to a folder called colors.
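A sketch of one way to do it: a POSIX character class in the glob selects only names beginning with an uppercase letter (`[[:upper:]]` is safer than `[A-Z]`, which can match lowercase letters in some locales), and the trailing `/` restricts the match to directories. Demo dirs below stand in for the real ones:

```shell
#!/usr/bin/env bash
cd "$(mktemp -d)"
mkdir Blue greEn Red colors
# Copy every directory whose name starts with a capital into colors/.
cp -r [[:upper:]]*/ colors/
ls colors    # Blue  Red
```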
https://redd.it/16ya63s
@r_bash
Problem with the AND operator
Hello, sorry for my bad english, not my first language
I am creating a bash script that chooses a random number and compares it with a number entered by the user.
I managed to do a part but when I tried to perfect it it started giving me errors.
My idea was that when a number is entered, the script would verify that it is an integer AND, if it is, check that the number equals the random number in the "numal" variable.
For the second elif, the same process, but verifying that "numal" does not match the entered number.
And if it was neither of the two, that meant the input was not an integer, and it would give you an error message.
Now, regardless of whether I enter a number or a letter, I always get the error message.
What am I doing wrong?
Here is my script.
#!/usr/bin/env bash
read -p "Guess the number I'm thinking: " unum
numal=$((1 + $RANDOM % 5))
re='^0-9+$'
if [ $re =~ $unum ] && [ $numal == $unum ]; then
echo "The random number is $numal"
echo "You guessed the number correctly"
elif [ $re =~ $unum ] && [ $numal != $unum ]; then
echo "The random number is $numal"
echo "You couldn't get the number right"
else
echo "You have entered incorrect parameters"
fi
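Two things stand out in that script: the character class needs brackets (`^[0-9]+$`, not `^0-9+$`), and `=~` only works inside `[[ ]]`, not the single-bracket test command. A corrected sketch, wrapped in a function so it can be exercised without prompting (the original `read -p` works the same way):

```shell
#!/usr/bin/env bash
guess() {
    local unum=$1
    local numal=$((1 + RANDOM % 5))
    local re='^[0-9]+$'
    if [[ $unum =~ $re ]] && (( numal == unum )); then
        echo "The random number is $numal"
        echo "You guessed the number correctly"
    elif [[ $unum =~ $re ]]; then
        echo "The random number is $numal"
        echo "You couldn't get the number right"
    else
        echo "You have entered incorrect parameters"
    fi
}
guess 3      # numeric: hits one of the first two branches
guess abc    # non-numeric: prints the error message
```

`(( numal == unum ))` does a numeric comparison; the original `==` inside `[ ]` compares strings, which happens to work for small equal numbers but is the wrong tool.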
https://redd.it/16y3z24
@r_bash
Weird behavior of jobs/awk command
I'm trying to catch all the background processes belonging to a certain tmux pane and kill them in one command.
For example, if I have 4 background jobs and run jobs -rp, the output would be
[3] 3701605 running bash -c "sleep 360"
[4] 3701606 running bash -c "sleep 360"
[5] - 3701607 running bash -c "sleep 360"
[6] + 3701610 running bash -c "sleep 360"
However when I run jobs -pr | awk '{print $3}'
it would output
running
running
3701607
Or when I use jobs -pr | cut -c7-
it would output
3701605 running bash -c "sleep 360"
3701606 running bash -c "sleep 360"
3701607 running bash -c "sleep 360"
which completely disregards the last line.
Does anyone have a fix?
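The `$3` flip-flop happens because the lines carrying the `+`/`-` current-job markers have one extra field. But for the stated goal (kill them all), parsing the long listing may be unnecessary: `jobs -p` prints just the PIDs, one per line. A sketch:

```shell
#!/usr/bin/env bash
# Two throwaway background jobs, then kill every running job by PID.
sleep 360 & sleep 360 &
kill $(jobs -rp)
```

Unquoted expansion is deliberate here so each PID becomes its own argument; with no running jobs this would call kill with no arguments, so guard it in real use.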
https://redd.it/16xm9sb
@r_bash
Bash / sed on Mac adds ^M when appending to a line
I'm writing a bash script to append additional tab-delimited fields to (initially) the end of the first/header line of a text file on Mac. It works, but sed adds a ^M and then the added fields to the end of each line, which I can only see when I open the file with Vim. It also adds the ^M to the end of all the other lines, even though they don't match.
How do I tell sed NOT to add the ^M?
Here's my sed command:
sed '1s/$/\tField1\tField2/' TESTDATA.txt
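A likely reading of the symptom: the ^M means the file already has Windows (CRLF) line endings, so sed appends after the `\r` on line 1, and Vim then shows ^M on every CRLF line because the endings have become mixed. A sketch of a fix: strip the `\r` first, and use `$'...'` so the shell expands `\t` into real tabs (BSD sed on macOS does not expand `\t` itself). The sample file here is made up:

```shell
#!/usr/bin/env bash
# Small CRLF file standing in for TESTDATA.txt.
printf 'h1\th2\r\nrow1\trow2\r\n' > TESTDATA.txt
# Drop carriage returns, then append two tab-separated fields to line 1.
tr -d '\r' < TESTDATA.txt | sed $'1s/$/\tField1\tField2/' > TESTDATA.unix.txt
```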
https://redd.it/16x8cez
@r_bash
How to delete data from 2 files and output the result
How do I tell it to remove the IPs in these 2 files, wl.txt and bots.txt, from the output file? It only reads the first file. I also need to clean up a file: find IPs in ipban.txt and remove them from ipattack.txt. I also want to add /24 when an IP ends with .0, i.e. make 10.10.10.0 into 10.10.10.0/24. Please add your code to the code below. Thanks
curl -sk $GLOBAL $GLOB $IDP $ATTACK $SRX |
grep -P -o '((25[0-5]|(2[0-4]|1\d|[1-9]|)\d)\.?\b){4}(/(3[0-2]|[12]?\d))?\b' |
#awk 'NR > 0 {print $0}' > spamhaus_transformed.txt
awk 'NR > 0 {print $1}' | sort -u | grep -F -v -f ipban/wl.txt | grep -F -v -f ipban/bots.txt > ipban/ipattack.txt
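A sketch of the filtering tail end with made-up sample data: `grep -F` accepts several `-f` pattern files, so both lists come out in one pass (`-x` matches whole lines, so 1.2.3.4 can't accidentally filter 11.2.3.45), and a final sed appends /24 to IPs ending in .0. Adding `-f ipban/ipban.txt` would honor that list the same way:

```shell
#!/usr/bin/env bash
mkdir -p ipban
printf '1.2.3.4\n' > ipban/wl.txt       # sample whitelist
printf '5.6.7.8\n' > ipban/bots.txt     # sample bot list
printf '1.2.3.4\n5.6.7.8\n9.9.9.9\n10.10.10.0\n' |
    sort -u |
    grep -Fvx -f ipban/wl.txt -f ipban/bots.txt |
    sed 's|\.0$|.0/24|' > ipban/ipattack.txt
```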
https://redd.it/16x15yh
@r_bash
Bash menu to execute different functions
I have a small tool to help install stuff without running all the commands.
What I'm trying to do, now that I have all the functions, is create a menu so that each item can be selected, that particular function executed, and then the menu shown again.
Here is some code to get a feel
bInstall_program_A=false
bInstall_program_B=true
function run_program_A() {
# execute command
echo -e "Finished"
}
function run_program_B() {
# execute command
echo -e "Finished"
}
if [ "$bInstall_program_A" = true ] ; then
items+=('Program A')
fi
if [ "$bInstall_program_B" = true ] ; then
items+=('Program B')
fi
title="Selection Menu"
prompt="Select an option\n\n"
while item=$(zenity --list \
--width="430" \
--height="335" \
--title="$title" \
--text="$prompt" \
--column="Installable Packages" "${items[@]}")
do
case "$item" in
"${items[0]}") echo "Selected ${items[0]}, item #1";;
"${items[1]}") echo "Selected $item, item #2";;
"${items[2]}") echo "Selected $item, item #3";;
*) echo "Invalid option.";;
esac
done
Provide password securely to shell script
Is there a built-in command that can provide a password to a shell script? Thought I could use read, but it can be sniffed: https://unix.stackexchange.com/questions/563718/sniff-password-entered-with-read-and-passed-as-a-command-line-argument
systemd-ask-password looks like it is about system passwords, not for scripts?
Any options using a command that is part of bash or available on most distros?
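For what it's worth, the sniffing in that linked question was about the password appearing as a command-line argument (argv is world-readable via /proc), not about `read` itself. `read -s` is a bash builtin, so nothing shows up in ps; the key is to keep the secret off every command line afterwards and hand it to consumers on stdin. A sketch (the herestring is only there to make the demo non-interactive; drop it for real prompting, and `sha256sum` is just a stand-in consumer):

```shell
#!/usr/bin/env bash
# -r: raw, -s: no echo. Demo input supplied via herestring.
IFS= read -rs -p "Password: " password <<< "hunter2"
echo
# Pass on stdin rather than argv:
printf '%s\n' "$password" | sha256sum
```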
https://redd.it/16voh9d
@r_bash
Can you guess the output of these tr(1) commands?
echo abcdefghijklmnopqrstuvwxyz
echo abcdefghijklmnopqrstuvwxyz | tr -d [:blank:]
echo abcdefghijklmnopqrstuvwxyz | tr -d '[:blank:]'
Now that you've tried it, and assuming you got what I did, how do you explain >!the missing letter l in the 2nd command!<?
https://redd.it/16ums5y
@r_bash
String substitution
I am new to Bash and am having a difficult time employing string substitution. In my code below, I am attempting to replace file paths that contain .csv or .xlsx extensions with the .json extension, and the output for $output is always /path/to/input_files/file.csv//.csv/.json or /path/to/input_files/file.xlsx//.xlsx/.json. Could anyone help me to understand how I am using string substitution incorrectly here?
#!/bin/bash
for f in /path/to/inputfiles/*.csv /path/to/inputfiles/*.xlsx
do
if [[ $f == *".csv" ]]; then
$output={$f//.csv/.json}
fi
if [ $f == *".xlsx" ]; then
$output={$f//.xlsx/.json}
fi
echo $output
python3 API.py "$f" "$output" args
done
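The substitution syntax is the core issue: assignments go `name=value` with no leading `$`, and the expansion needs its own `$` before the brace, i.e. `output="${f//.csv/.json}"`. A corrected sketch; since only the last extension should change, `${f%.*}.json` (strip the shortest trailing `.something`, append `.json`) is simpler than two substitutions, and the existence guard skips literal glob text when nothing matches:

```shell
#!/usr/bin/env bash
for f in /path/to/inputfiles/*.{csv,xlsx}; do
    [[ -e $f ]] || continue        # skip if the glob matched nothing
    output="${f%.*}.json"          # strip last extension, append .json
    echo "$output"
    python3 API.py "$f" "$output" args
done
```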
https://redd.it/16u2eoz
@r_bash
Combined output of commands into variable
I know I can do
value=$(echo "FOO"; echo "BAR")
to get the combined output of the command group and set it to the variable. AFAIU, the commands will be run in a separate shell. Now, I was wondering whether it would be possible to achieve the same thing without starting a new shell, but I couldn't find the right syntax. The closest I got was
value=$({ echo "FOO"; echo "BAR"; })
but I suppose this is not what I wanted. This will still run the command group in a new shell. The goal was to prevent the creation of a new shell in the first place.
Any ideas?
Thanks.
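One possibility, assuming the goal is just "combined output into a variable without forking": `printf -v` writes straight into the variable inside the current shell, so no command substitution, and therefore no subshell, is involved at all:

```shell
#!/usr/bin/env bash
# Build the two-line value directly in the current shell.
printf -v value '%s\n%s' "FOO" "BAR"
echo "$value"
```

This only works when the pieces are values you already have (or can format), not arbitrary command output; capturing a command's stdout inherently goes through command substitution in stock bash.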
https://redd.it/16tuegx
@r_bash
How to run 2x readarray ... < <(...) in parallel ?
Hi folks,
readarray -d '' a < <(my_function)
readarray -d '' b < <(my_function)
my_function runs a complicated find ... -print0.
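One workable pattern, sketched with a stand-in `my_function`: run both producers concurrently into temp files, wait for them, then let each readarray consume its finished file (readarray itself can't overlap, but the expensive finds can):

```shell
#!/usr/bin/env bash
# Stand-in for the complicated find ... -print0 from the post.
my_function() { printf '%s\0' one two three; }

t1=$(mktemp) t2=$(mktemp)
my_function > "$t1" &       # both producers run in parallel
my_function > "$t2" &
wait                        # block until both have finished
readarray -d '' a < "$t1"
readarray -d '' b < "$t2"
rm -f "$t1" "$t2"
echo "${#a[@]} ${#b[@]}"    # 3 3
```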
How to filter files with glob patterns?
Let's say I have these files in the dir directory.
file1.jar
file2.jar
file3suffix.jar
file4.txt
...
fileX.someotherextension
I need all `*.jar` files except those ending with `suffix.jar`. I've read this [https://www.gnu.org/software/bash/manual/html_node/Pattern-Matching.html#Pattern-Matching](https://www.gnu.org/software/bash/manual/html_node/Pattern-Matching.html#Pattern-Matching)
`ls dir/*.jar` gives me all jars, and:
`ls dir/!(*suffix.jar)` gives me good jars but also files with other extensions.
Preferably I would somehow merge these 2 together.
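The merge is to move the `.jar` outside the negation: with extglob enabled, `!(*suffix).jar` keeps the `.jar` requirement while excluding names whose stem ends in "suffix". A sketch against throwaway files matching the listing above:

```shell
#!/usr/bin/env bash
shopt -s extglob
cd "$(mktemp -d)"
touch file1.jar file2.jar file3suffix.jar file4.txt
# Everything ending in .jar whose part before .jar does not match *suffix:
ls !(*suffix).jar    # file1.jar  file2.jar
```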
https://redd.it/16srr4t
@r_bash
Video Stripe Preview Generator
# Hello Everyone,
I just finished making a script to generate a striped preview image of a video (mp4, mkv, etc.) or image sequence (gif, etc.) with the help of FFmpeg. I'll definitely make it better going forward; for now, I'm just trying to debug and hunt down exception states and anomalies.
So here's the **REPO** for my script, have at it and let me know how it performed, and if you find any odd behavior do let me know so that I can patch it up. I'm also up for a good suggestion. (I know the script looks bad and a bit unoptimized and has a lot of sanity checks, but right now my priority is to find all exception/error states and handle them.)
# Some Preview:
Command : video-stripe-preview -vf "WING IT - Blender Open Movie.mp4"
Default parameters
Command : video-stripe-preview -r 2 -c 4 -l 960 -vf "WING IT - Blender Open Movie.mp4"
Row = 2 | Column = 4 | Width = 960
Command : video-stripe-preview -r 5 -c 2 -vf "WING IT - Blender Open Movie.mp4"
Row = 5 | Column = 2
# Credits :
**WING IT!!**, an open film from Blender Studio, was used to generate the previews.
# Note:
I'm kinda new to the whole Linux, git, CLI, FFmpeg thing, so feel free to be informal with the discussion; we'll probably need a lot of back and forth before I come to a conclusion.
https://redd.it/16rz17d
@r_bash
Linux terminal practice similar to w3 schools
Hopefully this is the best subreddit for some good ideas/discussions.
I have a number of first-line support members on my team for whom this is their first IT role; they come from a more customer/administration background.
Our product runs on Linux, but the nature of their role means they won't need to interact with it themselves.
Due to the nature of the company, I'm not able to provide them access to any Linux servers or even VMs to play with.
I'm looking for something similar to w3schools where they can become familiar with basic Linux terminal commands, e.g. ls, pwd, cd, but within a web browser.
We have a fair bit of downtime on the job, so I want to pique their curiosity, get them into the habit of self-learning, and also give them some exposure to what the 2nd-line team does... Any ideas?
https://redd.it/16rlxl7
@r_bash