Judge my code
#!/bin/env bash
# Customize Console
PROMPT_COMMAND='echo -en "\033]0;$(FBD|cut -d "/" -f 4-100)\a"'
now=$(date)
echo "Undertale Jokes, that's all. Mainly. You know, as ya want."
while true
do
#Input
VHS=$(gum input --placeholder " Enter Command")
# Treat Input
case $VHS in
exit | q | esc | bye)
exit
;;
clear | cls)
clear
;;
puns | under)
gum pager < ./src/under.txt
;;
nintendo)
gum pager < ./src/nintendo.txt
;;
help)
gum pager < ./src/cmd.txt
;;
edit | text | txt | file | editor | nano | vim | vi)
$EDITOR $(gum file $FBD)
;;
issue)
echo "Report issue at github.com/FBD/issues"
;;
rules)
gum pager < ./src/rules.txt
;;
time | date)
echo "$now"
;;
duck | goose)
gum pager < ./src/duck.txt
;;
annoy | dog)
gum pager < ./src/dog.txt
;;
*)
echo -n "Command unknown or not implemented yet."
;;
esac
done
I need feedback
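A couple of hedged observations, shown as a trimmed-down variant of the script above (not the author's code): $now is captured once at startup, so the time/date command always prints the launch time, and quoting the expansions around $EDITOR and gum file keeps paths with spaces intact.
#!/usr/bin/env bash
while true; do
    VHS=$(gum input --placeholder " Enter Command")
    case "$VHS" in
        exit | q | esc | bye) exit ;;
        time | date) date ;;                                  # evaluate at lookup time, not at startup
        edit | text | txt | editor) "$EDITOR" "$(gum file "$FBD")" ;;
        *) echo "Command unknown or not implemented yet." ;;
    esac
done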
https://redd.it/1c1od1h
@r_bash
Judge the setup script
I need feedback for a setup script:
#!/bin/bash
chmod 777 ./src/fbd.sh
sudo pacman -S gum
echo "Setup successful. Execute ./src/fbd.sh to run FBD"
https://redd.it/1c1odwx
@r_bash
output not redirecting to a file.
/usr/sbin/logrotate /etc/logrotate.conf -d 2>&1 /tmp/log.txt
It creates an empty file while the text flies past the screen. Why?
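A hedged reading of the problem: there is no ">" in the command, so /tmp/log.txt is passed to logrotate as if it were another config file rather than receiving any output, and 2>&1 just duplicates stderr onto the terminal. Redirect stdout to the file first, then point stderr at it:
/usr/sbin/logrotate /etc/logrotate.conf -d > /tmp/log.txt 2>&1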
https://redd.it/1c1h8o6
@r_bash
sed and tr not changing whitespace when piped inside a bash script
Hey folks, I'm experiencing an odd occurrence where sed seems to be unable to change \s when running inside a bash script. My end goal is to create a csv with newlines that I can open up in calc/excel. Here's the code:
# Set the start and end dates for the time period of cost report
start_date=2024-02-01
end_date=2024-03-01
# Create tag-based usage report
usage_report=$(aws ce get-cost-and-usage --time-period Start=$start_date,End=$end_date --granularity MONTHLY --metrics "UnblendedCost" --group-by Type=TAG,Key=Workstream --query 'ResultsByTime[0].Groups')
# # Format the usage and cost report by resource as a table
report_table=$(echo "$usage_report" | jq -r '.[] | [.Keys[], .Metrics.UnblendedCost.Amount] | @csv')
cleaned_report=$(echo "$report_table" | sed -u 's/\\s/\\n/g')
echo $cleaned_report
In this example, $usage_report is:
[ { "Keys": [ "Workstream$blah" ], "Metrics": { "UnblendedCost": { "Amount": "1369.6332285425", "Unit": "USD" } } }, { "Keys": [ "Workstream$teams2" ], "Metrics": { "UnblendedCost": { "Amount": "5.3844278507", "Unit": "USD" } } }, { "Keys": [ "Workstream$team3" ], "Metrics": { "UnblendedCost": { "Amount": "257.3202611246", "Unit": "USD" } } }, { "Keys": [ "Workstream$team1" ], "Metrics": { "UnblendedCost": { "Amount": "23.2939083734", "Unit": "USD" } } } ]
So that's what comes in and is processed by jq, outputting:
"Workstream$blah","1369.6332285425" "Workstream$teams2","5.3844278507" "Workstream$team3","257.3202611246" "Workstream$team1","23.2939083734"
All I want to do with that output now is to replace the whitespaces with newlines, so I do "$report_table" | sed -u 's/\\s/\\n/g', but the end result is the same as the $report_table input. I also tried using cleaned_report=$(echo "$report_table" | tr ' ' '\n'), but that also produced the same result. In contrast, if I output the results of this script to a test.csv file, then run sed s/\\s/\\n/g test.csv from the command line, it formats as intended:
"Workstream$blah","1369.6332285425"
"Workstream$teams2","5.3844278507"
"Workstream$team3","257.3202611246"
"Workstream$team1","23.2939083734"
Any guidance is appreciated!
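A hedged guess: jq already emits one CSV record per line, and the newlines only look lost because echo $var (unquoted) word-splits the value and rejoins it with single spaces. Quoting the expansion is usually enough, no sed or tr needed:
echo "$report_table" > report.csv     # report.csv is just an illustrative name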
https://redd.it/1c0vy1i
@r_bash
What should I do to get over this? I can't find a solution online for this.
https://redd.it/1c0hwdf
@r_bash
Will this work?
I want to parallelize a script that is defined as a function in my .bashrc, and have the output logged to a text file. Will this work?
parallel script Log.txt scriptname && exit
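A hedged sketch with GNU parallel, assuming the function is named myfunc and takes one argument per job (both names are placeholders): a shell function has to be exported before the subshells that parallel spawns can see it.
export -f myfunc
parallel --joblog jobs.log myfunc ::: item1 item2 item3 > Log.txt 2>&1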
https://redd.it/1c0bews
@r_bash
learning bash but why?
Hey guys and gals
I'm new to the dev world and really new to bash, but I'm just wondering: what's the end game? Like, what is possible with bash? Can I change networks with a single .sh file, can I build and compile automation... I'm having a hard time finding anything about the more advanced processes involved.
Just a noob looking to see why I should take the bash game to the end, or just get the fundamentals and move on. By all means light me up as a noob, but please bring some legit convo about it too.
https://redd.it/1c04j4f
@r_bash
jq with variable containing a space, dash or dot
I have a json file that contains:
{
    "disk_compatbility_info": {
        "WD_BLACK SN770 500GB": {
            "731030WD": {
                "compatibility_interval": [{
                    "compatibility": "support",
                }]
            }
        },
        "WD40PURX-64GVNY0": {
            "80.00A80": {
                "compatibility_interval": [{
                    "compatibility": "support",
                }]
            }
        }
    },
}
If I quote the elements and keys that have spaces, dashes or dots, it works:
jq -r '.disk_compatbility_info."WD_BLACK SN770 500GB"' /<path>/<json-file>
jq -r '.disk_compatbility_info."WD40PURX-64GVNY0"."80.00A80"' /<path>/<json-file>
But I can't get it work with the elements and/or keys as variables. I either get "null" or an error. Here's what I've tried so far:
hdmodel="WD_BLACK SN770 500GB"
#jq -r '.disk_compatbility_info."$hdmodel"' /<path>/<json-file>
#jq --arg hdmodel "$hdmodel" -r '.disk_compatbility_info."$hdmodel"' /<path>/<json-file>
#jq -r --arg hdmodel "$hdmodel" '.disk_compatbility_info."$hdmodel"' /<path>/<json-file>
#jq -r --arg hdmodel "$hdmodel" '.disk_compatbility_info."${hdmodel}"' /<path>/<json-file>
#jq -r --arg hdmodel "${hdmodel}" '.disk_compatbility_info."$hdmodel"' /<path>/<json-file>
#jq -r --arg hdmodel "${hdmodel}" '.disk_compatbility_info.$hdmodel' /<path>/<json-file>
jq -r --arg hdmodel "$hdmodel" '.disk_compatbility_info.${hdmodel}' /<path>/<json-file>
I clearly have no idea when it comes to jq :) And my google foo is failing at finding an answer.
What am I missing?
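The usual jq idiom: a value passed with --arg is referenced as $hdmodel inside the filter (not inside quotes), and bracket indexing handles keys with spaces, dashes or dots:
jq -r --arg hdmodel "$hdmodel" '.disk_compatbility_info[$hdmodel]' /<path>/<json-file>
jq -r --arg m "WD40PURX-64GVNY0" --arg fw "80.00A80" '.disk_compatbility_info[$m][$fw]' /<path>/<json-file>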
https://redd.it/1bzhl55
@r_bash
Why won't it log ps -p?
read -r -p "Enter process name: " cpid
apid=$(pgrep "$cpid")
ps -p "$apid"
read -r -p "Log process yes/no " log
if [ $log == "yes" ]
then
ps -p "$apid" >> pslog.txt # this is where it fails
This is what I get when I run the script:
https://preview.redd.it/vpxc65zn9btc1.png?width=444&format=png&auto=webp&s=d7602ecad132d3ee980f6462a0c72b25f86a4d62
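A hedged guess at the failure: pgrep can return several PIDs on separate lines, and ps -p rejects a multi-line argument; quoting $log also matters if the answer is left empty. A variant with the PIDs joined by commas:
read -r -p "Enter process name: " cpid
apid=$(pgrep -d, "$cpid")        # -d, joins multiple matches with commas
ps -p "$apid"
read -r -p "Log process yes/no " log
if [ "$log" == "yes" ]; then
    ps -p "$apid" >> pslog.txt
fi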
https://redd.it/1bz7mvy
@r_bash
Can you use GNU grep to check if a file is binary, in a fast and robust way?
In another thread, someone mentioned that neofetch is written in bash. I did not know that, so I made a small script to check what interpreters are being used by the executable files in my `$PATH`.
The main problem is testing if the file is text or binary. I found this 10-year-old discussion on Stack Overflow: https://stackoverflow.com/questions/16760378/how-to-check-if-a-file-is-binary
Anyway, here is my script:
#!/bin/bash
time for f in ${PATH//:/\/* }
do
[[ -f $f ]] &&
#checking if file is binary or script, some improvement would be nice
head -c 1024 "$f" | grep -qIF "" &&
value=$(awk 'NR==1 && /bash/ {printf "\033[1;32m%s is bash\033[0m",FILENAME }
NR==1 && /\/sh/ {printf "\033[1;35m%s\033[0m is shell",FILENAME}
NR==1 && /python/ {printf "\033[1;33m%s\033[0m is python",FILENAME }
NR==1 && /perl/ {printf "\033[1;34m%s\033[0m is perl",FILENAME}
NR==1 && /ruby/ {printf "\033[31m%s\033[0m is ruby",FILENAME}
NR==1 && /awk/ {printf "\033[36m%s\033[0m is awk",FILENAME}' "$f")
[[ $value = *[[:print:]]* ]] && arr+=("$value"); unset value
#I first assign the file to $value because if I had sent it directly to the array, a '\n' would be added to `arr[]` if awk evaluates to nothing.
#for example, if the file were written in a language not mentioned in the awk program, like lua, awk would return nothing and then arr+=('\n').
done
files=$(fzf --multi --ansi <<<"${arr[@]/%/$'\n'}" | cut -d " " -f 2) #f2 cause the first field is the ansi escape code for fzf, I guess...
#shellcheck disable=SC2086
[[ $files ]] && "${VISUAL:-${EDITOR:-cat}}" ${files/$'\n'/\ }
Any way to make it faster and more robust?
The idea behind it is to type in fzf `is\ bash` `is\ perl` `is\ shell` `is\ python` to see the numbers of scripts you have for each language in your PATH and if you want multi-select the scripts you want to read the source code in your EDITOR of choice, or it will be printed on the terminal via `cat`
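An alternative sketch for the text-vs-binary test, assuming file(1) is available (it forks once per file, so it may well be slower than the grep -qI approach, just easier to read):
is_text() {
    # --mime-encoding prints e.g. "us-ascii", "utf-8" or "binary"
    [[ $(file -b --mime-encoding -- "$1") != binary ]]
}
is_text /usr/bin/env || echo "/usr/bin/env is binary"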
https://redd.it/1byfsce
@r_bash
A small app for coloring text
I didn't know if this was a good subreddit to submit this because it seems to be more focused on bash scripting, but this app was made with bash script development in mind. If it doesn't belong here, please let me know.
I made an app a few days ago that makes it easier to style text with ANSI escape codes called Gecko.
It uses a flavor of markup tags found in Spectre.Console. So if you wanted to change the color of text, you would simply use:
gecko "cyan1Hello, World!"
The tag to reset color is [/], and unlike Spectre.Console, it can be used anywhere.
More information can be found at the GitHub repository: https://github.com/ScripturaOpus/ChameleonTerminal
I'm mostly looking for people to abuse this app so that I can find bugs, but also as a regular release for people to use.
Let me know if it can be useful and what else to add!
https://redd.it/1bxmcad
@r_bash
Getting information about a specific process using process id
How exactly would I go about achieving this in bash?
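A minimal sketch, assuming a known PID (1234 here is just a placeholder): ps can print selected fields for a single process, and /proc exposes the same data on Linux.
pid=1234
ps -p "$pid" -o pid,ppid,user,%cpu,%mem,etime,cmd
cat "/proc/$pid/status"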
https://redd.it/1bx9n8a
@r_bash
Copy/backup directories inside multiple directories
I'm making a script to configure my setup. It installs some packages and copies my cloned .config/ and .bashrc to my home folder. Originally this only happened for the current user I ran the script as, but then I wanted it to happen for every user's home folder (or for all users in /home?). I was able to copy the files and directories to all user homes with these commands:
echo /home/*/ | xargs -n 1 cp ~/dots/.bashrc
echo /home/*/ | xargs -n 1 cp ~/dots/.config/
But before the script does this, I want it to make a "backup" of the current .config/ and .bashrc of each existing user. I thought something like this would work, but that's not the case:
#create .oldconfig/ for each user
echo /home/*/ | xargs -i mkdir {}.oldconfig/
#move the current .config/ and .bashrc from all users to the .oldconfig/ inside their home directory
echo /home/*/ | xargs -i mv # ... incomplete
I tried using xargs like this instead of some loop, as it should be doable in a single command (if it worked).
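A hedged sketch of the backup step; a plain loop is simpler here because two paths per user are involved, but an xargs one-liner is shown too (2>/dev/null hides users that are missing one of the two files):
for home in /home/*/; do
    mkdir -p "${home}.oldconfig"
    mv "${home}.config" "${home}.bashrc" "${home}.oldconfig/" 2>/dev/null
done
echo /home/*/ | xargs -n1 sh -c 'mkdir -p "$1.oldconfig" && mv "$1.config" "$1.bashrc" "$1.oldconfig/"' _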
https://redd.it/1bx0s1q
@r_bash
bash script to organize files based off the file extensions.
Trying to organize my files a little bit. How would I go about writing out this script?
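A minimal sketch of one common approach: move every regular file in the current directory into a folder named after its extension (files with no extension go to noext/).
for f in *; do
    [[ -f $f ]] || continue
    ext=${f##*.}
    [[ $f == "$ext" || -z $ext ]] && ext=noext     # no dot in the name, or a trailing dot
    mkdir -p "$ext"
    mv -- "$f" "$ext/"
done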
https://redd.it/1bw15pq
@r_bash
New to coding
Sorry, this may be the stupidest question any of you have read. But is it possible to write a bash script that will run on a Windows OS?
https://redd.it/1bv7nq7
@r_bash
Make the list with size, path and filename
Hello,
find . -type f will show us all files under the current directory; how can I simply split the path name and filename, and add the file size for each line of output?
or
du -a . will show all files and directories under the current one; how can I hide the directories and split the path names and file names on each line?
thanks
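One hedged option, assuming GNU find: -printf can emit the size in bytes, the directory and the filename as separate tab-separated fields.
find . -type f -printf '%s\t%h\t%f\n'
find . -type f -exec du -h {} +     # human-readable sizes instead, path left unsplit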
https://redd.it/1c1pi0i
@r_bash
Quickly find the largest files and folders in a directory + subs
This function will quickly return the largest files and folders in the directory and its subdirectories with the full path to each folder and file.
You just pass the number of results you want returned to the function.
You can get the function on GitHub [here](https://github.com/slyfox1186/script-repo/blob/main/Bash/Misc/Functions/big-files.sh).
To return 5 results for files and folders execute:
big_files 5
I saved this in my `.bash_functions` file and love using it to find stuff that is hogging space.
Cheers!
https://redd.it/1c1ioa5
@r_bash
An app for finding locations of text within large directories
I made an app called FindIt that makes it easier to find all occurrences of text within large directories and files.
I mainly made this because it's difficult to find where symbols are defined when decompiling .Net applications, but I'm sure it could be used outside of decompiling apps.
More information can be found in the GitHub repo.
https://github.com/ScripturaOpus/FindIt
https://redd.it/1c113uy
@r_bash
What is the utility of read in the following script, and why do we put genes.txt at the end of the loop?
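The post's script is not shown here, but the question matches the common pattern below: read pulls one line per iteration into the variable, and the redirection at "done < genes.txt" feeds the whole loop from that file instead of from the terminal.
while IFS= read -r gene; do
    echo "processing $gene"
done < genes.txt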
https://redd.it/1c0ngvz
@r_bash
trap '... ' ERR not working
I use that:
trap 'echo "ERROR: A command has failed. Exiting the script. Line was ($0:$LINENO): $(sed -n "${LINENO}p" "$0")"; exit 3' ERR
set -euo pipefail
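A likely culprit (hedged): without errtrace, an ERR trap is not inherited by shell functions, command substitutions or subshells, so failures inside them never trigger it. Adding -E usually fixes that:
set -Eeuo pipefail     # -E is the same as -o errtrace
trap 'echo "ERROR ($0:$LINENO): $(sed -n "${LINENO}p" "$0")"; exit 3' ERR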
How to extract a single string (containing a *) from a longer string
On KDE/Wayland. In a terminal I run:
kscreen-doctor --outputs
I get a long output:
Modes: 0:2560x1440@60! 1:2560x1440@170* 2:2560x1440@165 ...
I want to process the output so I just get the result shown in bold (the active setting, which is marked with a *). I think I should use grep, but I'm not sure how to get it to select just the part of the output with the *. I searched for using grep to recognize a literal "*", but extracting just this bit of the output is the part I can't figure out how to approach. Any help appreciated.
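A hedged one-liner: escape the * so grep treats it literally, and use -o so only the matching token (the active mode) is printed.
kscreen-doctor --outputs | grep -o '[^ ]*\*'     # prints e.g. 1:2560x1440@170*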
https://redd.it/1c041ed
@r_bash
Help with curl script
Hey guys, I have a script to list a site's directories. The script is from a course, and when the instructor runs it in the video class it works fine, but when I try to run it on my PC it doesn't work. The test.txt file has a directory I know exists.
#!/bin/bash
for dir in $(cat test.txt); do
httpCode=$(curl -s -H "User-Agent: Teste" -o /dev/null -w "%{http_code}\n" $1/$dir/)
if [[ $httpCode == "200" ]]; then
echo "Directory found: $1/$dir"
fi
done
The var httpCode never gets 200, but when I run curl line in terminal, works fine. Can someone give me a hand here?
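A hedged guess: if test.txt was written on Windows, every line ends in a carriage return, so the request becomes $1/dir<CR>/ and never returns 200 even though the same URL typed by hand works. Stripping it (and quoting the URL) often fixes this:
while IFS= read -r dir; do
    dir=${dir%$'\r'}     # drop a trailing carriage return, if any
    httpCode=$(curl -s -H "User-Agent: Teste" -o /dev/null -w "%{http_code}" "$1/$dir/")
    [[ $httpCode == "200" ]] && echo "Directory found: $1/$dir"
done < test.txt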
https://redd.it/1bzhilw
@r_bash
How can I improve this recursive script?
**tl:dr; Need to handle git submodule recursion, so I wrote this. How can it be better?**
if [ -f ".gitmodules" ]; then # continue...
submodules=($(grep -oP '"\K[^"\047]+(?=["\047])' .gitmodules))
if [ "$1" == "--TAIL" ]; then
"${@:2}" # execute!
fi
for sm in "${submodules[@]}"; do
pushd "$sm" > /dev/null
if [ "$1" == "--TAIL" ] && [ ! -f ".gitmodules" ]; then
"${@:2}" # execute!
fi
if [ -f ".gitmodules" ]; then # recurse!
cp ../"${BASH_SOURCE[0]}" .
source "${BASH_SOURCE[0]}"
rm "${BASH_SOURCE[0]}"
fi
if [ "$1" == "--HEAD" ] && [ ! -f ".gitmodules" ]; then
"${@:2}" # execute!
fi
popd > /dev/null
done
if [ "$1" == "--HEAD" ]; then
"${@:2}" # execute!
fi
fi
The main reason for this is the coupled use of git submodules and Maven, where we have one overall project aka the ROOT and it has NESTED submodules, while each submodule is a different Maven module in the pom.xml file. This means we need to commit from the deepest edges before their parent modules, while checking out should be done in the inverse order.
*Yes, I know the 'submodule foreach' mechanism exists*, but it seems to use tail-recursion, which does not work for what I'm trying to do, though admittedly in a lot of cases it is sufficient.
**If anyone can offer up a better way than the script copying/removing itself, I'd be ecstatic!**
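A hedged alternative to the copy/source/remove trick: a plain recursive function, so nothing has to be copied into each submodule. --TAIL runs the command before descending and --HEAD after, roughly mirroring the intent of the script above; the submodule paths are read from .gitmodules via git config (paths with spaces would need extra care).
walk_submodules() {
    local order=$1; shift
    if [[ $order == --TAIL ]]; then "$@"; fi
    if [[ -f .gitmodules ]]; then
        local sm
        while IFS= read -r sm; do
            pushd "$sm" > /dev/null
            walk_submodules "$order" "$@"
            popd > /dev/null
        done < <(git config --file .gitmodules --get-regexp '\.path$' | awk '{print $2}')
    fi
    if [[ $order == --HEAD ]]; then "$@"; fi
}
# usage: walk_submodules --HEAD git commit -am "bottom-up commit"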
https://redd.it/1bz6lb9
@r_bash
why shall IFS be set to "\n\b" and not just "\n" to work with space containing filenames ?
Hi all,
I don't get why simply setting IFS to "\n" doesn't work.
Let's say I have 3 files in the current directory named "big banana", "huge apple" and "small orange":
$ IFS=$( echo -ne "\n" )
for i in $(ls *);
do echo "file: $i";
done
It doesn't work, the output is :
file: big banana
huge apple
small orange
instead of
file: big banana
file: huge apple
file: small orange
As a matter of fact it works if IFS is set the following way:
$ IFS=$( echo -ne "\n\b" )
Does someone know why \b is also needed in the IFS definition?
Thanks for your help
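Why this happens (hedged, but it checks out in practice): command substitution strips trailing newlines, so IFS=$( echo -ne "\n" ) sets IFS to an empty string and no word splitting happens at all, which is why the whole ls output comes back as one word. The \b only serves to stop the newline from being "trailing", so it survives the stripping. A cleaner way to put a literal newline into IFS, plus globbing instead of parsing ls:
IFS=$'\n'
for i in *; do
    echo "file: $i"
done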
https://redd.it/1byha3q
@r_bash
A script to rename folders
Hi! I have posted this: https://www.reddit.com/r/bash/comments/1bu34ld/a_script_to_automatically_rename_music_folders/?utm_source=share&utm_medium=web3x&utm_name=web3xcss&utm_term=1&utm_content=share_button
I want to rename all of the albums's folders of my music library like : Music/Artist/Album (YYYY)/ --> Music/Artist/YYYY_Album/
'YYYY' is the release year of the album.
I have now the following script :
#!/bin/bash
for dir in */*/; do
    [[ $dir =~ (.*)\ \(([[:digit:]]{4})\)/$ ]] &&
        echo "${BASH_REMATCH[0]}" "${BASH_REMATCH[2]}_${BASH_REMATCH[1]% }/"
done
But it renames from Music/Artist/Album (YYYY)/ to Music/YYYY_Artist/Album/.
What can I change to get the folders named like : Music/Artist/YYYY_Album/, i.e. YYYY_ to be set before the album's name and not before the artist's?
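A hedged sketch of the fix: capture the artist and album parts separately so the year is prefixed to the album directory only (echo mv is a dry run; drop the echo once the output looks right).
for dir in */*/; do
    if [[ $dir =~ ^(.*)/(.*)\ \(([[:digit:]]{4})\)/$ ]]; then
        echo mv -- "$dir" "${BASH_REMATCH[1]}/${BASH_REMATCH[3]}_${BASH_REMATCH[2]}"
    fi
done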
https://redd.it/1by4w09
@r_bash
A useful yet simple script to search simultaneously on multiple Search Engines.
I was too lazy to create this script till today, but now that I have, I am sharing it with you.
I often have to search for groceries & electronics on different sites to compare where I can get the best deal, so I created this script which can search for a keyword on multiple websites.
# please give the script permissions to run before you try and run it by doing
$ chmod 700 scriptname
#!/bin/bash
# Check if an argument is provided
if [ $# -eq 0 ]; then
echo "Usage: $0 <keyword>"
exit 1
fi
keyword="$1"
firefox -new-tab "https://www.google.com/search?q=$keyword"
firefox -new-tab "https://www.bing.com/search?q=$keyword"
firefox -new-tab "https://duckduckgo.com/$keyword"
# a good way of finding where you should place the $keyword variable is to type some random word like "haha" into the website you want to create the above syntax for, run the search, and then replace the "haha" part of the resulting URL with $keyword
This script will search for a keyword on Google, Bing and DuckDuckGo. You can play around and create similar scripts with custom websites; plus, if you add a shortcut to the Menu on Linux, you can easily search from the menu bar itself. So yeah, it can be pretty useful!
Step 1: Save the bash script.
Step 2: Give the script execution permissions by doing chmod 700 script_name in the terminal.
Step 3: Open the terminal and run ./scriptname "keyword" (you must enclose the search query in "" if it is more than one word).
After doing this, Firefox should have opened multiple tabs with the search engines all searching for the same keyword.
Now, if you want to search from the menu bar, here's a pictorial tutorial for that
Could not post videos, here's the full version: https://imgur.com/a/bfFIvSR
https://preview.redd.it/fbw7y9u4tusc1.png?width=717&format=png&auto=webp&s=bbc5b252419683f1ecf333fffbd389d9edfd16cd
https://preview.redd.it/my994k3ktusc1.png?width=714&format=png&auto=webp&s=9e46fa2c059d56351edf965a7f159edf35cdee88
Copy this; !s is basically a unique identifier which tells the computer that you want to search. The syntax for a search would be: !s[whitespace]keyword
https://preview.redd.it/j872qczktusc1.png?width=714&format=png&auto=webp&s=bce94396e4e03c9327de124eedf121b6c554628b
If your search query exceeds one word, use the syntax: !s[whitespace]"keywords"
https://preview.redd.it/j294497mtusc1.png?width=1667&format=png&auto=webp&s=a00d4340b7ad958fbdf577367170c07fcd36248f
https://redd.it/1bxamwp
@r_bash
Where Can I Find Well-Crafted Code?
I understand the importance of learning from bash scripts written by others, but I want to avoid picking up bad habits or inefficient techniques.
Could anyone recommend a website or resource where I can find high-quality, real-world bash scripts—not just examples?
Thanks for your help!
https://redd.it/1bx8mci
@r_bash
blog Journey of disabling filename expansion for Bash alias
I wanted to disable Bash pathname expansion for my Bash alias:
alias g='git'
This ended up being non-trivial, as there was a lot of different trouble with pipes/stdin. In the end I managed to find a working solution and wrote a blog post about the steps leading to it:
https://miropalmu.github.io/homepage/bash_noglob_for_alias.html
TL;DR:
alias g='pstash galias; set -o noglob; consume_noglob_and_pstash galias git'
where the definitions of the Bash functions pstash and consume_noglob_and_pstash can be found at the end of the blog post.
https://redd.it/1bvs0rd
@r_bash
Conditional pipe? or a command that can conditionally pipe data
Got this command in .xinitrc:
xinput list | grep -oP '(AT Translated Set 2 keyboard|DualPoint Stick)\s+id=\K\d+' | xargs -n1 xinput disable
I sometimes switch between computers, and on one of them the keyboard is faulty, so I tend to run this to disable it; but it also runs on the non-faulty computer.
I was wondering if there is a way (besides storing in a variable and then checking) to run xinput disable on the condition that stdin has two lines (one line matching the keyboard's ID and one matching the DualPoint's ID)?
There probably is a simpler way, like checking before starting to pipe, but this is a pattern I run into quite often and would love to know if there is a way to solve it.
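A hedged sketch: buffer the IDs in awk and only release them when exactly two lines arrived, with xargs -r making sure xinput is not run at all on empty input.
xinput list \
  | grep -oP '(AT Translated Set 2 keyboard|DualPoint Stick)\s+id=\K\d+' \
  | awk '{ids[NR]=$0} END{if (NR==2) for (i=1;i<=NR;i++) print ids[i]}' \
  | xargs -rn1 xinput disable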
https://redd.it/1bus1fw
@r_bash