The command works in the Linux Mint terminal, but my syntax is wrong for running it from a starter (launcher) under Linux Mint.
The following one works in the terminal:
gsettings reset org.x.editor.state.history-entry history-replace-with
I tried the following in the terminal (later I will use it in the starter), but got the following error message:
bash -c 'gsettings reset org.x.editor.state.history-entry history-replace-with; -c'
bash: -c: Command not found.
Any idea?
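A likely fix (a sketch, not from the thread): the trailing "; -c" sits inside the single quotes, so after the reset runs, bash tries to execute -c as a command, hence "-c: Command not found". Dropping it should be enough:
bash -c 'gsettings reset org.x.editor.state.history-entry history-replace-with'
For a starter entry, the gsettings command alone, without the bash -c wrapper, should also work.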
https://redd.it/11pjnew
@r_bash
Why does this loop exit early?
I have a text file containing a list of file names, one per line, that I want to download from a remote host (a seedbox hosted with feralhosting). The text file contains only partial file names, so I need to find the file on the remote host first. e.g., the text file might have "Miami Connection" and on the remote host it's "Miami Connection (1987).mkv".
Initially I was just doing this: while read i ; do f=$(ssh myhost "ls -1 ~/files/ | grep \"$i\"") ; scp myhost:~/files/"$f" . ; done <file_list
This would download 1-3 files, then exit (rather than iterate over the full text file as I expected). I'd delete the lines that were downloaded from the list and restart. It would grab a few more files, then exit again... The downloads always complete, and it exits after a seemingly random amount of execution time. Nothing appears to be killing it. The job always exits as if it reached the end of the file, but it should be reading more lines.
I'm trying to figure out why it's exiting. I've expanded it into a small script with some diagnostic output and have gotten it down to this (no file transfer so it runs very quickly):
#!/bin/bash
set -x
while read i ; do
    unset f
    echo "==$i=="
    f=$(ssh myhost "ls ~/files/ | grep \"$i\" | head -1")
    if [ -n "$f" ] ; then
        echo "found $f"
    else
        echo "couldn't find $i"
    fi
done <test
If I comment out the ssh line, it'll iterate over the entire file. If I leave the ssh line, it always stops early. To rule out any weirdness in the text file, I created a new one, making sure it's just plain text: printf "not a file\nmkv\nalso not a file\nnoperino" > test
With the test file it always stops after the first line. The "mkv" line is the only one that should match anything on the remote host. It doesn't matter where I put that in the text file -- the script always stops after line one. Again if I comment out the ssh line, it goes through the whole text file. The output is like:
+ read i
+ unset f
+ echo '==not a file=='
==not a file==
++ ssh myhost 'ls ~/files/ | grep "not a file" | head -1'
+ f=
+ [ -n '' ]
+ echo 'couldn'\''t find not a file'
couldn't find not a file
+ read i
Can anyone explain what I'm doing wrong here/why it won't read the entire file? I'm not really looking for better/alternate ways of doing this. Just trying to understand what's happening here.
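The usual culprit for this pattern (a hedged note, but it matches the symptoms): ssh inherits the loop's stdin, so each ssh call reads and discards the rest of the file list, and the next read finds nothing. Giving ssh -n (stdin from /dev/null) stops it from eating the list:
while read -r i ; do
    f=$(ssh -n myhost "ls ~/files/ | grep \"$i\" | head -1")
    echo "${f:-couldn't find $i}"
done <test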
https://redd.it/11p4cbr
@r_bash
I'm using the find command to reorganize the mp3 files in a directory but it only halfway works??
I have this script:
#!/bin/bash
cd /home/$USER/Music
log="/home/$USER/Documents/logs/spotify-dl.log"
output_dir="/home/$USER/Music/downloads"
echo $(date) >> "$log"
# Discover Weekly
npx spotifydl --download-report --output "$output_dir" "link to a spotify playlist" >> "$log"
# Release Radar
npx spotifydl --download-report --output "$output_dir" "link to a spotify playlist" >> "$log"
find downloads/ -name *.mp3 -exec mv '{}' downloads/ \; && find downloads/ -type d -not -wholename 'downloads/' -exec rm -rf '{}' \;
echo >> "$log"
The downloads come out nested like:
./downloads/$ARTIST/$ALBUM/$SONG.mp3
and I'm trying to flatten them to:
./downloads/$SONG.mp3
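One likely culprit in the find line (a sketch, not a verified diagnosis): the unquoted *.mp3 is expanded by the shell before find ever runs, so find gets whatever .mp3 names happen to sit in the current directory instead of the pattern. Quoting the glob, and only moving files that aren't already at the top level, behaves more predictably:
find downloads/ -mindepth 2 -name '*.mp3' -exec mv -- '{}' downloads/ \;
find downloads/ -mindepth 1 -depth -type d -exec rm -rf -- '{}' \;
(-depth removes the deepest directories first, so find never descends into a directory it has already deleted. Note that identical $SONG.mp3 names from different albums would still clobber each other.)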
Is there any sed linter to quickly detect script errors?
It's not helpful, in relatively long sed scripts, to get errors that only tell you which line (or character) number the error is on. I want something like shellcheck, but for sed.
P. S. Maybe this question is invalid, and I should just rewrite the code without long embedded sed scripts.
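Not a linter, but possibly close enough (a hedged pointer): GNU sed 4.6+ has a --debug flag that prints the parsed program and annotates execution, which can narrow down where a long script goes wrong:
sed --debug -f long-script.sed input.txt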
https://redd.it/11oghg1
@r_bash
Question: Bash process substitution with vim
Hi all,
I have a question about whether it's possible to get an interactive vim from inside a process substitution.
The reason I ask is because I had a seemingly simple idea, but unfortunately it simply doesn't work.
Example, how I'd expect it to work:
zypper pa -i | grep -E -f <(echo bash | fzf)
When you run fzf, it takes over control of your terminal and you can type in your search terms to narrow down matching lines. So I tried the same idea with vim:
grep -F -x -f <(tmp="$(mktemp)"; vim "$tmp" && cat "$tmp") .bashrc
The idea: once I save and quit with :wq, the file contents would be taken as input for the matching. But vim never takes over the terminal; the command either hangs until I Ctrl-C out of it, or gives the error message:
Vim: Warning: Output is not to a terminal
Vim: Warning: Input is not from a terminal
What does fzf do differently to get the fullscreen/interactive priority in the same terminal, that vim can't?
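A workaround sketch (assuming the goal is simply an interactive vim inside the substitution): the command list in <(...) has its stdout wired to the pipe that grep reads, which is exactly what vim is warning about; fzf works because it draws its UI directly on /dev/tty rather than on stdout. Pointing vim at the terminal explicitly does the same:
grep -F -x -f <(tmp="$(mktemp)"; vim "$tmp" </dev/tty >/dev/tty && cat "$tmp") .bashrc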
Please help me with this noob multiline cmd argument question
(apologies for not crossposting it properly from r/javahelp, it doesn't let me do that)
I would like to do
cat << EOF | java -jar my.jar
> some stuff
> some more stuff
> EOF
and access the whole thing as the first element of args. But the array is empty.
If however I just do
java -jar my.jar 7
the first element is actually 7. I desperately need to make this work with files or as written in the first example. Please help...
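A note with a sketch: a here-document feeds the program's standard input, not its argument list, so args is empty by design; the Java side would have to read System.in to see it. To keep the Java code unchanged and land the text in args[0], capture it with command substitution instead:
java -jar my.jar "$(cat <<'EOF'
some stuff
some more stuff
EOF
)"
For an existing file, java -jar my.jar "$(< somefile)" does the same thing.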
https://redd.it/11nw7v4
@r_bash
finding duplicate files excluding metadata
I am interested in a script/utility that will BULK scan all directories recursively and, if the file is compatible with ffmpeg, create a SHA checksum of the data EXCLUDING metadata, writing it to a file for later sorting by checksum and removing all unique rows.
It is easy for ID3 tags/flac tags/video tags to change without the underlying file changing. I'd like to be able to detect duplicates where the underlying data is the same but the metadata is different.
It would be great if it also supported JPG EXIF data using exiftag or something similar
Has anyone seen a script in gists/GitHub or similar?
Cheers
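A sketch of the core trick (assuming ffmpeg's hash muxer plus GNU find/sort/uniq; not a full utility): remuxing with -c copy into the hash muxer checksums only the stream data, so tags and container metadata don't affect the result.
find . -type f \( -name '*.mp3' -o -name '*.flac' -o -name '*.mkv' \) -print0 |
while IFS= read -r -d '' f; do
    # -nostdin keeps ffmpeg from eating the find output
    sum=$(ffmpeg -nostdin -v error -i "$f" -map 0 -c copy -f hash -hash sha256 - 2>/dev/null) &&
        printf '%s  %s\n' "$sum" "$f"
done | sort > checksums.txt
# print only rows whose hash repeats ("SHA256=" plus 64 hex chars = 71 chars)
uniq -w 71 -D checksums.txt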
https://redd.it/11nngho
@r_bash
How to block saving sensitive info to history?
I have some regular tasks that involve copy/paste sensitive strings (passwords, etc) to my terminal to encode/decode them (sha256sum, base64, etc). In the process, these sensitive strings are being saved to my bash_history in cleartext, which I would like to avoid!
I can disable my history, that's easy, but I would like to be able to keep this feature.
I already have HISTCONTROL=ignoreboth set, which among other things prevents any command preceded by whitespace from being written to history, which is great for my ad-hoc needs.
Is there any similar option that would allow me to prevent, say, any line beginning with 'echo' from being saved to history? Any hook where I can toss in a regex to determine what does and does not get saved to history?
I could script something to manage my history file, but as I am typically working with my homedir on a NAS with background snapshotting, I would rather the string not get written in the first place.
Certainly not a backbreaking issue, but just seeing if I can squeeze another half a percent of efficiency out of my workflow and take care of an odd but major security issue with how I am working today.
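Bash has exactly this knob (a note): HISTIGNORE is a colon-separated list of glob patterns matched against the whole command line; matching lines are never saved to history. A sketch, with the patterns as assumptions about the workflow:
HISTIGNORE='echo *:*sha256sum*:*base64*'
It works alongside HISTCONTROL=ignoreboth, and since the line is dropped before it is saved, nothing ever reaches the history file on the NAS.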
https://redd.it/11n3tvd
@r_bash
Get string field using only bash substitution?
string="Archwiki 📘 link https://wiki.archlinux.org/index.php?search= care"
Using only bash substitution (meaning no awk, sed, cut, etc), how do I get only the link field $3 "https://wiki.archlinux.org/index.php?search=" ?
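Two sketches using only builtins (note the emoji counts as its own field, so the URL is the fourth word):
# split into an array and index it (0-based)
read -r -a parts <<< "$string"
printf '%s\n' "${parts[3]}"
# or pure parameter expansion, assuming the URL is the only field starting with http
tmp="${string#*http}"
printf '%s\n' "http${tmp%% *}"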
https://redd.it/11mte4d
@r_bash
File Test Fails – Issue With Quotation Marks
if ! [ -e "${ISBN} - Book.pdf" ]; then
Gets interpolated to:
if ! [ -e 9780367199692 - Book.pdf ]; then
Condition always resolves to file not found, because the space in the filename breaks the path....
I know this is basic, but I can't figure out how to write shell that will result in the filename quoted:
if ! [ -e "9780367199692 - Book.pdf "]; then
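A note rather than a fix: the first form is already correct. The shell removes the quotes only after they have prevented word splitting, so [ receives the single operand 9780367199692 - Book.pdf; the unquoted-looking interpolation is just how the trace displays it. If the test still fails, the usual suspect is an invisible character in ISBN, e.g. a carriage return from a CRLF file:
ISBN=${ISBN%$'\r'}    # strip a trailing CR, if any
if ! [ -e "${ISBN} - Book.pdf" ]; then
    echo "not found"
fi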
https://redd.it/11ma0ig
@r_bash
remove all but some entries from file
hello,
I have this file and I want to delete everything in it except the "DS *", "RS *", "SA *", "RX *" entries. How do I do that?
I could use grep, sed or jq.
The contents of the file looks for example like this:
"DS109j","DS508","DS408","RS408","RS408RP","DS108j","CS407","CS407e","RS407","DS207","DS207+","DS107","DS107+","DS107e","CS-406","CS-406e","RS-406","DS-106","DS-106e","DS-106j","USB Station","DS-101","DS-101g+","DS-101j","2.5\" Disk Tray (D1)","2.5\" Disk Tray (R1)","2.5\" Disk Tray (R2)","2.5\" Disk Tray (R3)","2.5\" Disk Tray (R4)","2.5\" Drive Tray (R5)","6G eSATA Cable","Adapter 100W_1","Adapter 100W_2","Adapter 10W\/11W_1_EU","Adapter 10W\/11W_1_UK","Adapter 120W_1","Adapter 24W Set","Adapter 24W_1_US","Adapter 30W Set","Adapter 36W Set","Adapter 36W_1","Adapter 42W Set","Adapter 42W_1_AU","Adapter 42W_1_EU","Adapter 42W_1_UK","Adapter 42W_1_US","Adapter 48W\/50W_1","Adapter 60W_1","Adapter 65W\/72W_1","Adapter 65W_2","Adapter 72W_2","Adapter 90W_1","CPU Cooler 40*40*10","CPU Cooler 92*92*25","CPU FAN 40*40*10_1","Cable Infiniband","Cable MiniSASHD_EXT_1","Cable MiniSASHD_EXT_2"
Thanks!
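A grep sketch (assuming "DS *" etc. mean entries beginning with those letters; output is one entry per line rather than comma-joined):
grep -Eo '"(DS|RS|SA|RX)[^"]*"' file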
https://redd.it/11lrnzz
@r_bash
Creating a bash script to match multiline patterns in log files
Hi,
I'm trying to automate some time consuming tasks/log checking, building a system that I will replicate to other uses.
I have a logfile for example:
...multiline ACTION Text where all is good...
ERR-101 Something is wrong
ERR-201 Something is wrong with QASDASDASD
INFO-524 Something was wrong
WARN-484 Check line 23
...multiline ACTION Text where all is good...
ERR-101 Something is wrong
ERR-201 Something is wrong with PPOYOYOY
INFO-524 Something was wrong
WARN-484 Check line 23
INFO-524 This is it
I'm creating a check-error.template file:
# This is the template file
ERR-101 Something is wrong
ERR-201 Something is wrong with <TEXT_VAR>
INFO-524 Something was wrong
WARN-484 Check line <NUMBER_VAR>
<?>INFO-524 This is it</?>
Lines starting with # are comments, and parts surrounded by <?>...</?> are optional (e.g. the last line may or may not appear).
Text and number will be regexp-checked.
If the error matches the template, I know it's ignorable.
I'm not using anything advanced (perl, other regexp helpers), as it would be an issue to make sure it exists in every environment.
The following function takes a file and converts the template to a regexp pattern:
function template2variable {
    local file=$1
    local var_name=$2
    local template=$(sed '/^#/d' "$file")
    local pattern="${template//\\/\\\\}" # replace \ with \\
    pattern="${pattern//\"/\\\"}" # escape "
    pattern="${pattern//<TEXT_VAR>/([[:alnum:]_]+)}"
    pattern="${pattern//<NUMBER_VAR>/([[:digit:]]+)}"
    pattern="${pattern//$'\n'/\\n}"
    pattern="${pattern//<?>/(}"
    pattern="${pattern//<\/?>/)?}"
    printf -v "$var_name" '%s' "$pattern"
}
template2variable "check-error.template" error_template
Matching template with:
grep -Pzo "${error_template}" $logfile
Doing so, I get back all the template lines I wished.
However, when trying to work with the grep output:
- using -n lists every match as line 1
- using -c I get a match count of 1
- using -v results in an empty output
It seems like the match has been returned as one giant result instead of several results I can iterate over.
What am I doing wrong?
Suggestions for improvement?
Thank you
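What is likely happening (a note): -z makes grep treat NUL as the line delimiter, so the whole file becomes a single "line": hence a count of 1, every match "on" line 1, and an empty -v. The matches themselves are still NUL-separated on output, so they can be iterated over:
grep -Pzo "${error_template}" "$logfile" |
while IFS= read -r -d '' match; do
    printf 'ignorable block:\n%s\n' "$match"
done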
https://redd.it/11lepmr
@r_bash
Trying to use inotify + cp to move file upon creation.. cp can't reach source
Script:
#!/bin/sh
source='/var/lib/awx/projects/_*/files/'
dest='/var/lib/awx/proj_dep_files'
inotifywait -m --event create,modify,delete /var/lib/awx/projects/_*/files/ --format %w%f | while read path file; do
cp -f $path /var/lib/awx/proj_dep_files/
done
Error:
Setting up watches.
Watches established.
cp: cannot stat '/var/lib/awx/projects/_33__aap_repo/files/RPM-GPG-Key-cisco-amp': No such file or directory
cp: cannot stat '/var/lib/awx/projects/_33__aap_repo/files/debsig_policy': No such file or directory
cp: cannot stat '/var/lib/awx/projects/_33__aap_repo/files/sftd.yaml.j2': No such file or directory
cp: cannot stat '/var/lib/awx/projects/_33__aap_repo/files/snmpd.conf': No such file or directory
Now, source and dest have been verified a few times; perms on source and dest are 777. I did try just an 'echo $path $file' earlier, and the value of $path is the full /dir/path/to/file.name
And I can run the inotify and cp parts manually and they work so I'm totally struggle bussing it on this.
EDIT: I realize the source and dest variables aren't doing anything; with them I get the same error. This was one iteration of testing.
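A likely explanation with a sketch: the watch includes delete events, and a delete hands the loop a path that no longer exists, which is exactly what "cannot stat" says (awx project syncs that replace files would fire delete/create bursts). Also, --format %w%f prints a single token, so read should take one variable. A version that reacts only to events that leave a readable file (moved_to is an addition, for tools that write via rename):
inotifywait -m -e create,modify,moved_to --format '%w%f' /var/lib/awx/projects/_*/files/ |
while IFS= read -r path; do
    [ -f "$path" ] && cp -f -- "$path" /var/lib/awx/proj_dep_files/
done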
https://redd.it/11kgfau
@r_bash
bash issue with ip command
hi guys, I am trying to write a bash script, which I will later assign to an alias, to more easily output the eth0 IP and the tun0 IP (for OpenVPN).
The code looks good to me and works when both eth0 and tun0 are enabled, but doesn't when tun0 is disabled:
it outputs something I didn't ask for, when it should just change the eth0 output to "disabled" instead of being blank.
Ideas and help would be nice, thanks y'all
https://preview.redd.it/ut9x13eed3ma1.png?width=821&format=png&auto=webp&v=enabled&s=9273f918bb620f17010feaf7b72130ae140cd4b5
https://preview.redd.it/dkmyu4ofd3ma1.png?width=457&format=png&auto=webp&v=enabled&s=992ef304e62f96e904be2aca73714b1d455eafbd
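Since the script itself lives only in the screenshots, here is a hedged sketch of the usual shape for this task: print an interface's IPv4 address, or "disabled" when the interface is down or absent.
ip4() {
    local addr
    addr=$(ip -4 -o addr show dev "$1" 2>/dev/null | awk '{print $4}')
    echo "$1: ${addr:-disabled}"
}
ip4 eth0
ip4 tun0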
https://redd.it/11junh7
@r_bash
Replace multiple lines with multiple lines?
Given the following input file:
text
text
pattern2
pattern3
text
text
pattern1
pattern2
pattern3
pattern2
pattern3
pattern2
pattern3
text
text
I need to search for "pattern1" and then I need to replace the first occurrence of "pattern2" and "pattern3" with "newpattern2" and "newpattern3" respectively and delete all other occurrences of "pattern2" and "pattern3". The end result I'm after is:
text
text
pattern2
pattern3
text
text
pattern1
newpattern2
newpattern3
text
text
At first I thought I could do this with sed by selecting a range of lines starting with "pattern1" and ending with "pattern3" and then using the c command to replace the whole range, but range selection in sed is non-greedy, so that won't work. However, I could still use sed to do the replacement, but then I'd need to figure out another way to delete all other occurrences of "pattern2" and "pattern3" that come after "pattern1".
This is running in a busybox environment so no access to perl or most other fancy tools. Basically sed, awk, and grep are the only file/string manipulation tools I can think of that busybox supports.
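An awk sketch that stays within busybox awk (assuming the patterns are anchored well enough not to match the plain text lines): once pattern1 is seen, print the first pattern2/pattern3 renamed and swallow every later occurrence.
awk '
/pattern1/ { seen = 1; print; next }
seen && /pattern2/ { if (!done2++) print "newpattern2"; next }
seen && /pattern3/ { if (!done3++) print "newpattern3"; next }
{ print }
' input.txt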
https://redd.it/11dl5y0
@r_bash
bash-annotations: A bash framework for creating custom injection and function hook style annotations
Source code: https://github.com/david-luison-starkey/bash-annotations
Showcase project: https://github.com/david-luison-starkey/bash-annotations-toolbox
https://redd.it/11p6cs4
@r_bash
What exactly is the difference between an interactive and non-interactive shell? (direct execution vs through ssh)
I was trying to get a script running on several instances using an ssh loop.
Funnily, some binaries won't run when executed remotely (ssh myuser@server "binary"), but they do when you reference their whole path. This bothers me because the binary's directory is in $PATH (whether executed remotely or directly).
The OS/Version/user/... are all the same on all instances.
Can someone explain why this is happening? I guess it has something to do with interactive/non-interactive shells? What exactly separates the two? How are user rights and profiles managed in these scenarios?
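A hedged note: "ssh host cmd" runs a non-interactive, non-login shell, and most distros' ~/.bashrc files return early for non-interactive shells (while profile files are only read by login shells), so PATH additions made there never happen for the remote command, even though PATH "contains" the directory when you check from an interactive login. Comparing the two directly usually shows it:
ssh myuser@server 'echo "$PATH"'   # what the remote command actually sees
# vs. logging in interactively on the server and running: echo "$PATH"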
https://redd.it/11osjrn
@r_bash
Globals or not globals?
Recently, I've implemented my first parser in Bash. It works, but the problem is how slowly it runs on thousands of files. One of the issues is that almost all functions accept some page as input and produce some output, which means they reparse the same page too many times. I don't currently use global variables to cache parsed results for later use. Speed is not a big issue when using it on a small number of files.
The question is: should I use globals or not? I am asking your opinions on that. I feel that with globals the script becomes more unsafe, as they can be accidentally modified. But on the other hand, they can improve performance, since I can retrieve cached info from globals instead of reparsing pages. I also wonder whether storing info in global variables improves script maintainability.
The initial issue is that Bash can't structure data very well. I mean, it doesn't have anything like structs or classes, even primitive ones (without accessibility modifiers, just to group some info). I am in doubt whether I've chosen the right language for my parser.
P. S. I had an idea to store parsed results as JSON/YAML-formatted strings inside a script and retrieve data via jq/yq, but as I figured out yq may slow down scripts. So that's why I use just sed mostly to do parsing.
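A sketch of the caching idea without going all-in on globals: a single associative array (bash 4+), named loudly so accidental writes are easy to spot, keyed by page path.
declare -A __PAGE_CACHE
get_parsed() {
    local page=$1
    if [[ -z ${__PAGE_CACHE[$page]+set} ]]; then
        # the sed program here is a hypothetical stand-in for the real parse
        __PAGE_CACHE[$page]=$(sed -n 's/^title: //p' "$page")
    fi
    printf '%s\n' "${__PAGE_CACHE[$page]}"
}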
https://redd.it/11oj8ky
@r_bash
How I use Bash to automate tasks on Linux
https://www.codelivly.com/how-i-use-bash-to-automate-tasks-on-linux/
https://redd.it/11obrvz
@r_bash
What can you do with bash? Can you use bash scripts in accounting/finance?
A lot of people seem to use bash for things like networking/sysadmin work. I really have no idea whether I would be interested in that, however (if someone has some resources to help me see whether I would enjoy that kind of work, please feel free to share). I come from a business background, so I see a lot of menial things that seem like they could be easily automated. There's a lot we do in spreadsheets that I feel is just dirty work. Would bash scripts be the best way to combat this, or should I learn a different programming language such as Python or Java?
Also, if anyone could direct me to a good place to learn bash, that would be much appreciated. Thank you!
https://redd.it/11nxbwe
@r_bash
I can't figure out what they want me to do with this bash script.
My employer said I had to run this script as a docker entrypoint for a postgres docker container.
#!/bin/bash
set -e cat << 'EOF' >> /var/lib/postgresql/data/postgresql.conf # archive options used for backup
wal_level = replica
archive_mode = on
archive_command = 'DIR="/var/backups/$(date +%Y%m%d)-wal"; (test -d "$DIR" || mkdir -p "$DIR") && gzip < "%p" > "$DIR/%f.gz"'
archive_timeout = 60min
#restore_command = 'gunzip < /var/backups/recovered_wal/%f.gz > %p'
EOF psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" <<-EOSQL
CREATE USER luca WITH PASSWORD 'luca';
CREATE DATABASE luca;
GRANT ALL PRIVILEGES ON DATABASE luca TO luca;
EOSQL
I am pretty illiterate in bash, but just by looking at it I could tell it was a little bit weird.
Anyways, when running it as a docker entrypoint, the container immediately exits and the docker logs read the following error:
./PostgresScript.sh: line 12: warning: here-document at line 2 delimited by end-of-file (wanted `EOF')
I've personally never seen EOF used like that (I've seen it used mostly like the EOSQL as in the script above). I can't figure out what was the intention behind it or how to fix it.
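A hedged reading: the newlines were probably lost somewhere in transit, fusing "set -e" with the cat command and the closing "EOF" with the psql command, so the delimiter never appears alone on a line, which is exactly what the warning says. Restored, the intent would look like:
#!/bin/bash
set -e
cat << 'EOF' >> /var/lib/postgresql/data/postgresql.conf
# archive options used for backup
wal_level = replica
archive_mode = on
archive_command = 'DIR="/var/backups/$(date +%Y%m%d)-wal"; (test -d "$DIR" || mkdir -p "$DIR") && gzip < "%p" > "$DIR/%f.gz"'
archive_timeout = 60min
#restore_command = 'gunzip < /var/backups/recovered_wal/%f.gz > %p'
EOF
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" <<-EOSQL
CREATE USER luca WITH PASSWORD 'luca';
CREATE DATABASE luca;
GRANT ALL PRIVILEGES ON DATABASE luca TO luca;
EOSQL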
https://redd.it/11ntrki
@r_bash
Can you force bash to not throw a specific error?
I have a function that does something like the following
gg() {
    cleanupOnExit() {
        declare -p FDall 2>/dev/null && for fd in "${FDall[@]}"; do
            # if FDall has already been defined in the main script,
            # send each open fd it contains a NULL and then close it.
            [[ -e /proc/$$/fd/${fd} ]] && {
                printf '\0' >&${fd}
                exec {fd}>&-
            }
        done
        # <...do other cleanup...>
    }
    trap 'cleanupOnExit' EXIT
    local -a FDall
    exec {FDall[0]}>./.file0
    exec {FDall[1]}>./.file1
    # <...do stuff...>
}
When trying to define/source it, bash throws an error saying that {fd} is an ambiguous redirect. Now I get why bash is unhappy, since when cleanupOnExit is defined {fd} would, in fact, be an ambiguous redirect, but there are checks to ensure that bit of code will only ever run when {fd} exists and is an open file descriptor.
Is there a good way to force bash to just ignore this error and source the function anyway? Any other suggestions to make this work?
I did figure out one way to work around this, but it is terrible and I really don't want to use it. Basically you create a variable with the code to set up the exit trap, then the exit trap sources that variable. You can't just have the exit trap as-is though, since if the script exits before the file descriptors are defined in the main script, the exit trap (which does other stuff too) won't run. Instead, you have to do something like this:
gg() {
cleanupOnExitSrc="$(cat<<'EOF'
cleanupOnExitSrc0="$(cat<<EOI0
cleanupOnExit() {
$(declare -p FDall 2>/dev/null && {
cat<<'EOI1'
for fd in "${FDall[@]}"; do
# if FDall has already been defined in the main script,
# send each open fd it contains a NULL and then close it.
[[ -e /proc/$$/fd/${fd} ]] && {
printf '\0' >&${fd}
exec {fd}>&-
}
done
EOI1
} || echo ':')
}
EOI0
)"
EOF
)"
trap 'source <(echo "${cleanupOnExitSrc}") && cleanupOnExit' EXIT
local -a FDall
exec {FDall[0]}>./.file0
exec {FDall[1]}>./.file1
# <...do stuff...>
}
which, again, is terrible
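A less drastic workaround (a sketch): keep the troublesome syntax away from the parser by building the close at run time with eval, which only executes after the guard has confirmed fd holds an open descriptor:
[[ -e /proc/$$/fd/${fd} ]] && {
    printf '\0' >&"${fd}"
    eval "exec ${fd}>&-"
}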
https://redd.it/11n2h0w
@r_bash
Comment in the middle of a case statement
I sent my co-worker a shell script snippet and after I copied it to email, I threw in a comment.
I got an email back saying the comment broke the code. Is that possible?
case "$1" in
start)
do_something
#comment
;;
stop)
do_something_else
;;
*)
echo "start or stop"
;;
esac
Where's the rule for this? Can a comment go at the end of the line or after the ;;? Google didn't help.
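For the record: a comment may sit on its own line, at the end of a command, or after the ;;. None of those break a case statement. A hedged guess at what actually happened: the mail client re-wrapped the snippet so the comment and the ;; landed on one line, e.g.
do_something #comment ;;
Once the ;; is inside the comment, the case statement never closes and the whole script fails to parse.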
https://redd.it/11mujac
@r_bash
How to hack LD_LIBRARY_PATH to use a recent bash from a Debian sid chroot
I'm trying to get a more up-to-date version of `bash` than the one `LinuxMint` ships.
I have a `chroot` with `Debian Sid` on my box.
What I'm trying to do, in a `bash` wrapper script early in my `PATH`:
#!/bin/bash
LD_LIBRARY_PATH=/path/to/chroot/usr/lib/x86_64-linux-gnu:/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib /path/to/chroot/bin/bash "$@"
But I get:
/home/mevatlave/bin/bash: line 3: 1492488 Segmentation fault (core dumped) LD_LIBRARY_PATH=/path/to/chroot/usr/lib/x86_64-linux-gnu:/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib /path/to/chroot/bin/bash "$@"
From the chroot:
% ldd /bin/bash
linux-vdso.so.1 (0x00007fff237fc000)
libtinfo.so.6 => /lib/x86_64-linux-gnu/libtinfo.so.6 (0x00007f94de839000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f94de658000)
/lib64/ld-linux-x86-64.so.2 (0x00007f94de9af000)
Is it feasible?
With
LD_LIBRARY_PATH=/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib /path/to/chroot/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2 /path/to/chroot/bin/bash "$@"
I get
/lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.36' not found
With
LD_LIBRARY_PATH=/path/to/chroot/usr/lib/x86_64-linux-gnu:/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib /path/to/chroot/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2 /path/to/chroot/bin/bash "$@"
I get:
Segmentation fault (core dumped)
LD_LIBRARY_PATH=/path/to/chroot/usr/lib/x86_64-linux-gnu:/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib: /lib/x86_64-linux-gnu/ld-linux-x86-64.so.2 /path/to/chroot/bin/bash "$@"
I can run this one:
#!/bin/bash
LANG=C
LD_LIBRARY_PATH=/path/to/chroot/usr/lib/x86_64-linux-gnu:/path/to/chroot/lib:/path/to/chroot/lib64:/path/to/chroot/var/lib:/path/to/chroot/usr/lib:/path/to/chroot/usr/local/lib /path/to/chroot/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2 /path/to/chroot/bin/bash "$@"
But when I run `bash --version`, I get:
Segmentation fault (core dumped)
-
root@debian-sid_chroot:/# dpkg -l | grep libc6
ii  libc6:amd64      2.36-8  amd64  GNU C Library: Shared libraries
ii  libc6-dev:amd64  2.36-8  amd64  GNU C Library: Development Libraries and Header Files
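A sketch of the usual way to avoid mixing the two glibc worlds: invoke the chroot's own dynamic loader explicitly and hand it only chroot library directories via its --library-path option, so neither the host loader nor the host libc is involved at any point:
#!/bin/bash
/path/to/chroot/lib/x86_64-linux-gnu/ld-linux-x86-64.so.2 \
    --library-path /path/to/chroot/lib/x86_64-linux-gnu:/path/to/chroot/usr/lib/x86_64-linux-gnu \
    /path/to/chroot/bin/bash "$@"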
https://redd.it/11mp45n
@r_bash
clevercli: ChatGPT powered CLI utilities.
https://github.com/clevercli/clevercli
https://redd.it/11lwoi4
@r_bash
It seems one can't use FFmpeg sub-folder syntax, './', when specifying a list of video/audio files to concatenate in a text file.
I'm in the folder, present working directory, /home/user/Documents/, and have 2 main folders:
File List Text Files
/Split Files/Subfolder/
The first folder contains a text file, FileList.txt, with a list of videos I'd like to concatenate/merge into one sequence:
file '/home/user/Documents/Split Files/Subfolder/004.webm'
file '/home/user/Documents/Split Files/Subfolder/005.webm'
Alternatively they can be specified as:
file './Split Files/Subfolder/004.webm'
file './Split Files/Subfolder/005.webm'
The following statement is executed, to join 004.webm and 005.webm into the output file 004005.webm:
ffmpeg -f concat -safe 0 -i "./File List Text Files/Filenames.txt" -c copy "./Split Files/Subfolder/004005.webm"
In Windows CMD, I could use ./Split Files/Subfolder/004.webm in FileList.txt, but on Ubuntu Linux, FFmpeg displays the following error:
Impossible to open './File List Text Files/./Split Files/Subfolder/004.webm'
./File List Text Files/Filenames.txt: No such file or directory
It's trying to combine the two directories into one, which wasn't the case with Windows 7 CMD.
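The behavior is by design (a note): ffmpeg's concat demuxer resolves relative entries against the directory containing the list file, not against the shell's working directory. Relative entries would therefore need to climb out of the list's own folder first:
file '../Split Files/Subfolder/004.webm'
file '../Split Files/Subfolder/005.webm'
Absolute paths, as in the first form, sidestep the issue entirely.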
https://redd.it/11liy3m
@r_bash
SryRMS: A bash script to help install some popular proprietary as well as libre applications not available in the official repositories of Ubuntu.
https://github.com/hakerdefo/sryrms
https://redd.it/11l7iom
@r_bash
Where can I learn to use NNN as a noob?
What's a good tutorial on nnn for Mac? I'm a terminal noob, and installed the program. It's really fast and I love it for zipping around my folders and doing basic tasks. I've not managed to get previews, however. I've spent quite some hours now with ChatGPT getting help, but I mostly get inadequate, conflicting, or plainly wrong instructions. The learning curve is steep here, as I'm looking at the instructions for installing preview-tui and can't quite get my head around it.
This is probably not the right forum for asking, but perhaps someone on here can point me in the right direction? Where can I get help with this stuff? Any advice would be appreciated.
Thanks.
https://redd.it/11kauu4
@r_bash
call PHP script via bash (instead of cronjob) recursively until it is finished
How is this possible? At the moment I am using cronjobs, limited to one per minute...
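A sketch of the usual cron replacement (the script path and the stop-on-failure policy are assumptions): a supervising loop that restarts the PHP script the moment the previous run finishes.
#!/bin/bash
while true; do
    php /path/to/script.php || break   # stop looping if the script fails
    sleep 1                            # small breather between runs; drop if unwanted
done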
https://redd.it/11dkl5n
@r_bash
Question About sed c Command?
Does anyone know if you select a range of lines with sed and then use the c command, will it replace each line in the range with the text passed to the c command? In other words, if 10 lines are in the range will there be 10 changes? Or will it replace all 10 lines with just one copy of the text passed to the c command?
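A quick empirical answer (GNU sed; busybox sed may differ): with an address range, c emits the replacement text once for the whole range, not once per line.
printf 'a\nb\nc\nd\n' | sed '2,3c\REPLACED'
# output:
# a
# REPLACED
# d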
https://redd.it/11dit6q
@r_bash