Today I'm asking about your scripts.
Almost every engineer I know has a collection of scripts/utilities for automating ${something}. I'm willing to bet that HN users have some of the most interesting scripts on the internet.
So that said, could I please see your scripts?
I'll go first: https://github.com/fastily/autobots
On first connect to a server, this syncs all the dotfiles I want to the remote host; on subsequent connects, it updates them.
Idk if this is "special", but I haven't really seen anyone else do this, and it beats, for example, Ansible playbooks by being dead simple.
What a shame!
I owe you a beer at least
I'm not crying, you are crying!
Why didn't you look this up yourself?
Also: This will be great to combine with chezmoi for bootstrapping workstations - allowing you to do host-specific configuration, templating, and basic secrets injection without fiddling around with USB drives or whatnot.
Command help, inspired by http://explainshell.com/, to extract help text from builtin commands and man pages. Here's an example:
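A rough sketch of such a helper (a guess at the approach, not the linked tool itself): dispatch on whether the name is a shell builtin, and fall back to the man page otherwise.

```shell
# ch: show help for a command. Builtins get the shell's own `help`;
# everything else falls through to man. (Illustrative only.)
ch() {
  case "$(type "$1" 2>/dev/null)" in
    *builtin*) help "$1" 2>/dev/null || echo "no builtin help for $1" ;;
    *) man "$1" 2>/dev/null ;;
  esac
}
```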
https://github.com/learnbyexample/regexp-cut/blob/main/rcut is another - it uses awk to provide cut-like syntax for field extraction. After writing this, I found commands like `hck` and `tuc`, written in Rust, that solve most of the things I wanted.
https://github.com/tpapastylianou/misc-updater
In full honesty, I'm as proud of the "MISC" acronym as of the script itself. :p
I'm secretly hoping the acronym catches on for referring to any stuff outside the control of a system's standard package management.
E.g., you may note that one of the packages the script checks for is zoom. Zoom is installed as a normal .deb file that I download from the zoom website, and install manually using dpkg, which installs it in the normal system directories. But it has its own cumbersome update check mechanism (which involves clicking a menu in the app), and is not picked up by apt-update because it's not a repo package. So this makes it a good candidate for a misc-updates check, even though it's installed as a normal .deb file. :)
What surprises me is that there seems to be no other way than hacking (cutting, grepping, etc.) each package separately. I wonder how this is handled in machines that use a lot of MISC packages (other than pulling+building every time to automatically have the latest version)?
Also, kudos on the acronym :)
I eventually got tired of writing that manually, so I wrote a small script
that works just like 'git commit', except it adds another line to the commit message template. Placing it in e.g. ~/bin/git-co-commit and having ~/bin in your $PATH will enable it as a git sub-command.
I've never had a use for this before, and I don't think I'll need it much beyond this team, but this was my first git sub-command that wasn't trivially solvable by existing command parameters (that I know of).
https://gist.github.com/sshine/d5a2986a6fc377b440bc8aa096037...
https://blog.sebastian-daschner.com/entries/custom-git-subco...
Many "built-in" Git commands are themselves separate executables. My Linux machine has them in `/usr/lib/git-core/`, and my macOS machine has them in `/Applications/Xcode.app/Contents/Developer/usr/libexec/git-core/`.
Many Linux tools do this. Rust's cargo is another.
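That mechanism is also how you add your own sub-commands: any executable named `git-<name>` on `$PATH` becomes `git <name>`. A toy example with a hypothetical `git-hello`:

```shell
# Create a trivial custom git sub-command in ~/bin (assumed to be on PATH).
mkdir -p "$HOME/bin"
cat > "$HOME/bin/git-hello" <<'EOF'
#!/bin/sh
echo "hello from git-hello"
EOF
chmod +x "$HOME/bin/git-hello"
export PATH="$HOME/bin:$PATH"

git hello   # git finds and runs ~/bin/git-hello
```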
In fact, `man` itself can take multiple positional arguments and will concatenate them with a hyphen to perform the lookup.
You configure a ~/.git-authors file with people with whom you regularly pair, and use `git duet [author-1] [author-2]` to set primary and secondary commit authors. Env variables set whether you want `Signed-off-by` or `Co-authored-by` trailers.
There seem to be at least three advantages over my approach:
I might consider switching for the next small project. :-)

My .gitconfig has:
Now `git co-authored-by Tom` generates a Co-authored-by: trailer for the last person named Tom who committed to the repo. Typically I'd just do `:r !git co-authored-by Name` in vim (mapped to \gc to save typing).

To the OP: You might be able to simplify the script by using `git commit --trailer …`. Or maybe you tried that and it didn't display the message in the editor window satisfactorily?
For some reason `--trailer` is not available on my system, so I'd need to upgrade git, it seems.
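The alias itself isn't shown above, but a `.gitconfig` entry with the described behaviour might look like this (a guess, built from `git log`'s format strings):

```
[alias]
	co-authored-by = "!f() { git log -1 --author=\"$1\" --format='Co-authored-by: %aN <%aE>'; }; f"
```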
Also a bookmarklet that you use to turn any page dark with a click:
From here: https://github.com/x08d/222

- have something mildly repetitive to do, notice how to do parts of it with keyboard only.
- right click AutoHotkey icon in taskbar, edit (opens Notepad).
- change some of these automations, alt+1, alt+2, alt+3, ...
- press ctrl+s to save and reload.
- switch back to the program and use the hotkey immediately to begin helping.
- repeat switching to AutoHotkey and the program, tweaking and adding more.
It's suited to the kind of occasional task which has no easy proper automation, or is a one-off and isn't worth more time to do it through proper interfaces. Things like a vendor support telling you to "go through every affected record and toggle X field off and on again".
* entr(1) - http://eradman.com/entrproject/
* Watchexec - https://watchexec.github.io/
[0]https://github.com/cortesi/modd
# wait on a path and do something on change, e.g. `wait_do test/ run_tests.sh`
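A toy one-shot version of such a `wait_do` might look like this (pure polling; entr above does this properly with filesystem events):

```shell
# wait_do: block until anything under PATH changes, then run CMD once.
# Polls a recursive listing once a second - crude, but dependency-free.
wait_do() {
  local path=$1; shift
  local snap
  snap=$(ls -laR "$path")
  while [ "$(ls -laR "$path")" = "$snap" ]; do sleep 1; done
  "$@"
}
```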
# keep running a command until successful (i.e. zero return code), e.g. `nevergonnagiveyouup rsync ~/folder/ remote:~/folder/`

So I use this script to give me a nice work environment, based on each day.
Every time you open bash, it'll drop you into today's directory. (~/work/year/month/day/)
When I think about stuff it's like.. oh yeah I worked on that last week, last year, etc - the folder structure makes this a lot easier, and you can just write 'notes' or 'meeting-with-joe' and you know the ref date.
For your bashrc:
Now every day you'll know what you worked on yesterday! Additionally you'll get a shortcut: you can type 't' as a bash fn, or go to ~/t/, which is symlinked and updated every time you run today (which is every time you open bash or hit 't'). This is useful if you want to have Firefox/Slack/whatever always save something in your 'today' folder.

https://git.ceux.org/today.git/
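A sketch of the idea (not necessarily what's in the linked repo): create and enter today's directory, keeping ~/t pointed at it.

```shell
# today: make (and cd into) ~/work/YYYY/MM/DD, refreshing the ~/t symlink.
# Call it at the end of your bashrc so each new shell lands in today's dir.
today() {
  local dir="$HOME/work/$(date +%Y/%m/%d)"
  mkdir -p "$dir"
  ln -sfn "$dir" "$HOME/t"
  cd "$dir"
}
alias t=today
```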
Do you mean Nepomuk, that thing from KDE, which died like 10+ years ago?
> Every time you open bash, it'll drop you into today's directory. (~/work/year/month/day/)
Ok, that's an interesting idea, but isn't this more akin to GNOME Zeitgeist? Wasn't Nepomuk aimed at journaling?
> https://git.ceux.org/today.git/
But where is the script? I only see the readme, no code for creating the folder.
Yeah. I mean I think the pitch when KDE4 launched was like... let's rethink how we deal with our files as less discrete paths and more like easily findable stuff.
> But where is the script? I only see the readme, no code for creating the folder.
whoops, I added it!
1. Preserve single newlines that people typed in: Often people hit Return only once, and their intended formatting becomes a wall of text. Hacker News preserves the newline in the HTML.
2. Vertical lines to visually represent thread depth!

https://github.com/dom111/dotfiles/blob/master/bin/git
which when combined with files like:
means I don't accidentally leak email addresses etc.

Also, not entirely related, but I wrote a small tool to add some animated GIFs to scripts: https://dom111.github.io/gif-to-ansi/
To use this alias, make an executable file called `git-recent` with the following contents and ensure it is in your `$PATH`
Here's a redacted example of what the output looks like.

git alias:
zsh function: But sometimes it screws up the terminal and I have to run `reset` to fix it. It would be great if someone could help with that.

I made a CLI utility for automating certain operations I was doing all the time: rsync of sources (push or pull), db backup / rollback, copying the local db to the remote server or back, etc. The utility looked for a dotfile in the project directory to get things like the remote server address, remote project path, etc.
The tool served several purposes:
- Executing auxiliary tools (rsync, mysqldump, drush) with the right parameters, without requiring me to remember them.
- Storing (non-secret) information about the remote environment(s) in the project directory.
- Some dangerous operations (e.g. copying the local db to the remote server) were prohibited unless the dotfile explicitly enabled them. Some sites were only edited on dev and then pushed to production, but some had user data that should never be overwritten.
- When running rsync of sources the tool always did a dry-run first, and then required entering a randomly generated 4-letter code to execute them... so I would have to stop and think and didn't deploy by mistake.
This tool is too rough for sharing it with the general public... but I consider it one of my greatest professional achievements because it saved me a lot of mental effort and stress over the years, quite a bit of time, prevented me from shooting myself in the foot, and forced me to use proper workflows every time instead of winging it. It required a small investment of time and some foresight... but my philosophy is that my work should be to build tools to replace me, and that was a step in that direction.
I dream of doing freelance work (assuming this is what this was) and essentially automating 80% of the maintenance work. There's something just so neat about that to me.
(Lots of us don't use OMZ)
DRY is a good guideline but a rubbish rule.
https://github.com/NARKOZ/hacker-scripts
For example, I have one script that uses rumps to show how many outdated homebrew packages I have (and also as a convenient shortcut to update those packages in the dropdown menu). I also have a second script that uses it to show a counter for open pull requests that I need to review (with links to individual PRs in the dropdown menu). It's great!
Result looks like this: https://imgur.com/yy6GlYk.jpg
My examples: https://imgur.com/a/SrjG1xe
I basically do all my local management from there, no need to run scripts manually, no need to click around in Finder manually, I even added a command to quickly copy my email, simply because it is so low friction to do it.
Hands down the best tool I've ever used
function go() {
}

setTimeout(go, 100);
There are several different plaintext accounting tools, but they all support automation like this. I personally use Beancount because I work best in Python.
The other huge advantage is that the "state" of your finances isn't opaque like in Xero. If you realize you've been categorizing certain transactions incorrectly in Xero, it's a hassle to navigate Xero's interface to correct everything, whereas in plaintext accounting it's usually a 2-second find/replace.
The downside is that there's a steep learning curve and the documentation is kind of overwhelming, but once you learn it, it's extremely valuable.
[0] https://plaintextaccounting.org/
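For a flavor of the format, here is a minimal Beancount ledger (a plain syntax illustration, unrelated to any commenter's actual books): every transaction is just text, and its postings must sum to zero.

```
2024-01-01 open Assets:Checking        USD
2024-01-01 open Expenses:Groceries     USD

2024-01-05 * "Corner Store" "Weekly shop"
  Expenses:Groceries   42.10 USD
  Assets:Checking     -42.10 USD
```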
BUT, this thread is so special because it feels like this is the stuff you only get to see when you sit down at a co-worker's desk and watch them type something and then say "WHAT? HOW COOL!"
I miss that part now that it is all remote work. :(
Charging: 32.15W
Discharging: -5.15W
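Output like the above can be produced from sysfs; here is a sketch (the path, the µW unit of power_now, and the sign convention are assumptions matching typical Linux laptops):

```shell
# battery_watts: print "<Status>: <watts>W" for a power_supply directory,
# adding the minus sign manually while discharging.
battery_watts() {
  local b=${1:-/sys/class/power_supply/BAT0} uw
  uw=$(cat "$b/power_now")   # microwatts
  awk -v uw="$uw" -v s="$(cat "$b/status")" \
    'BEGIN { printf "%s: %s%.2fW\n", s, (s == "Discharging" ? "-" : ""), uw / 1e6 }'
}
```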
I have this in my shell history, which I occasionally use:
(With ^[ being actual escape, entered via Ctrl-V Escape, because I find writing the escape codes literally easier and more consistent than using echo -e or whatever else.)It’ll show lines like this every minute (with nice colouring):
My PinePhone has an axp20x-battery with current_now and voltage_now, like your various laptops except that while discharging it gets a negative current_now, which makes perfect sense to me but which doesn’t seem to match your laptops (since you add the negative sign manually in your script) or my laptop’s power_now (which is likewise still positive while discharging).

https://github.com/sirikon/workstation/blob/master/src/cli/c...
For Linux, it can install and configure everything I need when launched on a clean Debian installation. apt repositories, pins and packages; X11, i3, networking, terminal, symlinking configuration of many programs to Dropbox or the repository itself... The idea is to have my whole setup with a single command.
For Mac, it installs programs using brew and sets some configurations. Mac isn't my daily driver so the scripts aren't as complete.
Also there are scripts for the terminal to make my life easier. Random stuff like killing any gradle process in the background, upgrading programs that aren't packetized on APT, backing up savegames from my Anbernic, etc. https://github.com/sirikon/workstation/tree/master/src/shell
And more programs for common use, like screenshots, copying Stripe test cards into the clipboard, launching android emulators without opening Android Studio, etc. https://github.com/sirikon/workstation/tree/master/src/bin
Use it every day. Great because my company has multiple git submodules in any given project and I can use this to watch for pipeline failures and the like.
[1]: https://github.com/paulirish/git-open
https://github.com/trevorgross/installarch/blob/main/install...
It's a personal tool that just kept growing. Probably amateurish by HN standards, but then, I'm an amateur. Yes, I could simply copy a disk image, but that's no fun.
One I was particularly proud of/disgusted by was one that allowed me to jump around a network with a single command, despite access being gated by regional jumphosts.
You are warned: https://git.drk.sc/-/snippets/107
Another script I wrote for our devs to get access to MySQL in production on GCP; the intent was for the script to be executable only by root and allow sudo access to only this script: that means also ''chmod ugo-rwx gcloud'' too though: https://git.drk.sc/-/snippets/98
I have another script to generate screenshots from grafana dashboards since that functionality was removed from grafana itself (https://github.com/grafana/grafana/issues/18914): https://git.drk.sc/-/snippets/66
Another time I got annoyed that Wayland/Sway would relabel my screens on successive disconnect/reconnects (IE my right screen could be DP-1 or DP-7 or anything in between randomly); so I wrote a docking script which moves the screens to the right place based on serial number: https://git.drk.sc/-/snippets/74
I can relate! I think it just reflects the nature of the problem space. The script is gnarly because the thing one is trying to do is gnarly. Utility is the driving force, as far as I'm concerned.
The following aren't as gnarly as yours, but served their purpose nicely in that little project's context. I like to put/accumulate project-related automations in a `./bin` in my projects.
https://gitlab.com/nilenso/cats/-/tree/master/bin
https://www.shellcheck.net/
I have used these two on my machines for the last four years and have written tons of scripts for myself. Here are a few:
- Displaying my internet/internal IP, allowing me to click it to copy it to the clipboard
- taskwarrior
- Simple conversion script that takes my clipboard & encodes/decodes in base64, hex, url encoding, converts epoch to UTC, etc.
- "auto type" my clipboard by simulating keystrokes - particularly useful for pasting text into terminals that disable the clipboard
- An incident response switch that triggers a script to take a screenshot every 5 seconds when my mouse moves, reduce the image quality, and save it to a folder in my home drive. Another script GPG-encrypts it at the end of the day so I can go back and get a screenshot or look back at an incident if needed.
After that, it's just a matter of adding a crontab job to run the archive job every night. Note that I have no way yet to detect mouse movement on macOS, since xdotool no longer works on Mac, so right now it takes a screenshot of every monitor and resizes it down... it might be too much and could eat up your HDD. I like the *nix version, since I did a dirty job with mouse location so that whenever I take a break from an incident or walk away from my desk, the screenshot script stops.
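The conversion script above might be sketched like this, reading stdin rather than the clipboard (clipboard plumbing varies by platform, so pipe it in with `xclip -o` / `pbpaste`; the function name and modes are made up):

```shell
# conv: encode/decode text in a few common formats.
conv() {
  case "$1" in
    b64e) base64 ;;
    b64d) base64 -d ;;
    hexe) od -An -tx1 | tr -d ' \n'; echo ;;
    *) echo "usage: conv {b64e|b64d|hexe}" >&2; return 1 ;;
  esac
}
# e.g. on X11: xclip -o | conv b64d | xclip
```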
https://github.com/Mister-Meeseeks/subcmd/blob/master/subcmd
It can be configured to exclude certain directories (.cache and Downloads being likely contenders). Also, it can read in config files so it can backup other directories.
https://github.com/lordfeck/feckback
Then I wrote a script that does that automatically:
If I recall correctly, I think there were some situations where --fix-broken did nothing for me, but the script did. I don't remember it nearly well enough to guarantee, though.
One difference I'm sure about is that the script marks all recursively installed dependencies as manually installed, which may not be favorable (e.g. if you wanted to remove the top-level package, all the dependencies would not be removed automatically).
#!/usr/bin/env bash
function shc() {
    #: cat for shell scripts, source code.
    #: prints text with line numbers and syntax highlighting.
    #: accepts input as argument or pipe.
---------
#!/bin/sh
echo "Store and retrieve session token AWS STS \n\n"

# Get source profile
read -p "Source Profile [<profile_name>]: " source_profile
source_profile=${source_profile:-'<profile_name>'}
echo $source_profile

# Get destination profile
read -p "Destination Profile [<profile_name>-mfa]: " destination_profile
destination_profile=${destination_profile:-'<profile_name>-mfa'}
echo $destination_profile

mfa_serial_number='arn:aws:iam::<id>:mfa/<name>'

echo "\nOTP: "
read -p "One Time Password (OTP): " otp
echo "\nOTP:" $otp
echo "\n"

output=$(aws sts get-session-token --profile $source_profile --serial-number $mfa_serial_number --output json --token-code $otp)

echo $output

access_key_id=$(echo $output | jq .Credentials.AccessKeyId | tr -d '"')
secret_access_key=$(echo $output | jq .Credentials.SecretAccessKey | tr -d '"')
session_token=$(echo $output | jq .Credentials.SessionToken | tr -d '"')

aws configure set aws_access_key_id $access_key_id --profile=$destination_profile
aws configure set aws_secret_access_key $secret_access_key --profile=$destination_profile
aws configure set aws_session_token $session_token --profile=$destination_profile

echo "Configured AWS for profile" $destination_profile
The reason I like it is that it also backs up the original in case I mess up the regex (happens sometimes...)
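The parent's exact tool isn't shown, but sed (and perl) can give the same safety net via a backup suffix to `-i`:

```shell
# Create a sample file, then edit it in place; the original survives a bad
# regex as notes.txt.bak and can be restored with a single mv.
printf 'I recieve mail\n' > notes.txt
sed -i.bak 's/recieve/receive/g' notes.txt
cat notes.txt       # I receive mail
cat notes.txt.bak   # I recieve mail
```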
This can be extended easily, even dynamically creating an account if the user is part of an org, or use libnss-ato to alias the user to a specific account.
I have a pair of fixup commit functions, which make it faster to target fixup commits prior to rebasing:
Long function names that are then assigned to an alias can make it easier to find them later if you forget rarely used ones. That is, you can do:

$ alias | grep fixup
to see the list of relevant aliases and the functions they call.
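The functions themselves aren't shown; using git's own fixup machinery, equivalents might look like this in `.gitconfig` (the names are made up):

```
[alias]
	# commit staged changes as a fixup of the given commit
	fixup = "!f() { git commit --fixup \"$1\"; }; f"
	# fold all fixup commits in before sharing
	squashall = "!f() { git rebase -i --autosquash \"${1:-@{upstream}}\"; }; f"
```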
I also have two functions I use like a linear git bisect:
> bc
search_notes() {
    input=$(rg -v '(\-\-)|(^\s*$)' --line-number /home/user/some-dir | \
        fzf --ansi --delimiter : \
            --preview 'batcat --color=always {1} --highlight-line {2}' \
            --preview-window 'up,60%,border-bottom,+{2}+3/3,~3' | \
        choose -f : 0)
    if [[ -n $input ]]; then
        less "$input"
    fi
}
It uses various Linux utilities including fzf and batcat (https://github.com/sharkdp/bat) to open a terminal with all the places where my query comes up (supporting fuzzy search). Since the workhorses are fzf and ripgrep, it is quite fast even for very large directories.
So I will do `search_notes postgres authentication`. I can select a line and it will open the file in less. Works like a charm!
1: https://gist.github.com/adewes/02e8a1f662d100a7ed80627801d0a...
The most recent was a script that parsed a financial report and generated multiple emails depending on a set of criteria. Then the user could manually review these emails and press send if everything checks out. The goal of the script was to reduce some of the menial work my financial co-worker was doing. I don't have it published on GitHub because it has some internal company info in it. But it worked cleanly, and regularly saves him hours of tedious work.
Also I highly recommend EasyGui library for those quick scripts that need user input from people who are not comfortable with a console/cmd. Helps make different types of popup windows for user input/selection with a few simple lines.
https://github.com/64kramsystem/openscripts
Missed the previous cheatsheet post :) I have a massive collection, which are large enough to be books more than cheatsheets (still, I access and use them as cheatsheets):
https://github.com/64kramsystem/personal_notes/tree/master/t...
[1] https://gitlab.com/victor-engmark/tilde/-/blob/master/.bash_...
https://beyondgrep.com/
My "Bash Toolkit": https://github.com/adityaathalye/bash-toolkit
My (yak-shaving-in-progress :) "little hot-reloadin' static shite generator from shell": https://github.com/adityaathalye/shite
A subtle-ish aspect is, I like to write Functional Programming style Bash. I've been blogging about it here: https://www.evalapply.org/tags/bash/
Oh man. I didn't know just how badly I needed this in my life, thank you!
It also uses my https://github.com/pcho/dotfiles, https://github.com/pcho/vimfiles and https://github.com/pcho/zshfiles
My son opened an account over a year ago, but we didn't sign up for Mint until this weekend, so I ended up writing a new import script for the updated API:
https://github.com/jeradrose/mint-simple-import
- Scripts to test our rate limiting for both authenticated and unauthenticated users (was handy)
- API routes changed in a given PR (set of commits since the last interaction with master in reality)
- ssl-expiration-date - Checks the expiration date of a site's certificate
- test-tls-version - Checks if a website supports a given version of TLS

There are also some miscellaneous PHP scripts lying around for template-related stuff. PHP makes a great templating language when you need some basic programmatic additions to your output text.

Everything is too coupled to my work to be useful to others, and most of the automation scripts I've written for work are run as cron jobs now and send out emails to the appropriate addresses. Most of these are written in PHP (we're a PHP shop).
It shows the current status, lists out the most recent tags, prompts for a new tag and message, and finally pushes.
Everything is colorized so it's easy to read and I use it quite often for Golang projects.
https://github.com/bbkane/dotfiles/blob/e30c12c11a61ccc758f7...
[1] https://github.com/kaelzhang/shell-safe-rm
There are many ways to search for the process, but here's what I use:
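For illustration only (not necessarily the parent's command), one common approach uses lsof's terse output:

```shell
# killport: kill whatever is listening on the given TCP port.
# Fails (returns 1) when nothing is listening there.
killport() {
  local pids
  pids=$(lsof -t -iTCP:"$1" -sTCP:LISTEN 2>/dev/null) || return 1
  [ -n "$pids" ] || return 1
  kill $pids
}
```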
Look for the port number and kill the process with: Note that if running as the root user, you will need to prepend the above commands with sudo.

As a workaround, I wrote a small wrapper script that would enable multi-threading for SimpleHTTPServer.
~/bin/http-cwd , Python 2 version (original):
Python 3 version (necessary for platforms that have dropped Python 2, such as macOS):

It fixes an issue in the Python built-in HTTP server that causes it to hang under concurrent connections.
First, run the built-in Python web server:
Then connect to it with a client that doesn't immediately send a request, such as `netcat`. This simulates the behavior of modern browsers, which seem to set up a pool of pre-established connections. Now try to get a page from the server via Curl (or wget, etc). It will hang after sending the request, because the server's single thread is trying to serve the idle connection. In real life, the behavior I saw was that I'd try to connect to the server with Chrome and it would hang after the pages had partially loaded.

This issue appears to have been fixed: https://github.com/python/cpython/issues/75820
There is also a collection of more "obscure" scripts in my shellscripts repository documented here: https://masysma.lima-city.de/32/shellscripts.xhtml.
Another (probably niche) topic is my handling of scanned documents, which arrive as PDFs from the scanner and which I want to number according to the stamped number on the document and convert to png at reduced color space: https://masysma.lima-city.de/32/scanning.xhtml
8<-----------------------------
8<-----------------------------

`munge_pwd` is another script that does various substitutions on the prompt (specific to how my work directories are laid out), but mostly you can just substitute `pwd` if you don't care about deduplicating stuff like multiple checkouts of the same project.
https://github.com/djsamseng/cheat_sheet/blob/main/grep_for_...
#!/bin/bash
if [ $# -eq 0 ]; then
    echo "Usage: ./grep_for_text.sh \"text to find\" /path/to/folder --include=*.{cpp,h}"
    exit
fi

text=$1
location=$2

# Remove $1 and $2 to pass remaining arguments as $@
shift
shift

result=$(grep -Ril "$text" "$location" \
    "$@" \
    --exclude-dir=node_modules --exclude-dir=build --exclude-dir=env --exclude-dir=lib \
    --exclude-dir=.data --exclude-dir=.git --exclude-dir=data --exclude-dir=include \
    --exclude-dir=__pycache__ --exclude-dir=.cache --exclude-dir=docs \
    --exclude-dir=share --exclude-dir=odas --exclude-dir=dependencies \
    --exclude-dir=assets)
echo "$result"
https://github.com/BurntSushi/ripgrep
As a sibling comment mentioned, assuming your .gitignore files exclude all of that stuff from your repo, you should be able to just run 'rg "text to find"' to replace all of that. And use 'rg "text to find" -tcpp' if you want to limit it to C++ files.
I had similar scripts for recursive grep like that too. ripgrep replaced all of them.
https://gist.github.com/stuporglue/83714cdfa0e4b4401cb6
It's one of my favorites because it's pretty simple, and I wrote it when a lot of things were finally coming together for me (including GIS concepts, plpgsql programming, and a project I was working on at the time).
This is code which takes either two foci points and a distance, or two foci, a distance, and the number of points per quadrant, and generates a polygon representing an ellipse. Nothing fancy, but it made me happy when I finally got it working.
The use case was to calculate a naive estimate of how far someone could have ridden on a bike share bike. I had the locations they checked out the bike, and where they returned it, and the time they were gone. By assuming some average speed, I could make an ellipse where everywhere within the ellipse could have been reached during the bike rental.
And then I have different dotfile repos. I have a base one that I keep so clean I could get a job at Disney with it. That's where most of my scripts live. And then I have local ones, like -home, -<employername>. Those have overlays so that I can have contextual extensions, such as a cheat database with work stuff. Also, I can keep that dotfile-employername hosted at my employer so that I'm not "crossing the streams". I don't even have to link them, they just autoload based on their location and name.
I don't have to hop systems too much, so grabbing fresh tooling is a twice a year problem. I'm a cli-as-ide dinosaur so I just hide all my seldom-used scripts under a double underscore prefix. __init_tooling will update vim and give me the 8 or 9 plugins I have grown dependent upon, give me a ruby and python environment, etc.
I have a function called "add_word". Every time I see a word I don't know, I learn it, and then I run "add_word <new word> <definition>". It creates a new file called <new word> with the definition and commits it to a git repo hidden away. Every couple years I'll work through the list and see which I remember. I have about a 30% success rate adopting new words, which again, dinosaur here, so, I'll take whatever I can get.
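A sketch of what `add_word` could look like (the `WORDS_DIR` location is an assumption, and it presumes the repo already exists):

```shell
# add_word <word> <definition...>: store the definition in its own file
# and commit it to the hidden words repo.
add_word() {
  local dir=${WORDS_DIR:-$HOME/.words}
  local word=$1; shift
  printf '%s\n' "$*" > "$dir/$word"
  git -C "$dir" add "$word"
  git -C "$dir" commit -q -m "add_word: $word"
}
```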
The dirtiest thing I have is a cheap vault that uses vim and shell automation. I have a grammar for describing secrets, and I can pass a passphrase through automation to get secrets out. I'm sure it's 100% hackable. I know the first rule of security software is "don't ever try to make your own". So I don't put anything too good in there.
Desktop: https://apps.ankiweb.net/
Android: https://play.google.com/store/apps/details?id=com.ichi2.anki...
Long story short: you can use hard links + rsync to create delta snapshots of a directory tree. I use it to create a back up of my important directory trees.
Funny story about this: I had a really old HP "Lance Armstrong" branded laptop that I used for years. The above script was on it and was rsyncing to a separate machine, so it was fully backed up. Because of that, I was actually hoping for the laptop to die so I could get a new one (frugalness kicking in strong here).
My girlfriend at the time was using it and said "Oh, should I not eat or drink over your laptop?" and I responded: "No, please do! If you break it that means I can allow myself to order a new one."
alias please='sudo zsh -c "$(fc -ln -1)"' # rerun the last command with sudo (because it failed )
Easier PATH management:
# nicer path configuration and lookup
function path {
    if [[ $# -eq 0 ]]; then
        echo -e ${PATH//:/\\n} | sort
    elif [[ "$1" == "--save" ]]; then
        path $2 && echo "\npath $2" >> $HOME/.profile
    else
        if [[ -d "$1" ]]; then
            if [[ -z "$PATH" ]]; then
                export PATH=$1
            else
                export PATH=$1:$PATH
            fi
        else
            echo "$1 does not exist :("
            return 1
        fi
    fi
}
This is a bash one-liner that takes the place of an RSA/2FA token/AuthyApp
https://manpages.ubuntu.com/manpages/bionic/man1/oathtool.1....
Variables:

  $HOSTNAME - the computer hostname
  $TOBACKUPDIR - the local directory you want backed up
  $N_CORES - the number of cores you want to use for compression
  $REMOTEUSER - the ssh user login on the remote server
  $REMOTEHOST - the remote server's IP
  $BACKUPDIR - where you want the file to be backed up to
#!/bin/bash
bfile=`date +%F`.$HOSTNAME.tar.gz
#!/usr/bin/env bash
function cdf() {
    #: Change working directory to the top-most Finder window location
    cd "$(osascript -e 'tell app "Finder" to POSIX path of (insertion location as alias)')"
}
https://www.masteringemacs.org/article/fuzzy-finding-emacs-i...
Sorry, no Linux version as I rarely have a graphical desktop open on Linux. It should be easy to rig something up with Ghostscript or similar.
Thanks for this tip!
This will fork it and wait until it ends.
#!/bin/bash
TOTAL=`ps aux | grep YOUR_CRONJOB.php | grep -v grep | wc -l`
echo "TOTAL PROCESSES ALREADY RUNNING :"$TOTAL
MAX_THREADS=20
TOTAL_MODS="$(($MAX_THREADS-1))"
echo "TOTAL MODS: "$TOTAL_MODS
if [ $TOTAL -eq 0 ]
then
else
fi

for pid in ${pids[*]}; do
    wait $pid
done

echo "OK FINISHED"
https://github.com/noahbailey/kvmgr/blob/master/kvmgr.sh
[1]: https://github.com/axelf4/nixos-config/blob/e90e897243e1d135...
https://github.com/cednore/dotfiles/blob/master/.functions#L... https://github.com/cednore/dotfiles/blob/master/.aliases#L46...
Use gnuplot to plot one or more files directly from the command line: https://github.com/RhysU/gplot/blob/master/gplot
When I read it today, I miss those oh-so-oversimplified solutions to do stuff :'-)
Here are some selected scripts folks might find interesting.
Here's my backup script that I use to encrypt my data at rest before shipping it off to s3. Runs every night and is idempotent. I use s3 lifecycle rules to keep data around for 6 months after it's deleted. That way, if my script goofs, I can recover: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
I have so many machines running Archlinux that I wrote my own little helper for installing Arch that configures the machine in the way I expect: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A tiny little script to recover the git commit message you spent 10 minutes writing, but "lost" because something caused the actual commit to fail (like a gpg error): https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A script that produces a GitHub permalink from just a file path and some optional line numbers. Pass --clip to put it on your clipboard: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae... --- I use it with this vimscript function to quickly generate permalinks from my editor: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
A wrapper around 'gh' (previously: 'hub') that lets you run 'hub-rollup pr-number' and it will automatically rebase that PR into your current branch. This is useful for creating one big "rollup" branch of a bunch of PRs. It is idempotent. https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
Scale a video without having to memorize ffmpeg's crazy CLI syntax: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
Under X11, copy something to your clipboard using the best tool available: https://github.com/BurntSushi/dotfiles/blob/2f58eedf3b7f7dae...
[1] https://github.com/benwinding/dotfiles
[2] https://zachholman.com/2010/08/dotfiles-are-meant-to-be-fork...
https://github.com/ianmiell/bash-template
It's a 'cut and paste' starter for shell scripts that tries to be as robust as possible while not going crazy with the scaffolding. Useful for "I want to quickly cut a script and put it into our source but don't want it to look totally hacky" situations.
https://rtnf.prose.sh/prose-sublime-text-integration
https://rtnf.prose.sh/pandoc-sublime-text-integration
Jesus that foot-gun would make Bjarne Stroustrup blush.
zfsnapr, a ZFS recursive snapshot mounter - I run borg-backup.sh using this to make consistent backups: https://github.com/Freaky/zfsnapr
mkjail, an automatic minimal FreeBSD chroot environment builder: https://github.com/Freaky/mkjail
run-one, a clone of the Ubuntu scripts of the same name, which provides a slightly friendlier alternative to running commands with flock/lockf: https://github.com/Freaky/run-one
ioztat, a Python script that basically provides what zfs-iostat(8) would if it existed: https://github.com/jimsalterjrs/ioztat
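For context, the flock(1) idiom that run-one makes friendlier looks roughly like this (a hypothetical minimal sketch, not the actual script; the lock-naming scheme here is made up):

```shell
# Minimal sketch of the run-one idea: derive a lock file from the command
# line, and refuse to start if an identical invocation already holds it.
run_once() {
  lock="/tmp/run-once-$(echo "$*" | md5sum | cut -d' ' -f1).lock"
  # -n: fail immediately instead of waiting for the lock to free up
  flock -n "$lock" "$@"
}
```

The Ubuntu originals add per-user lock directories plus keep-alive variants like run-one-constantly.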
I replaced Plone for my personal use with about 1000 lines of Python: an object-oriented database. The interface is awkward, but if you get past that, the goal was to produce pictures of trees with graphviz.
https://github.com/maulware/maulstuff
So I dropped the batch file above on my desktop, and I click it while the VPN is starting up. In the 4 seconds it takes to kill everything, the network works as it should.
  #!/bin/bash
  echo |\
  openssl s_client -connect ${1:?Usage: $0 HOSTNAME [PORT] [x509 OPTIONS]}:${2:-443} 2>&1 |\
  sed -ne '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/p' |\
  openssl x509 ${3:--text} ${@:4} 2>/dev/null |\
  sed '/-BEGIN CERTIFICATE-/,/-END CERTIFICATE-/d'
https://github.com/ericfitz/dominfo
Dependencies: sublist3r (Python), pv (used for progress bars)
The junk I haven't touched in 10 years: https://github.com/psypete/public-bin/src
# M1 compatibility switches
arm() { arch -arm64 "${@:-$SHELL}" }
x86() { arch -x86_64 "${@:-$SHELL}" }
This, with the addition of `$(uname -m)` in my $PROMPT, has saved me a lot of time by letting me switch between the arm64 and x86_64 architectures.
  c:\Temp>tbt tbt
  @echo off
  type c:\tools\%1.bat

  c:\Temp>
Photoshop Layer Labeler: https://www.middleendian.com/pslayerlabeler
  loop() {
    NUM=$1
    shift
    for i in {1..$NUM}; do
      "$@"
    done
  }
http://angg.twu.net/eepitch.html
that lets me execute my scripts line by line very easily.
I wanted to control my display’s brightness using my keyboard on Linux. Turned out to be pretty easy with ddcutil!
https://github.com/jcuenod/zotero-backup-scripts/
https://github.com/j1elo/shell-snippets
  [ -z "$PS1" ] && return
  function cd {
    builtin cd "$@" && ls
  }
https://github.com/mrichtarsky/linux-shared
The repo name is a bit outdated, it works on macOS too. Lots of scripts are missing, will add them soon.
Will definitely be adding more as I tidy them up! :)
https://jezenthomas.com/showing-the-weather-in-tmux/
This one will generate any kind of TLS certificate: Root CA, intermediate, mail, web, client-side …
https://github.com/egberts/tls-ca-manage
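For reference, the bare openssl flow that such a tool automates looks roughly like this (a hypothetical minimal sketch; file names and subjects are made up, and tls-ca-manage does much more, e.g. intermediates and proper extensions):

```shell
# Create a self-signed root CA key and certificate.
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.crt \
  -subj "/CN=Example Root CA" -days 3650
# Generate a server key and a certificate signing request.
openssl req -newkey rsa:2048 -nodes -keyout server.key -out server.csr \
  -subj "/CN=example.test"
# Sign the CSR with the CA to produce the server certificate.
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out server.crt -days 365
```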
This script automatically stages everything and commits it as "WIP". If it detects that the most recent commit was a "WIP" then it amends the previous commit. No more weird stashing just to avoid losing my place
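A minimal sketch of that workflow (hypothetical, not the poster's actual script) could look like:

```shell
# wip: stage everything; if HEAD is already a WIP commit, amend it,
# otherwise create a new one.
wip() {
  git add -A
  if [ "$(git log -1 --pretty=%s 2>/dev/null)" = "WIP" ]; then
    git commit --amend --no-edit --quiet
  else
    git commit --quiet -m "WIP"
  fi
}
```

Running it repeatedly keeps folding new changes into the same WIP commit, so your history stays clean until you're ready to reword it.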
https://github.com/web3cryptowallet/drive-py
https://github.com/gitalias/gitalias
> up
Does a `cd ..` on every keypress except ESC or space.
> up $n
Does a total of $n `cd ..` and (important!) sets OLDPWD to the initial directory for a proper `cd -`.
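A sketch of how such an `up` could be implemented (hypothetical; the actual script isn't shown here):

```shell
# up [n]: go n directories up in one step, keeping OLDPWD at the
# starting directory so `cd -` returns there.
up() {
  start="$PWD"
  target="$PWD"
  n="${1:-1}"
  i=0
  while [ "$i" -lt "$n" ]; do
    target=$(dirname "$target")
    i=$((i + 1))
  done
  cd "$target" && OLDPWD="$start"
}
```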
Sets up an Ubuntu server as a strongSwan IKEv2 VPN.
https://github.com/helpermethod/mr
You can also set Git push options understood by the GitLab server to create merge requests [0] on the CLI.
Sid's dotfiles provide an example in [1]. The workflow is 1) push 2) create merge request 3) set target (master/main) 4) merge when the pipeline succeeds.
alias mwps='git push -u origin -o merge_request.create -o merge_request.target=main -o merge_request.merge_when_pipeline_succeeds' # mwps NAME_OF_BRANCH
There are more push options, such as setting the MR as draft, add labels, milestones, assignees, etc. My personal favorite: Remove the source branch when the MR is merged. That's a project setting too, but sometimes not set. Using the push options, you can force this behavior and avoid stale Git branches.
The glab CLI tool provides similar functionality for creating an MR. Its development has moved to this project [2]
[0] https://docs.gitlab.com/ee/user/project/push_options.html#pu...
[1] https://gitlab.com/sytses/dotfiles/-/blob/master/git/aliases...
[2] https://gitlab.com/gitlab-org/cli
https://avestura.dev/snippets
Mine are here:
- https://github.com/vermaden/scripts
I've never promoted it, but I've been quietly using it myself to build stuff that I need. Obviously browser-based stuff has limitations, but I find I still get a lot done.
What a descriptive name :D
  #!/usr/bin/env bash

  function szup() {
  description='
  #: Title: szup
  #: Synopsis: sort all items within a directory according to size
  #: Date: 2016-05-30
  #: Version: 0.0.5
  #: Options: -h | --help: print short usage info
  #:        : -v | --version: print version number
  '
  funcname=$(echo "$description" | grep '^#: Title: ' | sed 's/#: Title: //g')
  version=$(echo "$description" | grep '^#: Version: ' | sed 's/#: Version: //g')
  updated="$(echo "$description" | grep '^#: Date: ' | sed 's/#: Date: //g')"
$ szup . 42
When you download a video from certain sites, ctime is the time you created the file (i.e. when you downloaded it), but the video still comes with a timestamp which is saved as the mtime. I'm not sure why this happens; maybe there's an HTTP header for that? I presume it's the time when the video was first uploaded to the site?
Here's a favorite of mine: all my scripts' -h simply show the source code
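The trick can be as simple as this (a hypothetical minimal demo of the idea; the file name is made up):

```shell
# A script whose -h prints its own source.
cat > show-self.sh <<'EOF'
#!/bin/sh
case "$1" in
  -h|--help) cat "$0"; exit 0 ;;   # -h: dump the script's own source
esac
echo "doing the actual work"
EOF
chmod +x show-self.sh
./show-self.sh -h    # prints the script's own source
```

Since `$0` is the path the script was invoked by, `cat "$0"` always shows the current version of the script itself.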
Do you mean when using youtube-dl/yt-dlp? They have an option for that: --no-mtime
And yes the videos were downloaded with youtube-dl
edit: ohhh it uses the Last-modified HTTP header to set the mtime!! cool! https://unix.stackexchange.com/questions/387132/youtube-dl-a...
My downloads do get the upload date written into the video's metadata so I have that if I ever need it.
How do you do it?
https://github.com/dlight/dotemacs/blob/master/bindings.el#L...
This is brittle-looking, but I didn't know how to test whether a buffer is special other than checking the buffer name (I'm sure there's a better way). Anyway, I kept looking to see if Emacs already had this, but apparently not.
Maybe someone will find it useful, so:
In my .zshrc I keep as little as possible so it opens fast. But I include the commands that would extend it (srcBlah) or help me tune how to extend it (editBlah):
For my pet projects I'd use these two:
As you can see, editSet opens VSCode and src sources it in the current terminal.

oneline:

Puts anything you give on its standard input onto one line.

L, my journaling tool (whenever I need to get something out of my head or be sure to find it later); I can edit and fix stuff by editing the file it generates after the fact:
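A minimal sketch of the `oneline` idea mentioned above (hypothetical; the author's actual implementation isn't shown):

```shell
# oneline: squeeze stdin onto a single line, collapsing runs of newlines
# into single spaces.
oneline() {
  tr -s '\n' ' '
}
```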