Pretty much every day, with Unix tools: Vim, awk, sed, etc.
Usually many times a day… even today, which has been mostly meetings.
Yesterday. Gotta grep those logs.
Today.
At least once every few days while coding, usually to do one of the following:

- Select multiple things in the same file at the same time without needing to click all over the place. Normally I use multicursor keyboard shortcuts to select what I want, and for the trickier scenarios there are also commands to step through selections one at a time so you can skip certain matches and end up with only what you want. But sometimes there are too many false matches to skip by hand, and that's where regex comes in handy. For instance, finding:
  - parent but not apparent, transparent, parentheses, apparently, transparently
  - test but not latest, fastest, testing, greatest, shortest
  - trie but not entries, retries, countries, retrieve
  - http but not https
  … which can easily be done by searching for the word with no letter immediately before or immediately after it, e.g. \Wtest\W.
- Search for things across all files when the search comes back with too many results that aren't relevant, basically using the same tricks as above.
- Finding something I already know makes a pattern, like finding all years: \d{4}, finding all versions: \d+\.\d+\.\d+, finding random things that a linter may have missed such as two empty lines touching each other: \n\s*\n\s*\n, etc. (a sketch of these patterns follows after this list).
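A minimal sketch of those patterns, assuming Python's re module and a made-up sample.txt; editor search boxes use the same syntax, give or take escaping and flags:

    import re

    text = open("sample.txt").read()  # hypothetical input file

    # Whole-word match for "test": \b is a word boundary, so it also works
    # at the start or end of a line, where \W would need an actual character.
    tests = re.findall(r"\btest\b", text)

    # Things that are already a pattern:
    years    = re.findall(r"\d{4}", text)          # all years
    versions = re.findall(r"\d+\.\d+\.\d+", text)  # all versions
    blanks   = re.findall(r"\n\s*\n\s*\n", text)   # two empty lines touching

    print(len(tests), len(years), len(versions), len(blanks))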
Yesterday, for capturing URLs.
https?//[a-zA-Z0-9_-]*
I am kinda learning RE right now 😅
What about ftp? 🤔
If we want to include every protocol, then the RE could get complex.
Depending on the use case, maybe it should be. On the other hand, some things are better left to library implementations than to custom regex, e.g. email validation.
This sentence is the uncanny valley for structure.
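For what it's worth, RFC 3986 defines a scheme as a letter followed by letters, digits, "+", "-", or ".", and for http-style URLs the scheme is followed by "://" (note the colon). A minimal sketch, assuming Python's re and making no claim to be a real URL validator:

    import re

    # Scheme per RFC 3986: a letter, then letters/digits/+/-/., then "://"
    # for http-style URLs; the part after it is deliberately rough (\S+).
    URL_RE = re.compile(r"[A-Za-z][A-Za-z0-9+.-]*://\S+")

    sample = "mirrors: https://example.com/docs and ftp://mirror.example.org/pub"
    print(URL_RE.findall(sample))
    # ['https://example.com/docs', 'ftp://mirror.example.org/pub']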
A few hours ago.
I just wanted to turn a list of AD group names into a PowerShell array.
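That kind of thing is usually one multiline find-and-replace; a minimal sketch of the same idea in Python, with made-up group names (the comment doesn't say which tool or pattern was actually used):

    import re

    # Hypothetical AD group names, one per line (e.g. pasted from a report).
    names = "Domain Admins\nSQL-Readers\nVPN Users"

    # Quote each line, then join with commas into a PowerShell array literal.
    quoted = re.sub(r"^(.+)$", r"'\1'", names, flags=re.MULTILINE)
    print("@(" + ", ".join(quoted.splitlines()) + ")")
    # @('Domain Admins', 'SQL-Readers', 'VPN Users')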
Yesterday. I had a file with a list of JSON objects and wanted to move the date field from the end to the beginning, so I used regex find and replace to do it. Something like \{(.*?), ("date": ".*?") in Search and {$2, $1 in Replace (or something close to it). Yes, I refactor code and data using regex. I can't be arsed to learn AWK (even though I should).
AWK doesn’t work with json IIRC. You have to use jq to deal with json.
While yes, the way I had it structured looked like a CSV if you squinted a little, I do fully agree AWK can’t be used for just any old JSON.
jq is dope, but that language still feels pretty confusing IMO.
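A minimal sketch of that date-field swap as a Python re.sub, with a made-up input line; an editor's Search/Replace would use $1/$2 instead of \1/\2, as in the comment above:

    import re

    line = '{"id": 7, "name": "report", "date": "2023-05-01"}'  # made-up example

    # Group 1: everything before the trailing date field; group 2: the date field.
    # The replacement swaps them so the date comes first.
    moved = re.sub(r'\{(.*?), ("date": ".*?")', r'{\2, \1', line)
    print(moved)
    # {"date": "2023-05-01", "id": 7, "name": "report"}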
On average I've probably had to work with them or write one from scratch only a handful of times per year over my career. Not often enough to be an expert or anything, but I'm not as afraid of them as I used to be.
I don't always use regular expressions, but when I do, I use them to parse XML.
IIRC, using RE to parse markup languages is not recommended.
Sure, but if you are not regularly expressing code that has the potential to summon elder gods that will swallow your soul into a dimension of ceaseless screaming, then are you really living?
Asking this question is like asking when was the last time you had to search through text.
Writing the script that got me fired
Please explain more! What happened?
Did you destroy a database? Expose credentials? Nuke the company intentionally?
I hope you are joking
Interesting to see that a lot of these responses (so far) are workflow-related rather than about regex used in production.
Probably because in production there are very few things that are best done with regex. Most of the use I've had for regex in production was filling in data from user-provided files with specifically crafted names, and even there there was some guesswork because of errors in naming; the same thing could probably have been achieved without regex by splitting and/or iterating.
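As a purely hypothetical illustration of that trade-off (the actual naming scheme isn't described), suppose the file names look like site_date_version.csv; a regex and a plain split can both pull the fields out:

    import re

    fname = "berlin_2021-06-30_v2.csv"  # hypothetical user-provided file name

    # Regex version: named groups, and the version part is optional.
    m = re.match(r"(?P<site>[^_]+)_(?P<date>\d{4}-\d{2}-\d{2})(?:_(?P<ver>v\d+))?\.csv$", fname)
    if m:
        print(m.group("site"), m.group("date"), m.group("ver"))

    # Split version: shorter, but every naming quirk needs its own special case.
    site, date, ver = fname[:-len(".csv")].split("_", 2)
    print(site, date, ver)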
Yesterday, doing a search in vim for a class that shared a lot of characters at the front with many other classes:
/Bas.*Some
I could have done a more precise search with a better regex, but this was quick, easy, and worked.