Tech workers - what did your IT Security team do that made your life hell and had no practical benefit?
One chestnut from my history in lottery game development:
While our security staff was incredibly tight and did a generally good job, oftentimes levels of paranoia were off the charts.
Once they went around hot gluing shut all of the "unnecessary" USB ports in our PCs under the premise of mitigating data theft via thumb drive, while ignoring that we were all Internet-connected, that VPNs are a thing, and that every machine had a rewritable optical drive.
Banned open source software because of security concerns.
For password management they require LastPass or that we write them down in a book that we keep on ourselves at all times.
Worth noting that this policy change was a few months ago. After the giant breach.
Took away Admin rights, so every time we wanted to install something or do anything that required higher privileges, we had to file a helpdesk ticket to get 10 minutes of Admin rights.
The review of your request sometimes took up to 3 days. Fun times for a software developer.
Formerly, I was on the Major Incident Response team for a national insurance company. IT Security has always been in their own ivory tower at every company I've worked for, but this company's IT Security department was the worst case I've seen before or since.
They refused to file changes or discuss any type of change control with the rest of IT. I get that Change Management is a bitch for most of IT, but if you want to avoid major outages, file a fucking Change record and follow the approval process. The security directors would get some harebrained idea in a meeting in the morning and assign one of their barely competent techs to implement it that afternoon. They'd bring down whatever system they were fucking with. Then my team had to spend hours, usually after business hours, figuring out why a system which had not seen a change control in two weeks suddenly stopped working. Would security send someone to the MI meeting? Of course not. What would happen is, we would call the IT Security response team and ask if anything had changed on their end. Suddenly, 20 minutes later, everything was back up and running, with the MI team not having done anything. We would try to talk to security and ask what they changed. They answered "nothing" every god damn time.
They got their asses handed to them when they brought down a billing system that brought in over $10 Billion (yes, with a "B") a year and people could not pay their bills. That outage went straight to the CIO, and even the CEO sat in on that call. All of a sudden there was a hard change freeze for a month, and security was required to file changes in the common IT record system, which was ServiceNow at the time.
We went from 150 major outages (defined as having financial, or reputation impact to the company) in a single month to 4 or 5.
Fuck IT Security. It's a very important part of every IT department, but it is almost always filled with the most narcissistic, incompetent asshats in the entire industry.
Set the automatic timeout for admin accounts to 15 minutes... meaning that for a process that may take an hour or so, you have to keep wiggling the mouse or it logs out. Not locks... logs out.
From installs, to copying log files, to moving data, to reassigning ownership of data to the service account.
Not my IT department (I am my IT department): One of the manufacturers for a brand of equipment we sell has a "Dealer Resource Center," which consists solely of a web page where you can download the official product photography and user's manuals, etc. for their products. This is to enable you to list their products on your e-commerce web site, or whatever.
Apparently whoever they subcontracted this to got their hands on a copy of Front End Dev For Dummies, and in order to use this you must create a mandatory account with minimum password complexity requirements, and solve a CAPTCHA every time you log in. They also require you to change your password every 60 days, and if you don't they lock your account and you have to call their tech support.
Three major problems with this:
There is no verification check that you are actually an authorized dealer of this brand of product, so any fool who finds this on Google and comes up with an email address can just create an account and away you go downloading whatever you want. If you've been locked out of your account and don't feel like picking up the telephone -- no problem! Just create a new one.
There is no personalized content on this service. Everyone sees the same content, it's not like there's a way to purchase anything on here anyway, and your "account" stores no identifying information about you or your dealership other than your email address and whatever else you feel like giving it. You are free to fill it out with a fake name if you like; no one checks. You could create an account using obvioushacker@pwned.ru and no one would notice.
Every single scrap of content on this site is identical to the images and .pdf downloads already available on the manufacturer's public web site. There is no privileged or secure content hosted in this "Resource Center" whatsoever. The pictures aren't higher res or anything. Even the file names are the same. It's obviously hooked up to the same backend as the manufacturer's public web site. So if there were such a thing as a "bad actor" who wanted to obtain a complete library of glamor shots of durable goods, for some reason, there's nothing stopping them from scraping the public web site and coming up with literally exactly the same thing.
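To illustrate just how pointless the login wall is: anyone could pull the same assets straight off the public site with a few lines of scripting. A minimal sketch, assuming a hypothetical manufacturer catalog at example-manufacturer.com; the URLs and HTML structure here are made up for illustration, not the real vendor's:

```python
# Hypothetical sketch: pull every image/PDF linked from a public product page.
# The domain, paths, and HTML structure are invented for illustration.
import os
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

BASE = "https://www.example-manufacturer.com"   # hypothetical public site
OUT = "scraped_assets"

os.makedirs(OUT, exist_ok=True)

# Fetch the public product listing page (no login, no CAPTCHA).
resp = requests.get(f"{BASE}/products", timeout=30)
resp.raise_for_status()
soup = BeautifulSoup(resp.text, "html.parser")

# Grab anything that looks like product photography or a manual.
for tag in soup.find_all(["img", "a"]):
    url = tag.get("src") or tag.get("href") or ""
    if not url.lower().endswith((".jpg", ".png", ".pdf")):
        continue
    full_url = urljoin(BASE, url)
    filename = os.path.join(OUT, os.path.basename(full_url))
    asset = requests.get(full_url, timeout=30)
    if asset.ok:
        with open(filename, "wb") as fh:
            fh.write(asset.content)
        print(f"saved {filename}")
```

Same file names, same backend, no account, no CAPTCHA - which is exactly what the "Dealer Resource Center" gives you, minus the 60-day password rotation.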
I had to run experiments that generate a lot of data (think hundreds of megabytes per minute). Our laptops had very little internal storage. I wasn't allowed to use an external drive, or my own NAS, or the company share - instead they said "can't you just delete the older experiments?"... Sure, why would I need the experiment data I'm generating? Might as well /dev/null it!
Mozilla products banned by IT because they had a vulnerability in a previous version.
Rant
It was so bullshit. I had Mozilla Firefox 115.1 installed, and Mozilla put out an advisory, like they do all the fucking time. The very next day, in an email sent after the advisory was posted, Fujitsu made it out to be some huge unfixed bug, and the email chain basically said "yk, we should just remove all Firefox. It's vulnerable so it must be removed."
I wouldn't be mad if they had decided that they didn't want it to be a managed app, or that there was something (actually) wrong with it, or literally anything other than the fact that they didn't bother actually reading either fucking advisory and decided to nuke something I use daily.
Removed admin access for all developers without warning and without a means for us to install software.
We got access back in the form of a secondary admin account a few days later, it was just annoying until then.
ZScaler. It's supposedly a security tool meant to keep me from going to bad websites. The problem is that I'm a developer and the "bad website" definition is overly broad.
For example, they've been threatening to block PHP.Net for being malicious in some way. (They refuse to say how.) Now, I know a lot of people like to joke about PHP, but if you need to develop with it, PHP.Net is a great resource to see what function does what. They're planning on blocking the reference part as well as the software downloads.
I've also been learning Spring Boot for development as it's our standard tool. Except I can't build a new application. Why not? Doing so requires VSCode to download some resources and - you guessed it - ZScaler blocks this!
They've "increased security" so much that I can't do my job unless ZScaler is temporarily disabled.
We can't run scripts on our work laptops because of domain policy. Thing is, I am a software developer. They also don't allow Docker without some heavy approval process, nor VMs. So I'm just sitting here remoting into a machine for development... which is fine, but the machine is super slow. Also, their VPN keeps going down, so all the software developers have to reconnect periodically, all at the same time.
At my prior jobs, it was all open, so it was very easy to install the tools we needed or get approval fairly quickly. It's more frustrating than anything. At least they give us software development work planned months out.
Worked a job where I had to be a Linux admin for a variety of VMs. To access them, I needed a VPN that only worked inside the company LAN and blocked internet access; it was a 30-day trial license on day 700-something, so it had a max of 5 simultaneous connections. Access was from my heavily locked-down laptop: Windows 7 with a 5-minute locking screensaver. The SSH software was an unknown brand, "ssh.exe", which only allowed one connection at a time in an 80x24 console window with no ability to copy and paste. This went to a bastion host, an HP-UX box on an old csh shell with no write access to your home directory due to a 1.4 MB disk quota per user. Only one login per user, ten logins max, and the bastion host was the only way to connect to the Linux VMs. Default 5-minute logout for inactivity. No SSH keys allowed. No scripting allowed. It was like typing over a 9600 baud line.
I quit that job. When asked why, I told them I was a Linux administrator and the job was not allowing me to administrate. I was told "a poor carpenter always blames his tools." Yeah, fuck you.
Here in Portugal the IT guys at the National Health Service recently blocked access to the Medical Doctor's Union website from inside the national health service intranet.
The doctors are currently refusing to work any more overtime than the annual mandatory maximum of 150h, so there are all sorts of problems in the national health service at the moment, mainly hospitals having to close down emergency services to walk-in patients (this being AskLemmy, I'll refrain from diving into the politics of it), so the whole thing smells of something more than a mere mistake.
Anyways, this has got to be one of the dumbest abuses of firewalling "dangerous" websites I've seen in a long while.
They set ZScaler so that if I don't access an internal service for some unknown number of months, it means I don't need it "for my daily work", so they block it. If I want to access it again I need to open a ticket. There is no way to know what they have already blocked or when they'll block something next.
In the one month since this policy went active, I have already opened tickets to access test databases, the k8s control plane, quality control dashboards, the Tableau server...
two separate Okta instances. It was a coin toss as to which one you'd need for any given service
oh, and a third internally developed federated login service for other stuff
90 day expiry for all of the above passwords
two different corporate IM systems, again coin toss depending on what team you're working with
nannyware everywhere. Open Performance Monitor and watch network activity spike anytime you move your mouse or hit a key
an internally developed secure document system used by an international division that we were instructed to never ever use. We were told by IT that it "does something to the PC at a hardware level if you install the reader and open a document" which would cause a PC to be banned from the network until we get it replaced. Sounds hyperbolic, but plausible given the rest of the mess.
required a mobile authenticator app for some of the above services, yet the company expected us grunts to use our personal devices for this purpose.
all of the above and more, yet we were encouraged to use any cloud hosted password manager of our choosing.
We have a largish number of systems that IT declared categorically could not connect directly to the Internet for any reason.
So guess which systems weren't getting updates. Also guess which systems got overwhelmed by ransomware exploiting what would have been a patched vulnerability, delivered through someone's laptop that was allowed to connect to the Internet.
My department was fine, because we broke the rules to get updates.
So did the network team admit the flaw in their strategy? No, they declared a USB key must have been the culprit, and they literally went into every room, confiscated all USB keys and threw them away, with quarterly audits to make sure no USB keys appear. The systems are still not being updated, and Internet-connected laptops are still indirectly bridging them.
The network has been subnetted into departments. Problem: I, in development, get calls from service about devices that have issues. Before the subnetting, they simply told me the serial number and I let my army of diagnostic tools loose on the unsuspecting device to get an idea what's up with it. Now they have to bring it over and set up all the attached devices here so I can run my tests.
Worked at a medium-sized retail startup as a software engineer where we didn't have root access to our local laptops, under the guise of "if you fuck it up we won't be able to fix it", even though we only started out with a basic MacBook setup. So every time I wanted to install a tool, IDE, or VM, I had to make a ticket for IT to come and log in with the password, and explain what I was doing.
Eventually, the engineering dept bribed an IT guy to just give us the password and started using it. IT MGMT got pissed when the number of tickets dropped dramatically and realized what was going on.
We eventually came to the compromise that they gave us sudo access with the warning "we're not backing anything up. If you mess up we'll have to factory reset the whole machine." Nobody ever had to factory reset their machine because we weren't children... and if there was an issue we just fixed it ourselves.
A long time ago in a galaxy far away (before the internet was a normal thing to have) I provided over-the-phone support for a large and complex piece of software.
So, people would call up and you had to describe how they could do the thing they needed to do, and if that failed they would have to wait a few days until you went to the site to sort it in person.
The software we supported was not on the approved list for the company I worked for, so you couldn't use it within the building where the phones were being answered.
There was a server I inherited from colleagues who resigned, mostly serving static HTML. I would occasionally do an apt update && apt upgrade to keep nginx and friends updated, and I installed certbot because IT told me that this static HTML site should be served via HTTPS. Fair enough.
Then I went on parental leave and someone blocked all outgoing internet access from the server. Now certbot can't renew the certificate and I can't run apt. Then I got a ticket to update nginx and they told me to use SSH to copy the files needed.
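For the curious, "use SSH to copy the files needed" works out to something like the sketch below: download the packages and renew the cert on a machine that still has outgoing internet, then push everything over and install it by hand. The hostname, file paths, and passwordless-sudo deploy account are all hypothetical, just to show the shape of the chore:

```python
# Hypothetical sketch: push pre-downloaded packages and a renewed certificate
# to a server with no outgoing internet access. Hostname, paths, and the
# passwordless-sudo "deploy" account are made up for illustration.
import paramiko

HOST = "static-web01.internal.example"   # the locked-down server (hypothetical)
USER = "deploy"

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(HOST, username=USER)

sftp = client.open_sftp()

# nginx package downloaded on a machine that *can* reach the mirrors, plus a
# certificate renewed off-box (e.g. via a DNS challenge, since HTTP renewal
# from the server itself is no longer possible).
for local, remote in [
    ("downloads/nginx_1.24.0.deb", "/tmp/nginx_1.24.0.deb"),
    ("certs/fullchain.pem", "/tmp/fullchain.pem"),
    ("certs/privkey.pem", "/tmp/privkey.pem"),
]:
    sftp.put(local, remote)

sftp.close()

# Install the package, move the cert into place, and reload nginx.
for cmd in [
    "sudo dpkg -i /tmp/nginx_1.24.0.deb",
    "sudo mv /tmp/fullchain.pem /tmp/privkey.pem /etc/nginx/certs/",
    "sudo systemctl reload nginx",
]:
    stdin, stdout, stderr = client.exec_command(cmd)
    print(cmd, "->", stdout.channel.recv_exit_status())

client.close()
```

Every package update and every cert renewal now needs a human to run something like this, instead of certbot's renewal timer and a quick apt upgrade doing it on the box.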
Endless approval processes are a good one. They don't even have to be nonsensical. Just unnecessarily manual, tedious, applied to the simplest changes, with long wait times and multiple steps. Add time zone differences and pile up many different ones, and life becomes hell.
Ours is terrible for making security policy that impacts technical solution options in a vacuum, with a few select higher-level IT folks, and no one sorts out the process for using the new "secure" way first. You end up finding that something you thought would be a day or two of work is actually a weeks-long odyssey to define new processes and technical approaches. Or sometimes you just outright abandon the work because the headache isn't worth it.
Access to change production systems was limited to a single team, which was tasked with doing all deploys by hand, for an engineering organisation of 50+ people. Quickly becoming overloaded, they limited deploy frequency to five deploys per day, organisation-wide.
They forbid us from adding our SSH keys to some servers, and force us to log in to those servers with a shared admin account whose password is super easy to guess and hasn't been changed in 5 years.
The IT company I work for purchased me, along with some number of my coworkers and our product line from my former employer. Leading up to the cut over, we’re told that on midnight of the change, our company email will stop working. No forwarders or anything. BUT, we will get a new email that consists of gibberish@stupidsubdomain.company.com. When the password on this new account expires, because we can’t change it because we’re no longer employees, we have to go to a website to request a password change. This emails us a link to our new company email address, but we can’t use that link. We have to manually change part of the URL for it to work. I had them manually change my password twice before I gave up on the whole process. Figured I didn’t work for them anymore. What would they do if I stopped using this bogus account/email address, fire me?
I used to work with a guy who glued the USB ports shut on his lab machines. I asked him why he didn't just turn them off in the BIOS and then lock the BIOS behind a password, and he just kinda shrugged. He wasn't security, but it's kinda related to your story.
¯\_(ツ)_/¯
Security where I work is pretty decent really, I don't recall them ever doing any dumb crazy stuff. There were some things that were unpopular with some people but they had good reasons that far outweighed any complaints.
I've got to say, after reading a couple of stories here, I can understand the frustrations, and some very legitimate stories here make a lot of sense in the context of IT teams fucking up. But I also think there's a lot of ignorance about what people are actually trying to accomplish in some of these stories. As somebody that does IT security and a lot of compliance work, sometimes we're doing these things because we have to, not so much because we want to.
Very short screensaver timeouts, a useless proxy, short timeouts on intranet pages, disabled browser extensions (making it impossible to automate our very repetitive work), daily DB access requests just to do our jobs, etc.
Our IT mandated 15-character passwords. Many people in manufacturing (the guys who make the stuff we produce, or set up and fix the machines) have passwords in the format "Somename123456..." You get the picture. When the passwords are forced to change? Yeah, just add "a, b, c, d..." at the end. Many have it written down on a post-it note on the notebook or desk. Security my ass.
I wouldn't be surprised if I found that office guys have it too.
I dunno, gluing USB ports shut in a super sensitive environment like that is actually logical; as for the disc drives, they could disable autoplay as well, though removing them or gluing them shut would be preferable. USB is just such an easy attack vector where the individual plugging it in may not have skills themselves - it might be easier to bribe the cleaning folks, for example, or inject a person into a cleaning team. Ideally an attacker would hit multiple nodes of your target's network via as many avenues as possible, which makes the network and VPN thing just silly indeed; perhaps they were counting on catching anyone who tried something with excellent infosec / firewalls / traffic shaping. Yeeeeah, lol.
I'm sure there are more elegant ways they could have disabled the USB ports, but this might have been partially to avoid users accidentally compromising their device by sticking in a thumb drive they found in the parking lot to see what was on it. For exfiltration and VPN usage over the network, there are other controls they could have (and likely had) put in place that you may just not have known about.
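For what it's worth, one of those more elegant ways exists without a glue gun: on Windows you can disable the USB mass-storage driver (via Group Policy, or directly in the registry) so thumb drives stop mounting while keyboards and mice keep working. A minimal sketch, assuming Windows clients and admin rights; this is just an illustration of the technique, not what that team actually did:

```python
# Hypothetical sketch: disable the USB mass-storage driver on a Windows client
# by setting the USBSTOR service start type to 4 (disabled). Requires admin
# rights; keyboards and mice are unaffected. In practice this would be pushed
# via Group Policy or an endpoint management tool rather than run per machine.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

with winreg.OpenKey(
    winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0, winreg.KEY_SET_VALUE
) as key:
    # 3 = load on demand (enabled), 4 = disabled
    winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, 4)

print("USB mass-storage driver disabled for newly attached drives.")
```

Same outcome as the hot glue - no storage devices - but reversible and auditable.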
This is what I have to do to log into Microsoft fuckin' Teams on my work laptop when I work from home...
1. Unencrypt my laptop hard drive
2. Log into my OS
3. Log into the VPN
4. Log into Teams
5. Use the authenticator app on my phone to enter the code that is on my screen
6. Use my fingerprint on my phone to verify that I am the person using my phone...
Step 5 was introduced a few months ago because the other steps weren't secure enough. This is why half my colleagues aren't available when they work from home...
I suggested that we just use Slack as our work chat and leave Teams as a red herring to disappoint extremely talented hackers.
Haha, I never thought of this but...I WAS the IT department in a previous life. I never really thought about how none of this shit really affected me. Granted, I'd have everyone using Yubikeys+Password for logins if I were in charge now.