On a recent project, I was helping to build a frontend application. All the data we used was managed by a partner backend team that exposed it through a set of APIs. That's a fairly standard setup. The weird thing was that in most cases, the APIs provided direct access to the datastore. They didn't do much validation; they just provided a way to get and set the data. In some cases, though, if particular fields were set, some magic would happen.
In one particular case, our frontend allowed users to update their profile information. To do that, we would ask the backend for the profile, display it to the user, let them make whatever changes they wanted, and then send the updated information back to the backend to be saved. However, one of these magic fields existed in the profile, and it caused some problems. The magic worked in such a way that if we saved back the exact profile information we had just retrieved, the profile would be broken.
Let's look at that again. If I call the API to retrieve the profile information, then call the save API and give it the exact information I was just provided, it would break the profile. Really? Yep. The expectation was that a certain field needed to be cleared out before sending the information back to be saved; if it wasn't, the profile would be broken. Of course, this wasn't documented anywhere. It was just expected that everyone would know it.
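To make the round trip concrete, here's a minimal sketch of the kind of client-side workaround this forces. This is hypothetical code, not the project's: the endpoint paths and the field name (migrationToken) are made up to stand in for the real, internal ones.

```typescript
// All names here are hypothetical; the real field and endpoints were internal.
interface Profile {
  name: string;
  email: string;
  // The undocumented "magic" field: echoing it back on save breaks the profile.
  migrationToken?: string;
}

async function getProfile(userId: string): Promise<Profile> {
  const res = await fetch(`/api/profile/${userId}`);
  return res.json();
}

async function saveProfile(userId: string, profile: Profile): Promise<void> {
  // Strip the magic field before saving, since the backend won't do it for us.
  const { migrationToken, ...safeToSave } = profile;
  await fetch(`/api/profile/${userId}`, {
    method: "PUT",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(safeToSave),
  });
}
```

The trap is that nothing in the API surface tells you saveProfile needs that extra step; a naive get-edit-save loop looks perfectly correct and still corrupts the data.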
Something doesn't sit right about this design. If a field has to be cleared out before saving, then either the retrieve API should clear it before returning the profile, or the save API should reject any request that still includes it.
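For comparison, either fix is only a few lines on the server side. This is a sketch of what those two options might look like, reusing the hypothetical Profile type from above:

```typescript
// Option 1: the retrieve API clears the field, so clients never see it.
function toClientProfile(stored: Profile): Profile {
  const { migrationToken, ...clean } = stored;
  return clean;
}

// Option 2: the save API rejects any payload that still carries the field.
function validateSaveRequest(incoming: Profile): void {
  if (incoming.migrationToken !== undefined) {
    throw new Error("migrationToken must not be set when saving a profile");
  }
}
```

Either way, the invariant lives in the one place that owns the data, instead of in every client's tribal knowledge.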
Friday, February 5, 2016
Good enough for an OS
To me, OS choice is an interesting topic. Working in the tech industry, I've seen a lot of people who get almost religious about the OS they choose to run. For me, there's not much of a choice anymore: I work at Microsoft, so I use a lot of Microsoft technologies, including Windows. It hasn't always been this way, though.
Growing up, my family used Macs. I remember installing System 6 and 7, and then being excited to try OS 8 when it was finally released. My first job in college involved caring for OS 9 machines and servers.
I also dabbled on the PC side. I bought my first computer in high school: a Pentium machine that I upgraded from Windows 3.1 to 95 (and then put Plus! on it, which was totally awesome). I also had a hacker friend who got me interested in Linux; I started on Slackware and then moved on to Mandrake Linux when it came out. I guess my point is that I tried all types of systems.
During college, and the first few years of real work afterwards, I became a strong Linux user. I contributed to Ubuntu documentation and hung out in the Linux channels on IRC, helping people get it installed and running.
OS X was fascinating when it came out. I mentioned I was working with OS 9 when I started college. Around then I had a thought in the back of my head that I wish I had blogged about (if blogs had been around back then, I might have): if I were Microsoft or Apple, I would rebuild my OS on top of an open-source project. The kernel would probably stay open source, but I could build a proprietary GUI and common programming infrastructure on top of it. That's basically what OS X did. Apple took BSD and turned it into Darwin (my thought had been to use the Linux kernel, but BSD makes more sense), then built their own layers on top of it using NeXT technologies. Great idea!
Unfortunately, the first few versions of OS X were pretty rough. They shipped with a compatibility layer for running OS 9, which was still much more compatible with existing Mac applications and hardware. I remember trying 10.1, 10.2, 10.3, and even 10.4 and not being very impressed. I finally bought my own Mac after 10.5 came out, because it could run Windows if I wanted it to and I was curious what all the fuss was about.
In the background, Microsoft kept releasing new versions of Windows. 2000 came out, which was pretty awesome. Then XP, which was nice as well. Then Vista, which was just wrong; I hated it from the start.
That brings us to mid-2009, when something interesting happened. Microsoft released Windows 7, which fixed everything wrong with Vista. OS X 10.5 was already pretty nice, but 10.6 came out and improved its performance and stability; it was the first release that felt good enough for me to use full time. Linux was a little late to the party, but Ubuntu came up with something really good in 10.04, released in 2010. That was the first Linux distro I could install without fighting to configure X or hunting down hardware drivers.
I remember one day at work, administering an Ubuntu 10.04 server from my Mac while setting up a Windows 7 machine for a new employee, when I thought: you know, all of these are really good. Linux is great and easy to get working, OS X works just fine, and Windows 7 is the coolest thing from Microsoft in a long time. I could see myself using any of them about equally. I thought we had hit a plateau of some kind.
In many ways, I think we did. They've all been messed up in some way since that peak. Windows 8 and 8.1 were radically different, and people hated them (I actually liked them, but that's a separate story). Ubuntu came up with Unity, which drives me crazy, as does GNOME Shell; between them they've really ruined the Linux experience for me (and I'm not a fan of KDE or Xfce either). The Mac, meanwhile, doesn't feel like it has changed much since 10.6 other than supporting new hardware, and a lot of the things Apple has added have been more annoying than helpful to me.
That was a long story to say: all OSs have their good and bad points. I think they've all reached the point where they're perfectly viable choices. I don't really care anymore, and I'm not strongly in any camp. If you like one more than the others, good for you; use it. I'll use what works for me.