Over the last few months (or should I even say years?), frontend development has become much more advanced and mature, and many new tools have been popping up. One of these tools is the build tool Grunt, which helps you run automated tasks like compiling SCSS files, minifying CSS or JS files, moving compiled files to another directory, and so on.
After seeing people use Grunt more and more, and noticing that a lot of projects on GitHub use it, I decided to finally give it a shot and dive in. I have to say it all sounds much more complicated than it actually is; after a couple of hours of playing around with it, I had gotten my head around the basics and included Grunt in a first project.
I basically started with the following two articles, plus checking out a few projects and their Gruntfiles on GitHub to get a better idea of real-world usage (even those were way over the top for a Grunt beginner, but it still helps).
It already took place a few weeks ago, but there were some great talks at the Chrome Dev Summit 2013, especially on browser and web performance. Fortunately two video recordings are available, so those of us who weren't there can watch the talks, and I can only recommend them!
Have a look at the summit schedule to find the talks you’re interested in, since both videos (unfortunately, but not complaining!) are in one piece, roughly 8 hours each.
‘Layout’ is the process a browser undergoes to calculate the position and size of each element in a document before it can start painting pixels. The process of layout can be costly, especially on low-powered mobile devices.
There’s so much happening on the performance side of things these days; it’s interesting to see how many different angles this can be approached from, even though they all go together in the end. So here’s one more, on performance optimization during layout.
I have recently started to change my schedule. It mainly started out of curiosity and wanting to be more productive. I don’t read Lifehacker or the like, but I’ve read two of Joel Gascoigne’s articles on his schedule before, and they made me curious too.
Sometimes in the past I just used to get up early, and I noticed that on the days when I got into the office at 5, 6 or 7am, I got so much more done in four hours than on other days. It only happened a few times a year, but whenever it did, I liked it a lot and was really surprised how much you can get done before noon.
Now I’ve been on the getting-up-at-6am schedule for more than two weeks. It’s been great, and funnily enough I enjoy it a lot. I’m more productive and I feel snappier, more energetic overall. Plus I get more things done, which is great.
I want to keep tweaking this schedule, and I have to learn to actually get out of the office at some point – which has proven difficult, and I’ve just been working much longer hours. Good for the short term, but it definitely needs adjustment for the long run.
Today I came across this and it made me smile. Seems like everything was done right ;)
Some time ago I turned Spotlight off all the way, since I rarely need it and it’s rather annoying: it makes the fans go crazy more often than not. But then, other times I find it helpful and I go back to turning it on again. Until the next time there’s too much fan noise… and so the story goes.
Here are the commands just in case you do want to do the same:
The primary method is via launchctl, which loads/unloads the Spotlight mds agent into launchd.
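Something like the following should do it – the plist path below is the one the mds agent used on OS X at the time, so double-check it on your system before running anything:

```shell
# Turn Spotlight indexing off by unloading the mds agent from launchd
sudo launchctl unload -w /System/Library/LaunchDaemons/com.apple.metadata.mds.plist

# Turn it back on again by loading the agent
sudo launchctl load -w /System/Library/LaunchDaemons/com.apple.metadata.mds.plist
```

The -w flag makes the change persistent, so Spotlight stays off across reboots until you load the agent again.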
This command requires your admin password.
I just needed to download another disk image from modern.ie to test a new site I’m working on in Internet Explorer. Most of these downloads are rather large files and can take some time to finish. Recently the internet connection hasn’t been too great at the office or at home, plus I was moving around quite a bit, so there was never enough time to complete a download and I always ended up with incomplete files. I kept restarting the download, which overwrote the partially downloaded files – what a waste of time and energy. I took a look at the man pages for curl and did a bit of googling to finally come up with a simple solution that lets you resume a partial download via curl.
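The command ends up looking like this – the URL and filename here are just placeholders, so substitute the actual modern.ie download link:

```shell
# -C - tells curl to inspect the partially downloaded file and
# continue the transfer at the right byte offset;
# -o writes the download to the named file instead of stdout
curl -C - -o "IE11.Win8.1.VirtualBox.zip" \
  "https://example.com/IE11.Win8.1.VirtualBox.zip"
```

If the download gets interrupted, just run the exact same command again and curl picks up where it left off instead of starting from scratch.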
Since I never like it when a Terminal command’s options aren’t explained (they can be a bit confusing), see the explanation below:
-C, --continue-at <offset> Continue/Resume a previous file transfer at the given offset. The given offset is the exact number of bytes that will be skipped, counting from the beginning of the source file before it is transferred to the destination. If used with uploads, the FTP server command SIZE will not be used by curl.
-C - Tell curl to automatically find out where/how to resume the transfer. It then uses the given output/input files to figure that out.
-o, --output <file> Write output to <file> instead of stdout.
You can see all possible options for curl by typing man curl in Terminal.