Reducing Grunt Work
Whether a scientist does their due diligence front-loading a project with proposal work, where the preponderance of evidence suggests there's a foothold to stand on somewhere out in the aether; or an investigator operates on pure serendipity, repeatedly making materials and trying conditions until a set of parameters sticks; either way, a considerable amount of repetitive overhead can build up over the course of a research project. As a researcher, it's important to be able to fuel your sense of creativity when it comes to your science. Unfortunately, the inevitable grunt work of routine characterization can squash that creativity in a heartbeat if you're not careful. The research project I joined when I arrived at UMKC came with just such a tedious quantity of overhead. Most of that overhead, however, was eliminated by following one guideline, and it's allowed me to spend more time in the lab having fun:
Offset as much work as you can to your computer.
I.e., learn to code.
My work with microwave absorbing materials can essentially be boiled down to the following equation, the transmission-line model for a single absorbing layer backed by metal:

RL(f, d) = 20·log₁₀ |(Z_in − 1)/(Z_in + 1)|,  where  Z_in = √(μ_r/ε_r) · tanh( j·(2πfd/c)·√(μ_r·ε_r) )

A material counts as a good microwave absorber when this formula yields a reflection loss of at least -10 dB (that is, RL ≤ -10 dB, corresponding to more than 90% of the incident power being absorbed). The input parameters are derived experimentally: the complex permittivity ε_r and permeability μ_r come as tabulated functions of frequency f, and the thickness d is whatever you choose to test. A full analysis therefore computes RL once for every combination of frequency point and thickness value. For 400 frequency points between 1 and 18 GHz, and 39 thickness values between 0.1 mm and 4 mm, that comes out to 15,600 calculations.
That's a lot of iterations.
Luckily, this is exactly the kind of busy-work that a few lines of code can offload to a computer. A programming language such as Python can express the calculation above as a simple iterative computation, not unlike the following:
for d in np.arange(0.1, 4.0, 0.1):   # 39 thickness values, in mm
    for k in range(freq.size):       # one pass per measured frequency point
        pass  # compute RL(eps[k], mu[k], freq[k], d) here
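To sketch what the full computation might look like, here's a minimal, self-contained version. The function name, array names, and material values below are my own illustration (real ε_r and μ_r come from measurement tables, not constants), and NumPy broadcasting replaces the explicit loops so all 15,600 values come out in one shot:

```python
import numpy as np

C = 2.998e8  # speed of light, m/s

def reflection_loss(eps_r, mu_r, f, d):
    """Reflection loss (dB) of a metal-backed single layer, from the
    transmission-line model; eps_r and mu_r are complex, f in Hz, d in m."""
    # Normalized input impedance of the absorbing layer
    z_in = np.sqrt(mu_r / eps_r) * np.tanh(
        1j * (2 * np.pi * f * d / C) * np.sqrt(mu_r * eps_r)
    )
    return 20 * np.log10(np.abs((z_in - 1) / (z_in + 1)))

# 400 frequency points between 1 and 18 GHz
freq = np.linspace(1e9, 18e9, 400)
# Placeholder material parameters; real ones are tabulated vs. frequency
eps = np.full(freq.shape, 12.0 - 2.0j)
mu = np.full(freq.shape, 1.5 - 0.5j)
# 39 thickness values, 0.1 mm through 3.9 mm
d = np.arange(1, 40) * 0.1e-3

# Broadcasting d as a column computes all 39 * 400 = 15,600 values at once
rl = reflection_loss(eps, mu, freq, d[:, None])
print(rl.shape)  # (39, 400)
```

The vectorized version finishes in milliseconds, which is part of why handing this kind of arithmetic to the computer pays off so quickly.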
It would take me forever to do by hand the same number of calculations that my computer here ran in 20 seconds. Furthermore, no amount of artistic training in the world would let me visualize x, y, and z in the manner that the Python package Matplotlib can:
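For the curious, a minimal sketch of that kind of RL map follows. The grid here is random stand-in data just so the script runs on its own; in practice you'd plot the rl array computed from the measured material parameters:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Stand-in RL grid: 39 thickness rows by 400 frequency columns
freq_ghz = np.linspace(1, 18, 400)
thickness_mm = np.arange(1, 40) * 0.1
rng = np.random.default_rng(0)
rl = -20 * rng.random((39, 400))

fig, ax = plt.subplots()
mesh = ax.contourf(freq_ghz, thickness_mm, rl, levels=20, cmap="viridis")
fig.colorbar(mesh, ax=ax, label="Reflection loss (dB)")
ax.set_xlabel("Frequency (GHz)")
ax.set_ylabel("Thickness (mm)")
fig.savefig("rl_map.png", dpi=150)
```

A filled contour map like this makes the sub -10 dB absorption bands jump out at a glance, which is exactly the kind of reading a table of 15,600 numbers can't give you.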
There is a fair bit of extra coding that goes into generating such an image, but nevertheless, the point I'm trying to get at is that when undertaking a research project, it's extremely beneficial to use mechanisms of order like these, so you can maximize the time you have free to be creative.
There's a lot more you can do with computers than simple iterative computations. But however you use the tool, tapping the full capacity of a computer is an extremely powerful way to ease the burden of grunt work. When you're able to do so, it frees you to spend more time at the fume hood seeing if you can discover something new.
If you're a scientist interested in picking up a programming language, consider looking into Python. The language, development environments, packages, and tutorials are mostly if not entirely open source, and thus free to use. If you want to get started quickly, I'd recommend the data science platform Anaconda as a way to get everything installed and be up and running in no time. I'd also suggest working in the integrated development environment Spyder: it lets you edit, test, and debug your code relatively quickly, and it comes included when you download Anaconda. I've included some helpful links below.