10.7 Trial and Error


Trial and Error[1]

If you haven’t guessed already, normalization is at the edge of my ability. I’ve intuitively figured out pieces of it over the years, but I am still lacking in some of the intricacies of its application.

So my solution on the previous page leaves something to be desired.

Despite my own limitations, I wanted to show this material to you because of how important normalization can be. I’ve also included how I approached the topic (and my struggles with it) because I think it is important to model the thought process, not just the answers.

This is difficult material. It is fine to struggle. I certainly do. 

The good news is there are entire professions dedicated to this kind of stuff and you don’t need to get anywhere near as good as they are in order to set up a database that works.

You just need to get your databases good enough to function, avoid common errors, and be capable of saving you time.

For me that has often meant dumping my core dataset into a single spreadsheet (not following the principles of normalization) so that it is easy to check it against the original archival records. When it comes time to standardize columns, I bring the material into a database and use the functions I showed you regarding latitude/longitude to merge the additional material in.

In the future I might just handle this kind of task with VLOOKUP or a similar lookup function.
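If you prefer scripting to spreadsheet formulas, the same kind of lookup can be sketched in Python with pandas. This is only an illustration, not my actual workflow: the place names and coordinates below are made up, and the column names are hypothetical.

```python
import pandas as pd

# Hypothetical core dataset of archival records (made-up data).
records = pd.DataFrame({
    "record_id": [1, 2, 3],
    "place": ["Boston", "Salem", "Boston"],
})

# Hypothetical lookup table of standardized place names with coordinates.
places = pd.DataFrame({
    "place": ["Boston", "Salem"],
    "latitude": [42.36, 42.52],
    "longitude": [-71.06, -70.90],
})

# A left merge keeps every archival record and pulls in the matching
# coordinates -- the same idea as a VLOOKUP against a lookup sheet.
merged = records.merge(places, on="place", how="left")
print(merged)
```

Because the merge is a left join, records whose place name has no match in the lookup table are kept with blank coordinates, which makes the unmatched rows easy to spot and fix.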

My approach certainly isn’t perfect, but it keeps things simple enough for me to share my data with other scholars who don’t use databases, and powerful enough to let me take advantage of the kinds of repetitive processes that database programs really shine at. As you develop your own skills in this, you'll have a chance to figure out what system works best for the tasks in front of you.

[1] https://theycantalk.com/post/150288574840/trial-and-error