I was feeling kind of uninspired; a couple of the posts I was hoping to do were either too big (“Let’s explore all of Fuzzy Logic!”) or relied on functions that apparently don’t exist (?) (“I could have sworn there was an ifthenelse function…”). So to get over the hump I chose a package at “random” (no, I still have not made an algorithm to choose packages at random; I scrolled through the packages and picked one without looking at it). {R1magic} seemed like it might have some promise and I liked the name (“maybe it really does magic? Or mimics D&D game setups?”).

{R1magic} supposedly helps with “compressive sampling”, also known as “compressive sensing”, which is used to reconstruct a signal when only a few measurements have been taken. It sounds like it involves interpolating between data points to find intermediary points, and one of the listed applications on Wikipedia is MRI and imaging (a lot of the examples involve compression of .jpg images). If you would like to know more, Rice University has a nice page on the subject with LOTS of links (http://dsp.rice.edu/cs).

The {R1magic} package lets you do a lot of things (many of which I don’t think I understand). One of the few that I do is the generation of random Gaussian matrices using the function GaussianMatrix(m,n), with m and n being the row and column numbers. These could be useful for generating random matrices for simulations. While looking through example code, I ran across the set.seed() function, which sets a “starting point” (the internal state) for a pseudo-random number generator. As far as I can tell, setting a seed makes the “random” numbers reproducible: run the same code with the same seed and you get the exact same sequence, while a different seed gives you a different (but just as random-looking) sequence. So the output is still random in the statistical sense, it is just repeatable. #fakestatisticianproblems
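Here is a quick sketch of what I mean, using plain base R (rnorm() rather than anything from {R1magic}, since the seeding behavior is the same for any of R’s random-number functions):

```r
# set.seed() fixes the starting state of R's pseudo-random number
# generator, so the same "random" sequence can be reproduced.
set.seed(123)
x <- rnorm(5)      # five pseudo-random draws

set.seed(123)
y <- rnorm(5)      # same seed, so the exact same five draws
identical(x, y)    # TRUE

set.seed(456)
z <- rnorm(5)      # different seed, different sequence
identical(x, z)    # FALSE
```

The same trick should make the {R1magic} generators reproducible too, e.g. calling set.seed() right before GaussianMatrix(m, n).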

One cool function in the {R1magic} package is sparseSignal(), which lets you randomly generate a dataset that cruises along near zero until it spikes. You set the number of data points and the number of spikes and get a random dataset back, like below.

> library(R1magic)

> a <- sparseSignal(2000, 10)

> plot(a)

> lines(a)

I think that this sort of random dataset generation would be useful for testing detection methods: for example, being able to run my epidemiological model against a randomly generated spiky dataset would make a nice null-model comparison.
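As a rough sketch of what I mean (plain base R, not {R1magic} — the spiky signal here just mimics what sparseSignal() produces, and the threshold detector is a made-up stand-in for a real model):

```r
# Build a "sparse" signal: flat noise with a handful of spikes.
set.seed(1)
n <- 2000                                  # number of data points
k <- 10                                    # number of spikes
signal <- rnorm(n, mean = 0, sd = 0.05)    # low-level baseline noise
spike_idx <- sample(n, k)                  # where the spikes go
signal[spike_idx] <- signal[spike_idx] + runif(k, 1, 2)  # add k spikes

# A naive "detection method": flag anything far from the baseline.
detected <- which(abs(signal) > 0.5)

setequal(detected, spike_idx)   # did the detector find exactly the spikes?
```

Swapping the naive threshold for an actual model would give a simple way to check how often the model cries wolf on purely random data.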

Honestly, I have no idea what is going on in this package and feel like I am missing the more interesting portions of what it is trying to do. I do think parts of the package could be useful in simulations or for testing models against randomly generated data, though I doubt those were the intended purposes of the package. Still, it is interesting to get a glimpse of what goes on in signal-to-noise interpretation.