Trying to condense a huge data set across time frames? I have written a function that helps.

For my purposes, I have multiple HOBO loggers that record hourly temperatures. I usually need to condense these readings to get either mean temperatures for each day or, alternately, the mean temperature of a given hour across a series of days. My data sets look like this: # columns = # loggers and # rows = 24 (hours) * # days.
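For concreteness, a toy data set in that layout might look like the following. The logger names and values here are made up for illustration: the first column is the day, the second the hour, and each remaining column holds one logger's readings.

```
# Two days of hourly readings from two hypothetical loggers
set.seed(1)
data <- data.frame(
  day    = rep(1:2, each = 24),   # time frame to condense by
  hour   = rep(1:24, times = 2),  # the other time frame
  hobo_a = rnorm(48, mean = 15, sd = 2),
  hobo_b = rnorm(48, mean = 12, sd = 2)
)
```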

The function is simple to use. The first two columns of the data set must be the time frames, with the first column being the one you want to condense by (in my case, by day). Pass the data set to the function:

condense <- function(data, method = mean) {

  # Column 1 holds the time frame to condense by (e.g. day), column 2 the
  # other time frame (e.g. hour), and columns 3+ the logger readings.
  avgcons <- function(y, var) {
    method(data[data[[1]] == y, var])
  }

  rows  <- max(data[[1]])
  cols  <- ncol(data) - 2
  names <- colnames(data)[3:ncol(data)]

  my.mat <- matrix(NA, rows, cols)

  # Fill one cell per (time frame, logger) pair
  for (j in 1:cols) {
    for (i in 1:rows) {
      my.mat[i, j] <- avgcons(i, var = j + 2)
    }
  }

  colnames(my.mat) <- names
  rownames(my.mat) <- seq_len(rows)
  my.mat
}
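Assuming the code above is wrapped up as `condense(data, method = mean)` (the function name and the `method` argument are my choices, not fixed by the post), condensing hourly readings to daily summaries would look something like this:

```
# One row per day, one column per logger, cell = daily mean temperature
daily <- condense(data, method = mean)

# Swap in any summary function, e.g. daily minima instead of means
daily.min <- condense(data, method = min)
```

Passing the summary function as an argument is what makes the second use case from above (mean of an hour across days) easy too: reorder the first two columns so the hour comes first, and the same function condenses by hour instead of by day.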