One Factor at a Time (OFAT) and Designed Experiments are two distinct methodologies applied in the field of experimental design and scientific investigations. They each hold unique applications and implications, especially in the context of school science.
OFAT, as indicated by its name, involves changing one factor or variable at a time, while maintaining all other variables constant. The simplicity and intuitiveness of this method make it a popular choice for school science projects where constraints in terms of resources and time may exist (Zuur et al., 2009). For instance, a student examining the impact of light exposure on plant growth might alter the duration of sunlight exposure each plant receives while keeping other variables such as water and soil type constant. OFAT is useful for demonstrating the fundamental concept of variables and the potential outcome of changing one. However, it fails to account for potential interactions between variables, which is a common occurrence in real-world phenomena.
Designed experiments, on the other hand, also known as factorial designs or Design of Experiments (DoE), manipulate several factors at once, aiming to observe the effects of each factor and, crucially, the interactions between them (Anderson & Whitcomb, 2016). Using the previous example, a designed experiment could modify both the amount of sunlight and water at the same time, allowing detection of any potential interaction effects. For instance, less water might be necessary if a plant is receiving less sunlight. Designed experiments, while more complex in design and analysis, offer a more accurate and comprehensive understanding of the system under investigation.
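To make the contrast concrete, here is a minimal Python sketch (factor names and levels are illustrative, not taken from any particular study) enumerating the runs each approach would make for the sunlight-and-water example:

```python
from itertools import product

# Hypothetical two-factor plant-growth study (illustrative names and levels):
light = ["low", "high"]
water = ["low", "high"]

# OFAT: start from a baseline run and change one factor at a time.
# The (high, high) combination is never tried, so the light x water
# interaction can never be observed.
ofat_runs = [("low", "low"), ("high", "low"), ("low", "high")]

# Full factorial: every combination of factor levels, so main effects
# AND the interaction can be estimated.
factorial_runs = list(product(light, water))

print(len(ofat_runs), len(factorial_runs))  # 3 4
```

With two factors at two levels, OFAT covers only three of the four combinations; the factorial design's fourth run is precisely what makes the interaction estimable.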
Both OFAT and designed experiments follow a systematic approach, and both are rooted in the scientific method – the processes of observing, hypothesizing, experimenting, analysing, and concluding. They each offer paths to knowledge generation or theory confirmation, a shared goal of scientific exploration.
In school science, OFAT is often introduced first as it aligns well with basic understandings of the scientific method and variable manipulation. As learners progress, designed experiments can be introduced, preparing students for higher-level scientific inquiry and providing a more sophisticated toolset for scientific investigations. Understanding both methodologies, along with their strengths and limitations, gives students a comprehensive grasp of scientific enquiry.
Despite their differences, the most important shared attribute between OFAT and designed experiments is their grounding in the scientific method and their capacity to foster scientific thinking. Regardless of the experimental design chosen, students learn to formulate hypotheses, conduct experiments, and interpret results. They understand that experimentation is a cornerstone of science and that our current body of knowledge is the result of careful, systematic investigations.
**References:**
- Anderson, M. J., & Whitcomb, P. J. (2016). DOE simplified: practical tools for effective experimentation. CRC Press. [Link](https://www.routledge.com/DOE-Simplified-Practical-Tools-for-Effective-Experimentation-Third-Edition/Anderson-Whitcomb/p/book/9781498746490)
- Zuur, A. F., Ieno, E. N., & Elphick, C. S. (2009). A protocol for data exploration to avoid common statistical problems. Methods in Ecology and Evolution, 1(1), 3-14. [Link](https://besjournals.onlinelibrary.wiley.com/doi/full/10.1111/j.2041-210X.2009.00001.x)
Here it is in full (30mins) - and it is surprising how little has changed in my views since then:
As we approach the Christmas season, STEM clubs often look for festive activities. It can be hard after the inevitable Christmas Crystal Christmas Tree and "making decorations". So how about this: £1 from your favourite pound shop buys 8 LEDs and a battery holder:
Replacing the white LEDs with RGB flashing LEDs is simple. Add some heat shrink sleeving and it's better than new.
Video of the make: (yes, I know I needed to lock focus, but I got carried away).
Setting the context for learning is a constant challenge for teachers. Cries of "why are we learning xyz?" ring out when (for example) teaching the chemistry of alkanes and alkenes. Likewise, pick any suitably abstract area of the curriculum and demonstrating the "why" can stretch the creative spark of many a colleague.
As a suitable segue, enter "Curriculum for Wales", where the subsidiarity of teachers is leveraged to allow them to design local curricula which respond to the needs of the young people they actually teach (as opposed to a centrally defined curriculum attempting to meet the needs of all learners).
It was whilst contemplating curriculum design that I happened to need some way of capturing voltage measurements over a protracted period of time (say, many hours). Some active searching later, I was left with three options: purchase a USB multimeter (£50+), purchase some "data logging" kit designed for schools (and by designed, I mean "sanitised", with all the complexity hidden, at a cost of £100+), or build something myself from equipment like the Arduino and its clones. So as a use case, I chose the Arduino route.
Speaking of context, one thing that young people like to "investigate" is just about anything under the banner of "The best....". The idea of pitting one item or brand against another seems to hit at some deep-seated, visceral feeling that allows them to challenge accepted authority. There is nothing more passionate than a nine-year-old who has discovered that there is exactly the same medication in home brand (cheap) aspirin as in the big brand (more expensive) alternatives --- especially if Mom / Dad swear by the big brand versions. Talk about unleashing mini zealots onto the world.
With this in mind, I have been planning a sequence of learning investigating "The best battery". Now, whatever we decide "best" actually means (cheapest, longest lasting, least likely to leak, price-to-longevity ratio), the activity needed some way of measuring the lifespan of AA batteries.
We could have used devices such as torches and kept an eye on the bulb - but torches powered by a single AA tend to be LED with booster circuitry and can last for many, many hours. I have used those hand-held fans in the past as they drain AAs quickly due to the power needs of the motors - but I don't have any to hand.
Designing a circuit to load and drain an AA battery is quite easy. Add a 10Ω resistor across the battery and it will draw about 150mA (1.5V ÷ 10Ω). As AA batteries have a capacity of about 1500mAh (to keep the math easy), a single 10Ω resistor would drain one in about 10hrs. 4 x 10Ω in parallel (2.5Ω, so about 600mA) would drain the battery in about 2.5hrs - exactly what would work in class.
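The arithmetic above can be sanity-checked with a few lines of Python (nominal figures only -- real capacity varies with discharge rate and cut-off voltage):

```python
# Drain time (hours) = capacity (mAh) / current (mA), with I = V / R.
# 1500 mAh and 1.5 V are the nominal round numbers used in the text.
def drain_hours(capacity_mah, voltage, load_ohms):
    current_ma = (voltage / load_ohms) * 1000  # current drawn by the resistive load, in mA
    return capacity_mah / current_ma

print(drain_hours(1500, 1.5, 10))   # single 10 ohm load -> 10.0 hours
print(drain_hours(1500, 1.5, 2.5))  # 4 x 10 ohm in parallel (2.5 ohm) -> 2.5 hours
```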
But I still needed something to monitor the voltage. I could wire up a voltmeter and have the children check the voltage every 10mins or so over the course of 3hrs. But this is 2021 and there must be a better way.
Physical computing is part of Curriculum for Wales and ALL teachers in Primary are expected to be able to use "physical computing devices" with their children. This often means micro:bits and Lego Mindstorms, but in my world, it means Arduino and occasionally Raspberry Pi.
The Seeeduino XIAO is an Arduino-compatible and highly capable device that costs about £6 at the time of writing. Out of the box it will read / write analogue (what I need) and digital voltages.
You will need:
1 x Seeeduino XIAO (with the headers soldered on - so it can plug into the breadboard)
1 x breadboard
4 x 10Ω resistors (as the "load" for the battery - higher Ω will drain the battery slower as less current will flow -- we want as much current as practical to flow to drain the battery quickest)
2 x 10kΩ resistors (one connected to pin A1 (input) and one to GND (ground) of the Seeeduino -- these are for "safety", both to protect against too much current and against the battery being plugged in in reverse)
Make sure that the battery + is connected to the A1 input and the resistors -- this is easiest by connecting the + to the RED power rail of the breadboard.
Once you connect the battery into the circuit it will start to discharge. Even though the battery is rated at 1.5V, under this load the first reading is likely to be 1.1V or less. (The 1.5V is the maximum, "open circuit" voltage and not what the battery delivers in actual use. That is an interesting thing to discuss with young people -- is this "false" advertising?)
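The voltage sag is down to the cell's internal resistance: the terminal voltage is the EMF minus the drop across that resistance, so the cell and the load form a potential divider. A minimal Python sketch, assuming an illustrative internal resistance of about 0.9Ω (an assumption for demonstration, not a measured value):

```python
# Why a "1.5 V" battery reads ~1.1 V under load: the cell has internal
# resistance r, so the terminal voltage divides between r and the load.
# V_terminal = EMF * R_load / (R_load + r)
def loaded_voltage(emf, r_internal, r_load):
    """Terminal voltage of a cell with internal resistance driving a resistive load."""
    return emf * r_load / (r_load + r_internal)

# 4 x 10 ohm in parallel = 2.5 ohm load; 0.9 ohm internal resistance assumed.
print(round(loaded_voltage(1.5, 0.9, 2.5), 2))  # about 1.1 V
```

As the cell discharges its internal resistance rises, which is exactly why the logged voltage drifts downwards over the session.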
I now need to tell the Seeeduino what to actually do. These devices are programmed via the Arduino IDE (available here).
See comments in the code...
// the setup routine runs once when you press reset:
void setup() {
  // initialize serial communication at 9600 bits per second:
  Serial.begin(9600);
}

// the loop routine runs over and over again forever:
void loop() {
  // read the input on analogue pin 1 (10-bit ADC, 0 - 1023):
  int sensorValue = analogRead(A1);
  // convert the reading to a voltage (0 - 3.3V nominal):
  float voltage = sensorValue * (3.2 / 1023.0); // 3.2 rather than 3.3: adjusted to match the value seen on a multimeter
  // print out the value you read:
  Serial.println(voltage);
  delay(1000); // one reading per second
}
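The conversion line is easy to check by hand. Here is the same formula mirrored in Python (3.2 is the calibrated reference from the sketch above; a nominal Seeeduino XIAO would use 3.3V, and 1023 is the full-scale count of the 10-bit ADC):

```python
# Mirror of the Arduino conversion: volts = counts * (vref / full_scale).
# vref = 3.2 is the calibrated value from the sketch; 3.3 would be nominal.
def adc_to_volts(counts, vref=3.2, full_scale=1023):
    return counts * (vref / full_scale)

print(adc_to_volts(1023))           # full-scale reading -> 3.2 V
print(round(adc_to_volts(352), 2))  # e.g. 352 counts -> about 1.1 V
```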
So now when the device is powered up it will sense voltage on pin A1 and print it back to the serial port --- which I can monitor via USB.
Now the Seeeduino is monitoring the voltage, I need some way of picking that up and displaying it on screen. Python is my go-to language for this - quick, easy, and there is likely to be a library for that somewhere.
My Python development environment is Anaconda and JupyterLab (but anything will work as long as you can install additional libraries).
My code:
"""
Simple demonstration program to read Arduino data (Seeduino XIAO in my case) and to save into Excel
The Seeeduino is programmed to send the data from analogue pin 1 once per second
Develped under Jupyter Notebook / Windows 10 - could run from console - possibly cross platform
"""
import serial
import serial.tools.list_ports
import time
from datetime import datetime
from openpyxl import load_workbook
myFileName=r'D:\battery.xlsx' #datafile (r' added to allow \ without escaping)
wb = load_workbook(filename=myFileName) #load workbook
ws = wb['Sheet1'] #identify worksheet
#code to identify the port the Arduino is attached to. Assumes only 1 USB serial device attached
#Tested only under Windows10
ports = list(serial.tools.list_ports.comports())
for p in ports:
if 'USB Serial Device' in p.description: #"USB Serial Device" is how Windows refers to Arduin devices
serial = serial.Serial(p.device,9600) #establish the serial port, assumes only one present
time.sleep(1) # allow 1s to setup, probably fine without.
calibration = 0.013 # determined by running with no input (should be "constant" for the device)
#Read the data
for x in range(144): # outerloop for total groups of 5 mins (12=1hr, 144 = 12hours)
data =[] # empty list to store the data
for i in range(300): # 300 readings = 5 mins
b = serial.readline() # read a byte string
string_n = b.decode() # decode byte string into Unicode
string = string_n.rstrip() # remove \n and \r
flt = float(string) # convert string to float
data.append(flt) # add flt to the end of data list
now = datetime.now()
current_time = now.strftime("%H:%M:%S") #convert time to H:M:S format
data_point = (sum(data)/len(data))-calibration # form average over 5 minutes and remove the zero error
data_point = float(int((data_point*1000)+0.5)/1000) # round to 3dp
data_point = current_time + "," + str(data_point) # combine time and data
print (data_point) # print to console to show working
#write data to Excel file
newRowLocation = ws.max_row +1 #identify the lastrow and move down 1
ws.cell(column=1,row=newRowLocation, value=data_point) #put data in cell
wb.save(filename=myFileName) #for integrity save file after each write
#close serial and Excel
serial.close()
wb.close()
Questions to consider:
Battery technology is complex - the chemistry of what's happening here is beyond primary learners. But I opine that doesn't matter. What matters is the process and the stimulation of thinking. Of course, what I really need is other batteries to compare this to.
That's the next job.
Someone sent me a headache inducing chart recently (I've blurred the schools):
How times have changed. 10 years ago the debate about technology and its impact on our young people was split between the advocates and detractors. The debate on the impact of video games and "too much TV" was still fresh. 5 years ago games like Call of Duty, Grand Theft Auto and Halo were getting a bashing. Now it's Fortnite and online betting.
Then something changed. With the all-pervasive presence of WiFi we started to see initiatives such as BYOD mentioned as serious contenders for improving engagement and learner outcomes. Now, not an hour goes by without someone on Twitter, Facebook or the Interweb (insert or delete your social media outlet of choice) demonstrating how and why we should be using ICT, Web 2.0 and "all things mobile" as the basis or framework for all our lessons.
Science teachers the world over must be familiar with the following situation when teaching reaction rates; you're then faced with a dilemma:
I know it’s the scientist in me, but before I can totally accept an idea I like to read for myself some evidence supporting an assertion, otherwise, I am operating solely on belief (normally someone else's belief).
At the end of the Easter break, I thought I’d revisit some of the beliefs that seem to be proliferating through educational circles at present. Upfront, I’m not taking a stand here, just making an observation that evidence is either lacking, thin on the ground or (I accept that this might be my issue) that I can’t find much to go on.
1. iPads (or other tablet devices) improve educational outcomes
For sure, I can find details, blog posts and "literature" that link these devices to engagement (especially for boys), but I can't actually find any hard evidence that links uptake of these devices to actual impact on attainment or outcomes. Now, that might be because these devices have not been in the class for all that long, but nevertheless, investing in such technology solely because it appears to improve engagement seems to rest on insufficient evidence.
In a similar vein, the following articles fall into this category:
My favourite one in this category is “Our learners prefer to use tablet devices”. Fair enough, but equally, I’m sure learners would prefer to eat chocolate all day or have a 3 hour lunch break.
2. Kids need to learn how to code
(Declaration of conflict of interest here: I buy into this one, even though I know it's an assertion based on belief not evidence, and that many governments promote this as a key competency, based mostly on economic drivers.)
Linking coding and computational skills to potential future employment (as in "they will need these skills later") seems self-evident, but where things move strongly into belief is when coding gets linked to wider school attainment. Where is the evidence that teaching kids to code, or to think in a computational manner, is linked to wider school/life success?
This assertion seems to fall back into the “engagement”, “boys like it” category of evidence, but I have to admit that equally for this one, I can’t find any evidence to support my personal belief.
In a similar vein, the following articles fall into this category:
3. SOLO taxonomy improves metacognition
About 5 years ago, SOLO taxonomy burst onto Twitter and teachers seemed spellbound by the inevitable piles of hexagons. Does it actually achieve anything? For sure (as mentioned above) kids like it and it can seem to boost engagement in class. But does it actually lead to attainment improvement or a step change in pedagogy?
In a similar vein, the following articles fall into this category:
4. VLEs and Learning Platforms
(Again conflict of interest here as I like the idea / concepts)
Sure, it seems self-evident that VLEs and Learning Platforms can make things easier (for staff and teachers) and can improve engagement, but is there any evidence that these things make an impact/difference to the outcomes of our young people?
5. Flipping the class
As a long term advocate of this, I’ve found myself questioning my reason for this recently. Have I confused the benefits of flipping the class on myself (better prep, reusable resources) and the engagement increase (watch videos at home, do homework exercises in class) with the need for actual evidence that my learners are actually achieving more as a result?
Commentary
I’m not being negative in any of these 5 examples above. In fact, just the opposite. I’m looking for evidence to support the use within the class. For sure, any initiative that inspires teachers and pupils, makes life easier and doesn’t have a detrimental impact could be seen as a success. But to be honest, I’ve not really seen that evidence either.
Now, off to work through my justification for a PLC.