The MATLAB ecosystem ships with dozens of toolboxes. Today we cover three: the Control System Toolbox (transfer functions, Bode plots, root locus), the Statistics and Machine Learning Toolbox (regression, clustering, classification), and the Deep Learning Toolbox (training a neural network).
The Control System Toolbox lets you define LTI (linear time-invariant) systems as transfer functions or state-space models, then analyze them with Bode plots, step responses, root locus, and pole-zero maps.
% Define a second-order transfer function G(s) = 10 / (s^2 + 3s + 10)
% tf(numerator_coeffs, denominator_coeffs)
G = tf(10, [1, 3, 10]);
disp(G);
% Step response
figure;
step(G);
title('Step Response');
grid on;
% Get step response data
[y, t] = step(G);
[max_y, idx] = max(y);
overshoot = (max_y - y(end)) / y(end) * 100;
fprintf('Overshoot: %.1f%%\n', overshoot);
fprintf('Final value: %.3f\n', y(end));
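Computing overshoot by hand is instructive, but the toolbox's stepinfo function returns the standard step-response metrics (rise time, settling time, overshoot, peak) in one call:

```matlab
% stepinfo returns a struct of step-response metrics
S = stepinfo(G);
fprintf('Rise time: %.3f s, Settling time: %.3f s, Overshoot: %.1f%%\n', ...
    S.RiseTime, S.SettlingTime, S.Overshoot);
```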
% Bode plot (frequency response: magnitude and phase)
figure;
bode(G);
grid on;
[gm, pm] = margin(G); % gm is Inf for this plant: the phase never crosses -180 deg
fprintf('Gain margin: %.1f dB, Phase margin: %.1f deg\n', 20*log10(gm), pm);
% Root locus (how poles move as gain K varies)
figure;
rlocus(G);
title('Root Locus');
% Design a PID controller using pidtune
C = pidtune(G, 'PID');
disp(C);
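The tuned gains are plain properties of the pid object that pidtune returns, so they can be logged directly:

```matlab
% Read the tuned PID gains off the controller object
fprintf('Tuned gains: Kp = %.3f, Ki = %.3f, Kd = %.3f\n', C.Kp, C.Ki, C.Kd);
```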
% Closed-loop system: feedback(C*G, 1)
T = feedback(C*G, 1);
figure;
step(T, G); % compare open vs closed loop
legend({'Closed-loop (PID)', 'Open-loop plant'});
title('PID Controlled Step Response');
% Discrete-time model (sampling period Ts = 0.1s)
Gd = c2d(G, 0.1, 'zoh'); % zero-order hold discretization
figure; bode(G, Gd); legend({'Continuous', 'Discrete (ZOH)'});
% ── Linear Regression ────────────────────────────────────────────────
rng(42);
x1 = randn(100,1);
x2 = randn(100,1);
y = 3*x1 - 2*x2 + randn(100,1)*0.5;
tbl = table(x1, x2, y);
mdl = fitlm(tbl, 'y ~ x1 + x2');
disp(mdl);
% Predictions and residuals
y_pred = predict(mdl, tbl(:,1:2));
rmse = sqrt(mean((y - y_pred).^2));
fprintf('RMSE: %.4f\n', rmse);
figure;
subplot(1,2,1);
scatter(y, y_pred, 30, 'filled', 'MarkerFaceAlpha', 0.5);
refline(1,0); xlabel('Actual'); ylabel('Predicted'); grid on;
title('Actual vs Predicted');
subplot(1,2,2);
histogram(y - y_pred, 20);
xlabel('Residual'); title('Residual Distribution');
% ── K-Means Clustering ────────────────────────────────────────────────
X = [randn(50,2)+2; randn(50,2)-2; randn(50,2)+[2,-2]];
[idx, centroids, sumd] = kmeans(X, 3, 'Replicates', 10);
figure;
gscatter(X(:,1), X(:,2), idx, 'rgb', 'o.x', 8);
hold on;
plot(centroids(:,1), centroids(:,2), 'k*', 'MarkerSize', 14, 'LineWidth', 2);
title('K-Means Clustering (k=3)'); grid on;
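Here k = 3 is known because we generated the data. When k is unknown, one common approach is to score candidate values with evalclusters; the silhouette criterion and the 2:6 range below are illustrative choices, not the only options:

```matlab
% Compare k = 2..6 by average silhouette value; higher is better
eva = evalclusters(X, 'kmeans', 'silhouette', 'KList', 2:6);
fprintf('Suggested k: %d\n', eva.OptimalK);
figure; plot(eva); grid on; % criterion value vs. number of clusters
```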
% ── SVM Classification ────────────────────────────────────────────────
% Load Fisher's iris dataset
load fisheriris;
X_iris = meas(:,1:2); % just first 2 features
Y_iris = species;
svm_mdl = fitcsvm(X_iris, Y_iris, 'KernelFunction', 'rbf', ...
    'BoxConstraint', 1, 'ClassNames', unique(Y_iris));
cv_mdl = crossval(svm_mdl, 'KFold', 5);
loss = kfoldLoss(cv_mdl);
fprintf('5-fold CV error: %.3f\n', loss);
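The scalar loss hides which species get confused with which. One way to see that is to pull out-of-fold predictions with kfoldPredict and plot a confusion chart (confusionchart needs R2018b or later; confusionmat works on older releases):

```matlab
% Out-of-fold predictions for every observation, then a confusion matrix
pred = kfoldPredict(cv_mdl);
figure;
confusionchart(Y_iris, pred);
title('5-fold CV Confusion Matrix');
```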
% Train a simple feedforward network on the XOR problem
X = [0 0; 0 1; 1 0; 1 1]'; % 2x4: one column per sample (transposed to N-by-2 at train time)
Y = [0; 1; 1; 0]';         % 1x4 XOR targets
% Define network layers
layers = [
    featureInputLayer(2, 'Name', 'input')
    fullyConnectedLayer(8, 'Name', 'fc1')
    reluLayer('Name', 'relu1')
    fullyConnectedLayer(4, 'Name', 'fc2')
    reluLayer('Name', 'relu2')
    fullyConnectedLayer(1, 'Name', 'fc3')
    sigmoidLayer('Name', 'sigmoid')
    regressionLayer('Name', 'output')
];
% Training options
options = trainingOptions('adam', ...
    'MaxEpochs', 2000, ...
    'MiniBatchSize', 4, ...
    'InitialLearnRate', 0.01, ...
    'LearnRateSchedule', 'piecewise', ... % required for the drop settings below to take effect
    'LearnRateDropFactor', 0.5, ...
    'LearnRateDropPeriod', 500, ...
    'Verbose', false, ...
    'Plots', 'training-progress');
% Train
net = trainNetwork(X', Y', layers, options);
% Predict
y_pred = predict(net, X');
fprintf('XOR predictions:\n');
disp([X', Y', round(y_pred)])
% For image classification: swap featureInputLayer with imageInputLayer
% and add conv/pool layers before the fully connected layers
% layers(1) = imageInputLayer([28 28 1]);
% layers = [layers(1); convolution2dLayer(3,16,'Padding','same'); ...];
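To make that sketch concrete, here is a minimal CNN layer array for 28x28 grayscale images with 10 classes; the filter counts and kernel sizes are illustrative assumptions, not tuned values:

```matlab
cnn_layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 16, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2, 'Stride', 2)
    convolution2dLayer(3, 32, 'Padding', 'same')
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer % pair with trainNetwork and categorical labels
];
```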
% Check if GPU is available
if canUseGPU()
    fprintf('GPU available — training will use it automatically.\n');
end
Tip: use analyzeNetwork to debug architectures. Before training, call analyzeNetwork(layers) to visualize the network, check layer sizes, and catch dimension mismatches; it saves a lot of frustrating runtime errors.

Before moving on, verify you can do these without looking:

1. Define the third-order plant G = tf(1, [1, 6, 11, 6]). Plot the Bode diagram, step response, and root locus.
2. Design a PI controller with pidtune(G, 'PI'). Verify that the closed-loop step response meets: overshoot < 10%, settling time < 5 s.
3. Use fitlm to predict an output from lagged inputs (an AR model).

From matrix indexing and SVD through signal processing, Simulink, and deep learning, you now have a working foundation in the language that runs engineering simulations worldwide. Completing all five days of MATLAB in 5 Days means you have a solid working knowledge of the language, and the skills here translate directly to real projects. The next step is practice: pick a project and build something with what you have learned.
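For the PI-controller exercise, a minimal self-check might look like this (a sketch only; whether pidtune's default design meets the thresholds can vary by release, so print the metrics and inspect them):

```matlab
% Exercise check: PI control of G(s) = 1 / (s^3 + 6s^2 + 11s + 6)
G3 = tf(1, [1, 6, 11, 6]);
C3 = pidtune(G3, 'PI');
T3 = feedback(C3*G3, 1);
S3 = stepinfo(T3);
fprintf('Overshoot: %.1f%% (target < 10%%)\n', S3.Overshoot);
fprintf('Settling time: %.2f s (target < 5 s)\n', S3.SettlingTime);
```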