Unit 7
6 hours · CACS 101

Contemporary Technology

AI, Machine Learning, Blockchain, IoT, Cloud & VR/AR

🤖

Artificial Intelligence & Applications

Machines that simulate human reasoning and learning

Artificial Intelligence (AI)

The branch of computer science that aims to build machines capable of performing tasks that would normally require human intelligence — such as understanding language, recognising patterns, making decisions, and learning from experience. The term was coined by John McCarthy at the Dartmouth Conference in 1956.

Brief History of AI

1950: Alan Turing proposes the Turing Test — a machine passes if a human cannot distinguish its responses from a human's.
1956: John McCarthy coins the term "Artificial Intelligence" at the Dartmouth Conference. AI is established as a formal discipline.
1997: IBM's Deep Blue defeats world chess champion Garry Kasparov — the first AI to beat a reigning champion at chess.
2011: IBM Watson wins Jeopardy! against human champions, demonstrating natural language understanding at scale.
2012: Deep learning breakthrough — AlexNet wins the ImageNet competition by a huge margin, triggering the modern AI revolution.
2022: ChatGPT (OpenAI) launches — demonstrating large language models capable of human-quality conversation across virtually any topic.

Types of AI

Narrow AI (Weak AI)

Designed to perform one specific task, often with superhuman accuracy, but unable to generalise to other domains. All current AI systems are Narrow AI.

eg: Google Search, Siri, Netflix recommendations, spam filters, facial recognition, self-driving cars.

General AI (AGI)

Hypothetical AI that can understand, learn, and apply intelligence across any task a human can do — switching between domains as needed. Does not yet exist.

eg: A machine that could write code, diagnose diseases, compose music, and drive a car — all with equal competence.

Super AI

Theoretical AI that surpasses the best human intellect across every field — science, creativity, social skills, emotional intelligence. Entirely speculative at present.

eg: A system that could independently solve climate change, cure all diseases, and advance all human knowledge simultaneously.

AI Applications by Domain

Healthcare

Medical imaging analysis (detecting tumours in X-rays and MRI scans with accuracy that rivals, and in some studies exceeds, expert radiologists), drug discovery, robotic surgery (the da Vinci robot), and AI-powered early disease diagnosis from patient data.

Education

Adaptive learning platforms that adjust difficulty based on student performance, intelligent tutoring systems, automated essay grading, plagiarism detection, and personalised study recommendations.

Transportation

Self-driving cars (Tesla Autopilot, Waymo), AI-optimised traffic signal control, predictive maintenance for aircraft engines, and route optimisation for logistics companies.

Robotics

Industrial robots for assembly lines, warehouse automation (Amazon Kiva robots), surgical robots, agricultural robots for planting and harvesting, and humanoid robots for care facilities.

Customer Service

AI chatbots and virtual assistants (Siri, Alexa, Google Assistant) that handle millions of customer queries without human agents, with 24/7 availability.

Entertainment

Netflix and YouTube recommendation engines, AI-generated music and art, deepfake video technology, game AI opponents, and personalised social media content feeds.

AI in Everyday Life: Google Search (ranks billions of pages using PageRank + ML) · Gmail spam filter (blocks 99.9% of spam) · Siri/Alexa/Google Assistant (voice recognition + NLP) · Netflix (saves $1 billion/year in churn through personalised recommendations) · Google Maps (real-time traffic prediction) · Face ID on smartphones (facial recognition) · Credit card fraud detection (flags suspicious transactions in milliseconds).

📊

Machine Learning & Neural Networks

How computers learn from data without explicit programming

Machine Learning (ML)

A subset of AI in which systems automatically learn and improve from experience (data) without being explicitly programmed for every situation. Instead of writing rules manually, ML algorithms discover patterns in training data and use those patterns to make predictions or decisions on new data.

Deep Learning

A subset of machine learning that uses artificial neural networks with many hidden layers (hence 'deep') to learn complex, hierarchical representations from raw data. Responsible for breakthroughs in image recognition, speech recognition, language translation, and generative AI.

Types of Machine Learning

Supervised Learning

The model is trained on labelled data — each training example has a known correct answer (label). The model learns to map inputs to outputs by minimising prediction errors on the training set.

eg: Email spam detection, house price prediction, image classification (cat vs dog), medical diagnosis.

Unsupervised Learning

The model is given unlabelled data and must discover hidden structure or patterns on its own. No correct answers are provided — the algorithm identifies natural groupings or representations.

eg: Customer segmentation, topic modelling in documents, anomaly detection, dimensionality reduction.

Reinforcement Learning

An agent learns by interacting with an environment, taking actions, and receiving reward (positive) or penalty (negative) signals. It learns to maximise cumulative reward over time through trial and error.

eg: Game-playing AI (AlphaGo, OpenAI Five), robot locomotion, self-driving car decision making, algorithmic trading.
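The supervised-learning idea — learn a mapping from labelled examples, then predict labels for new data — can be sketched with a tiny from-scratch nearest-neighbour classifier. The feature values and labels below are made-up illustrative data, not from any real dataset:

```python
# Minimal sketch of supervised learning: a 1-nearest-neighbour classifier.
# The labelled training points below are hypothetical examples.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(training_data, sample):
    """Return the label of the training example closest to `sample`."""
    nearest = min(training_data, key=lambda pair: distance(pair[0], sample))
    return nearest[1]

# Labelled training set: (features, label) pairs, e.g. (weight_kg, ear_length_cm)
training_data = [
    ((4.0, 7.0), "cat"),
    ((4.5, 8.0), "cat"),
    ((25.0, 12.0), "dog"),
    ((30.0, 11.0), "dog"),
]

print(predict(training_data, (5.0, 7.5)))    # → cat (close to the small animals)
print(predict(training_data, (28.0, 12.0)))  # → dog (close to the large animals)
```

No rules about cats or dogs were written by hand — the "knowledge" is entirely in the labelled data, which is the essence of supervised learning.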

Artificial Neural Networks (ANN)

Inspired by the structure of the human brain, which contains approximately 86 billion neurons connected by trillions of synapses. An ANN consists of layers of interconnected artificial neurons (nodes) that process and pass information forward.

Input Layer

Receives the raw input data. Each node represents one input feature. Example: for a 28×28 pixel image, the input layer has 784 nodes, one per pixel.

Hidden Layers

One or more intermediate layers that learn abstract representations of the data. Each layer detects increasingly complex features. Deep networks have many hidden layers.

Output Layer

Produces the final prediction or classification. For a 10-digit recogniser, the output layer has 10 nodes — one per digit — each giving a confidence score.
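The input → hidden → output flow described above can be sketched as a single forward pass in pure Python. The weights and biases here are arbitrary illustrative numbers, not trained values (a real network learns them from data):

```python
# Minimal sketch of a feed-forward pass through one hidden layer.
# Weights and biases are arbitrary illustrative values, not trained ones.

def relu(x):
    """Common hidden-layer activation: pass positives, zero out negatives."""
    return max(0.0, x)

def layer(inputs, weights, biases, activation):
    """Each output node is activation(weighted sum of inputs + bias)."""
    return [
        activation(sum(i * w for i, w in zip(inputs, ws)) + b)
        for ws, b in zip(weights, biases)
    ]

# Input layer: 3 features
x = [0.5, -1.0, 2.0]

# Hidden layer: 2 nodes with ReLU activation
hidden = layer(x, weights=[[0.2, 0.4, 0.1], [0.7, -0.3, 0.5]],
               biases=[0.0, 0.1], activation=relu)

# Output layer: 1 node, identity activation (a raw confidence score)
output = layer(hidden, weights=[[1.0, 1.0]], biases=[0.0],
               activation=lambda v: v)
print(output)  # a single raw score (≈ 1.75 with these weights)
```

A deep network is the same computation repeated through many hidden layers; training adjusts the weights so the output-layer scores match the labels.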

Deep Learning — Real World Achievements

Image Recognition

Convolutional Neural Networks (CNNs) classify images with over 97% accuracy on benchmark datasets, surpassing human-level performance. Used in Google Photos, medical imaging, quality control in manufacturing.

Speech Recognition

Recurrent Neural Networks (RNNs) and Transformers power Siri, Alexa, and Google Assistant, converting spoken audio to text with word error rates below 5% in ideal conditions.

Language Translation

Google Translate uses deep learning Transformer models to translate between 100+ languages. Translation quality improved more in 2016 (with deep learning) than in the previous 10 years combined.

Generative AI

Generative Adversarial Networks (GANs) and large language models (GPT-4, Claude) can generate photorealistic images, write code, compose music, and produce human-quality text from a simple text prompt.

Blockchain Technology & Bitcoin

Decentralised, tamper-proof distributed ledgers

Blockchain

A distributed digital ledger that records transactions across a peer-to-peer network of computers (nodes). Data is organised into blocks, and each block is cryptographically linked to the previous one — forming a chain. No single authority controls the ledger; all participating nodes hold identical copies. Introduced by the pseudonymous Satoshi Nakamoto in the 2008 Bitcoin whitepaper.

Distributed Ledger

A database that is consensually shared, replicated, and synchronised across multiple sites, institutions, or geographies. Unlike a traditional centralised database controlled by one organisation, a distributed ledger has no central administrator — all participants maintain identical records.

Key Features of Blockchain

Decentralised

No single central authority controls the blockchain. Thousands of nodes around the world each hold a complete copy of the entire ledger. There is no single point of failure or control.

Immutable

Once a block is added to the chain, its data cannot be altered without recalculating every subsequent block's hash — which would require controlling more than 50% of the entire network's computing power (a 51% attack), making tampering computationally infeasible.

Transparent

All transactions on public blockchains (like Bitcoin) are visible to anyone who wants to inspect the ledger. Every transaction is permanently recorded with a timestamp and can be independently verified by any participant.

Secure (Cryptographic)

Each block contains a cryptographic hash (SHA-256 for Bitcoin) of its own data and the hash of the previous block. Any attempt to modify a block changes its hash, immediately breaking the chain and alerting all nodes.

How Blocks and Chains Work

Each block in the chain contains three essential components:

Block Header

Contains: block number, timestamp, nonce (proof-of-work value), and the cryptographic hash of the previous block.

+

Transaction Data

The actual payload — a list of all transactions (sender, receiver, amount) included in this block.

+

Hash Pointer

The SHA-256 hash of this block's entire content, which will be included in the next block — creating the unbreakable chain.
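The block structure above can be sketched with Python's standard `hashlib` (SHA-256, the hash Bitcoin uses). The "transactions" are toy strings for illustration only — real blocks carry far more structure (Merkle roots, difficulty targets, signatures):

```python
# Minimal sketch of a hash-linked chain. Block contents are toy data.
import hashlib

def block_hash(index, prev_hash, data):
    """SHA-256 over the block's contents, linking it to its predecessor."""
    return hashlib.sha256(f"{index}|{prev_hash}|{data}".encode()).hexdigest()

def build_chain(transactions):
    chain, prev = [], "0" * 64  # the genesis block points at an all-zero hash
    for i, data in enumerate(transactions):
        h = block_hash(i, prev, data)
        chain.append({"index": i, "prev_hash": prev, "data": data, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every hash; any tampering breaks a link somewhere."""
    prev = "0" * 64
    for b in chain:
        if b["prev_hash"] != prev or b["hash"] != block_hash(b["index"], prev, b["data"]):
            return False
        prev = b["hash"]
    return True

chain = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(chain))                   # True — untouched chain verifies
chain[0]["data"] = "Alice pays Bob 500"  # tamper with an old block
print(is_valid(chain))                   # False — the broken link is detected
```

This is exactly why immutability holds: changing one old block changes its hash, which invalidates every block after it unless all of them are recomputed.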

Bitcoin & Cryptocurrency

Bitcoin (BTC) is the first decentralised cryptocurrency, created by Satoshi Nakamoto in 2009. It uses blockchain to record all transactions without needing a bank or any trusted central authority. New bitcoins are created through a process called mining — competing to solve computationally difficult cryptographic puzzles. The total supply is capped at 21 million bitcoins, making it deflationary by design.

Smart Contracts & Applications Beyond Cryptocurrency

Smart Contract

Self-executing programs stored on a blockchain that automatically enforce and execute the terms of an agreement when predefined conditions are met — with no need for intermediaries like lawyers, banks, or notaries. The concept was proposed by Nick Szabo in the 1990s and first implemented at scale on the Ethereum blockchain (launched by Vitalik Buterin in 2015).
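The "self-executing agreement" idea can be illustrated with a toy escrow in Python — this is a conceptual sketch only, not real Solidity/Ethereum code, and the parties and amounts are hypothetical:

```python
# Toy escrow illustrating the smart-contract idea: value is released
# automatically when the agreed condition is met, with no intermediary.
# A Python sketch of the concept — not actual Solidity/Ethereum code.

class Escrow:
    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.paid_to = None

    def confirm_delivery(self):
        self.delivered = True
        self._execute()  # the contract executes itself once the condition holds

    def _execute(self):
        if self.delivered and self.paid_to is None:
            self.paid_to = self.seller  # funds released automatically

contract = Escrow("Alice", "Bob", 100)
print(contract.paid_to)    # None — condition not yet met, funds held
contract.confirm_delivery()
print(contract.paid_to)    # Bob — executed without a lawyer, bank, or notary
```

On a real blockchain this logic lives on-chain, so no party can alter or block its execution once deployed.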

Application Area | How Blockchain Helps | Example
Finance & Banking | Cross-border payments without intermediary banks, faster settlement, lower fees | Ripple (XRP) for international money transfers
Supply Chain | Track goods from origin to consumer with an auditable, tamper-proof trail | Walmart tracking lettuce from farm to shelf in seconds (vs 7 days)
Healthcare | Secure sharing of patient records across hospitals while maintaining privacy | MedRec — patient medical records on Ethereum
Voting Systems | Transparent, auditable, tamper-resistant digital voting that reduces the scope for fraud | West Virginia used blockchain voting for overseas military in 2018
NFTs & Digital Ownership | Prove ownership of unique digital assets (art, music, game items) on a public ledger | Beeple's digital artwork sold for $69 million as an NFT

IoT & Cloud Computing

Connected physical devices and computing delivered over the internet

Internet of Things (IoT)

Internet of Things (IoT)

A network of physical devices, vehicles, appliances, and other objects embedded with sensors, software, and connectivity that enables them to collect data, communicate with each other, and be controlled remotely over the internet — without requiring direct human interaction.

Smart Home

Amazon Echo (Alexa), Google Nest thermostat, smart security cameras, connected door locks, automated lighting systems, smart fridges that order groceries.

Wearables

Apple Watch and Fitbit tracking heart rate, blood oxygen, sleep patterns, and ECG. Smart glasses, smart clothing with biometric sensors for athletes.

Smart City

Intelligent traffic lights that adapt to real-time congestion, connected streetlights that dim automatically, smart waste bins that signal when full, air quality sensors.

Healthcare

Remote patient monitoring devices, connected insulin pumps, smart inhalers, implantable cardiac monitors that transmit readings directly to cardiologists.

Agriculture

Soil moisture sensors that trigger precision irrigation only when needed, drone crop monitoring, livestock tracking with GPS ear tags, weather stations in fields.

Industry (IIoT)

Predictive maintenance sensors on factory machines that detect vibration anomalies before failure, connected assembly line robots, real-time supply chain tracking.
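Across all six sectors above, the common pattern is a sense → decide → act loop. It can be sketched with the agriculture example (a soil-moisture sensor triggering irrigation); the readings and threshold below are made-up values for illustration:

```python
# Minimal sketch of the IoT sense -> decide -> act loop, using the
# precision-irrigation example. Threshold and readings are hypothetical.

MOISTURE_THRESHOLD = 30  # percent; below this the field needs water

def decide(reading):
    """The rule an edge device applies to one sensor reading."""
    return "irrigate" if reading < MOISTURE_THRESHOLD else "idle"

readings = [45, 38, 29, 22, 41]  # simulated hourly soil-moisture readings (%)
actions = [decide(r) for r in readings]
print(actions)  # ['idle', 'idle', 'irrigate', 'irrigate', 'idle']
```

A real deployment replaces the hard-coded list with live sensor input and publishes decisions over a protocol such as MQTT, but the device-side logic is often this simple.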

Cloud Computing

Cloud Computing

The delivery of computing services — including servers, storage, databases, networking, software, analytics, and AI — over the internet ('the cloud') on a pay-as-you-use basis. Instead of owning and maintaining physical hardware, users rent resources from cloud providers. Eliminates the need for upfront capital expenditure on IT infrastructure.

Cloud Service Models

IaaS — Infrastructure as a Service

Provides virtualised computing resources over the internet — virtual machines, storage, networking, and operating systems. The customer manages everything above the hardware layer.

eg: AWS EC2 (virtual servers), Google Compute Engine, Microsoft Azure Virtual Machines, Amazon S3 (storage).

Provider: Physical hardware. You: OS, middleware, runtime, applications.

PaaS — Platform as a Service

Provides a managed platform with runtime environments, databases, development frameworks, and deployment pipelines. Developers focus entirely on writing code — no server management needed.

eg: Google App Engine, Heroku, Microsoft Azure App Service, AWS Elastic Beanstalk.

Provider: Hardware + OS + runtime. You: Application code and data only.

SaaS — Software as a Service

Delivers complete software applications over the internet via a web browser or app. The provider manages everything — hardware, OS, runtime, and the application itself. Users just log in and use.

eg: Gmail, Google Docs, Microsoft 365, Dropbox, Salesforce CRM, Zoom, Netflix.

Provider: Everything. You: Your data and user settings only.

Cloud Deployment Models

Model | Description | Who Uses It | Examples
Public Cloud | Infrastructure owned and operated by a third-party cloud provider, shared among many customers over the public internet | Startups, individuals, small businesses, any organisation wanting zero upfront cost | AWS, Google Cloud, Microsoft Azure
Private Cloud | Cloud infrastructure operated solely for one organisation, either on-premises or hosted by a provider; greater control and security | Banks, hospitals, government agencies, enterprises with strict compliance requirements | On-premises VMware, IBM Cloud Private
Hybrid Cloud | Combines public and private clouds, allowing data and applications to move between them based on workload sensitivity and cost | Large enterprises that want cloud flexibility but need to keep sensitive data on-premises | Netflix (AWS + its own CDN), healthcare systems
🥽

Virtual & Augmented Reality

Immersive and overlay technologies reshaping human experience

Virtual Reality (VR)

A completely computer-generated, immersive three-dimensional environment that replaces the user's real-world sensory experience. Users wear a VR headset that tracks head movement and displays a stereoscopic digital world, creating the sensation of physical presence inside the virtual environment.

Augmented Reality (AR)

A technology that overlays digital content (images, text, 3D models, sound) onto the user's real-world view in real time, without replacing the real world. The user sees both physical surroundings and digital enhancements simultaneously through a smartphone camera or transparent AR glasses.

Mixed Reality (MR)

A deeper integration than AR where digital objects are anchored to and interact with the real-world environment in real time. Virtual objects understand and respond to physical surfaces and lighting. Users wearing MR headsets can manipulate holographic objects that appear to exist in their actual room. Example: Microsoft HoloLens.

Comparison: VR vs AR vs MR

Feature | Virtual Reality (VR) | Augmented Reality (AR) | Mixed Reality (MR)
Real World Visible? | No — completely replaced by a digital environment | Yes — real world is primary; digital content overlaid on top | Yes — real and digital are integrated and interact
Immersion Level | Fully immersive — complete sensory replacement | Partial — digital elements add to the real experience | High — digital objects anchored to real space
Required Hardware | VR headset (Oculus Quest, PlayStation VR, HTC Vive) | Smartphone/tablet camera or lightweight AR glasses | Advanced MR headset (Microsoft HoloLens, Magic Leap)
Processing Demand | High — full environment rendering at 90+ fps | Moderate — real-time object tracking and overlay | Very high — simultaneous environment mapping and rendering
Key Applications | Gaming, training simulations, virtual tourism, therapy | Navigation overlays, retail try-on, industrial manuals | Medical training, architectural design, industrial assembly
Examples | Meta Quest 3, PlayStation VR2, HTC Vive Pro | Pokemon Go, Snapchat/Instagram filters, IKEA Place app | Microsoft HoloLens 2, Magic Leap One

Applications by Sector

Gaming

VR gaming delivers fully immersive experiences where players physically move, duck, and interact with the game world. Beat Saber, Half-Life: Alyx, and VRChat are among the most popular VR titles. AR games like Pokemon Go brought AR to 500 million downloads.

Medical Training

Medical students practise surgeries in VR without risk to patients. Surgeons rehearse complex procedures on virtual 3D patient scans. AR overlays anatomical information during real surgeries. AccuVein uses AR to project vein maps onto patient skin for easier IV insertion.

Education

Students take VR field trips to ancient Rome, inside the human cell, or the surface of Mars without leaving the classroom. AR textbooks overlay 3D models of molecules, solar systems, and historical artefacts onto the printed page.

Architecture & Design

Architects walk clients through a virtual building before it is constructed, testing layouts and lighting. IKEA's AR app lets customers place virtual furniture in their real home to check size and style before purchasing.

Military & Defence

Pilots train in VR flight simulators saving millions in fuel and aircraft hours. Soldiers rehearse combat scenarios in virtual environments. AR heads-up displays in fighter cockpits overlay navigation and targeting data.

Retail & E-commerce

Virtual try-on — try clothes, glasses, and makeup in AR before buying. Car companies let customers configure and virtually sit inside vehicles. Real estate tours allow buyers to walk through properties anywhere in the world from their couch.

📌

Unit Summary

Core topics, important exam questions & syllabus coverage

Core Topics

AI types — Narrow AI (exists), General AI (future), Super AI (theoretical)

ML types — Supervised, Unsupervised, Reinforcement Learning

Neural Networks — input, hidden, output layers; deep learning

Blockchain — decentralised, immutable, transparent, cryptographically secure

Smart contracts — self-executing code on blockchain (Ethereum)

IoT — physical devices with sensors + internet connectivity

Cloud service models — IaaS, PaaS, SaaS

Cloud deployment — Public, Private, Hybrid

VR vs AR vs MR — replaces vs adds to vs integrates with reality

Applications: AI in healthcare/education, IoT smart home/city, blockchain in supply chain

Important Exam Questions

Q1. What is AI? Explain types of AI with examples.

Q2. What is machine learning? Differentiate supervised, unsupervised, and reinforcement learning.

Q3. What is blockchain? Explain its key features and applications beyond cryptocurrency.

Q4. What is IoT? Give examples of IoT applications in different sectors.

Q5. Explain cloud computing and its service models (IaaS, PaaS, SaaS) with examples.

Q6. Differentiate between Virtual Reality (VR) and Augmented Reality (AR).

Syllabus Coverage Checklist

Artificial Intelligence — definition, history, and types

AI applications in healthcare, education, transportation, robotics

Machine learning — supervised, unsupervised, reinforcement learning

Neural networks — structure and deep learning achievements

Blockchain — features, how blocks and chains work, Bitcoin

Smart contracts and blockchain applications beyond crypto

IoT — definition, components, and applications by sector

Cloud computing — benefits and service models (IaaS/PaaS/SaaS)

Cloud deployment models — public, private, hybrid

VR, AR, and MR — comparison and applications

🧠

How to Remember This Unit

Mnemonics and memory anchors for contemporary tech

AI types: "Now, General AI, Super AI"

N=Narrow AI (exists NOW, one task) · G=General AI (human-level, future) · S=Super AI (beyond human, theoretical). Ordered by capability; only Narrow AI exists today.

ML types: "Sure, Unsure, Reward"

S=Supervised (labelled data, sure about answers) · U=Unsupervised (unlabelled, unsure of categories) · R=Reinforcement (reward signal, learns by doing)

Blockchain 4 features: "DITS"

D=Decentralised · I=Immutable · T=Transparent · S=Secure — all four are interconnected consequences of the hash chain design

Cloud models: "I Provide Software"

I=IaaS (Infrastructure — most control, you manage OS up) · P=PaaS (Platform — you write code only) · S=SaaS (Software — you just use it) — each step abstracts more away

VR vs AR one-liner

VR = Virtually Replaces reality (headset, no real world visible) · AR = Adds to Reality (real world + digital overlay). Remember: "VR replaces, AR adds."

IoT in one sentence

Any physical object + sensor + internet connection = IoT device. Smart Home → Wearable → Smart City → Healthcare → Agriculture → Factory

Unit 7 Quiz — Contemporary Technology

1. Which type of AI is designed to perform a single specific task, like playing chess or filtering spam?

2. In machine learning, which type of learning uses labelled training data?

3. What is the key feature that makes blockchain data tamper-resistant?

4. Which cloud service model provides a platform for developers to build and deploy applications without managing servers?

5. Which technology overlays digital content onto the real world without fully replacing it?

BCAStudyHub

Your complete interactive study guide for TU BCA Semester I — covering all subjects with interactive tools, past papers, and exam prep.

TU BCA · Semester I

Program Info

University
Tribhuvan University
Program
BCA — Bachelor in Computer Application
Semester
I (First)
Subjects
5 (4 live, 1 coming soon)

Made by SawnN