Monday, December 15, 2025

'The Selfish Giant' by Oscar Wilde as a Moral Allegory


Introduction

Oscar Wilde (1854-1900) was a brilliant Irish poet, playwright, and novelist who was also a leader of the ‘Art for Art’s Sake’ movement in literature. His ‘The Selfish Giant’ is a simple yet profound story that conveys deep moral and spiritual lessons. Written in the form of a fairy tale, it functions as a moral allegory, in which characters and events symbolize abstract ideas such as selfishness, love, repentance, and redemption.

What is Allegory?

An allegory is a story, poem, or picture where characters, events, and settings symbolize deeper, often moral, political, or spiritual meanings, acting as a hidden message beyond the literal story.

The Giant as a Symbol of Selfishness

The Giant represents selfish and self-centered human nature. By building a wall around his garden and forbidding children to enter, he isolates himself from society. This selfish attitude symbolizes human ego, pride, and lack of compassion for others.

The Garden as a Moral Symbol

The garden stands for the Giant’s heart and soul. When it is closed to children, it remains in perpetual winter, symbolizing emotional coldness and spiritual emptiness. The absence of spring reflects the consequences of selfish behavior.

Nature Reflecting Moral State

The seasons in the story act symbolically. Snow, frost, and hail dominate the garden when the Giant is selfish. When he learns to share, Spring, Summer, and Autumn return, showing that kindness and generosity restore harmony and happiness.

Children as Symbols of Innocence and Love

The children represent innocence, joy, and selfless love. Their presence brings life and beauty to the garden. Wilde suggests that true happiness lies in openness, sharing, and human companionship.

The Little Boy as a Christ Figure

The little boy with wounds on his hands and feet symbolizes Jesus Christ. His suffering reflects sacrifice and divine love. Through this figure, Wilde introduces the idea of Christian redemption, teaching that kindness and repentance lead to spiritual salvation.

The Giant’s Moral Transformation

The Giant’s change from selfishness to generosity marks his moral growth. His repentance and love for children bring inner peace. This transformation reflects the moral journey of human beings toward goodness.

Conclusion

Thus, The Selfish Giant is a moral allegory that teaches the consequences of selfishness and the rewards of love, kindness, and sharing. Oscar Wilde effectively uses symbolism and simple narrative to convey that true happiness and salvation come through compassion and selfless love.

 

"The Selfish Giant" by Oscar Wilde: An Introduction and Summary



Oscar Wilde: novelist, short story writer and playwright

One of the leaders of the ‘Art for Art’s Sake’ movement

·      Novel – ‘The Picture of Dorian Gray’,

·      Comedies ‘The Importance of Being Earnest’ and ‘Lady Windermere's Fan’.

·      Short stories such as ‘The Canterville Ghost’, fairy tales including ‘The Happy Prince’,

·      Poem ‘The Ballad of Reading Gaol’.

"The Selfish Giant" by Oscar Wilde (1854-1900) is a story about a giant who, after returning from a long absence, prevents children from playing in his beautiful garden by building a high wall.

“My own garden is my own garden,” said the Giant; “any one can understand that, and I will allow nobody to play in it but myself.” So he built a high wall all round it, and put up a notice-board.

TRESPASSERS WILL BE PROSECUTED

This action causes his garden to fall into a perpetual winter, even as the seasons change everywhere else. Eventually, he realizes his mistake, lets the children back in, and becomes kind. The story ends with the giant growing old; a special, wounded child leads him to a garden in Paradise, where he dies happily.

  • The giant's selfishness: A selfish giant builds a wall around his garden to keep children out after he finds them playing there.
  • Eternal winter: As a result of his selfishness, the garden is plunged into a perpetual winter, frozen with snow and frost.
  • Return of spring: One day, the giant sees that the children have returned through a hole in the wall, and where they play, spring returns.
  • A changed heart: This sight makes him realize the joy of the children and he breaks down the wall, allowing everyone to play freely. He becomes a kind and loving figure.
  • And the Giant’s heart melted as he looked out. “How selfish I have been!” he said; “now I know why the Spring would not come here. I will put that poor little boy on the top of the tree, and then I will knock down the wall, and my garden shall be the children’s playground for ever and ever.” He was really very sorry for what he had done.
  • A final reward: Years later, the now-elderly giant finds a small boy with nail wounds on his hands and feet in a corner of the garden. The boy explains that the garden he is going to is paradise, and the giant dies peacefully, covered in white blossoms, with the children around him. 

 

·      Allegory

·      Allusion

·      Anthropomorphism

·      Symbols – The Tree, The Child

Wednesday, December 10, 2025

Syntax in Computational Linguistics: Oxford Handbook of Computational Linguistics by Ruslan Mitkov


1) Introduction to Syntax

Syntax is the study of how words join together to make sentences.
It tells us:

·        Who is doing the action (subject)

·        What the action is (verb)

·        Who or what receives the action (object)

Example: John visited Mary.

·        John = subject (doer)

·        visited = verb (action)

·        Mary = object (receiver)

In computational linguistics, this can be written like a small formula:

Visit(John, Mary)

1. Why “Mary visited John” means something different

In coding terms:

Visit(John, Mary)   → John visited Mary

Visit(Mary, John)   → Mary visited John

Even though the words are the same, changing the order changes who is doing the action.

Humans understand this naturally.
Computers need rules (syntax rules) to figure this out.

So, syntax gives the structure of a sentence, helping computers understand language.
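The idea above can be sketched in a few lines of Python. The function name and the dictionary structure are illustrative choices, not a real NLP library:

```python
# A minimal sketch: representing "Visit(John, Mary)" as a predicate
# with ordered arguments, so swapping the order changes the meaning.

def visit(subject, obj):
    """Return a simple predicate-argument structure for 'X visited Y'."""
    return {"predicate": "visit", "subject": subject, "object": obj}

s1 = visit("John", "Mary")   # John visited Mary
s2 = visit("Mary", "John")   # Mary visited John

# Same words, different argument order -> different meanings.
print(s1 == s2)                                   # False
print(s1["subject"], "visited", s1["object"])     # John visited Mary
```

The computer never "guesses" who the doer is: the position of each argument in the structure carries that information, which is exactly what syntax rules must recover from word order.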

 

2) Basic Syntactic Concepts

a) Subject – Predicate Relation

Every sentence has:

·        a predicate = usually the verb

·        arguments = the people or things involved (subject, object)

Different languages show these roles differently:

·        English uses word order

o   John (subject) → visited (verb) → Mary (object)

·        Japanese uses case markers

o   John-ga (subject marker)

o   Mary-o (object marker)

Computers must understand these patterns.

b) Phrase Structure

Words group together into phrases that act like one unit.

Example: the tall boy from the park
This whole group = one noun phrase (NP).

Syntax studies:

·        how phrases are built

·        how they can be expanded

·        how one phrase can contain another

c) Ambiguity

A sentence can have more than one structure → two meanings.

Example:

The man from the school with the flag.

Who has the flag?

·        the school?

·        or the man?

Syntax helps computers detect and solve such ambiguities.
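A tiny illustrative sketch of this ambiguity: the two possible attachments can be written as two different nested structures (nested tuples stand in for parse trees here, a convention assumed for this example):

```python
# Two structures for "the man from the school with the flag".

# Reading 1: the school has the flag
#   [the man [from [the school [with the flag]]]]
parse_school_has_flag = ("man", ("from", ("school", ("with", "flag"))))

# Reading 2: the man has the flag
#   [the man [from the school] [with the flag]]
parse_man_has_flag = ("man", ("from", "school"), ("with", "flag"))

parses = [parse_school_has_flag, parse_man_has_flag]
print(len(parses))  # one word string, 2 distinct structures
```

A parser must build both structures and then use context (or statistics) to pick the intended one.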

 

3) Agreement (Dependency)

Agreement means words must match each other in number, person, gender, etc.

Examples:

This boy is tall. (correct)
These boy is tall. (incorrect: the plural ‘these’ clashes with the singular ‘boy’)

Spotting mismatches like this helps computers check grammatical correctness.

Example of ambiguity:

Flying planes seem/seems dangerous.

·        If we use seem, the subject is “planes” (planes that fly are dangerous).

·        If we use seems, the subject is the activity “flying planes” (the act of flying them is dangerous).

Agreement helps decide the meaning.
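A toy agreement checker makes the idea concrete. The rules below are simplified assumptions (English demonstratives only, with a rough "ends in s" plural test), not a real grammar:

```python
# Toy number-agreement check: "this"/"that" want a singular noun,
# "these"/"those" want a plural one.

SINGULAR_DETS = {"this", "that"}
PLURAL_DETS = {"these", "those"}

def agrees(determiner, noun):
    """Very rough heuristic: treat nouns ending in 's' as plural."""
    plural_noun = noun.endswith("s")
    if determiner in SINGULAR_DETS:
        return not plural_noun
    if determiner in PLURAL_DETS:
        return plural_noun
    return True  # no constraint known for other determiners

print(agrees("this", "boy"))    # True  -> "this boy is tall"
print(agrees("these", "boy"))   # False -> "these boy is tall" is ill-formed
print(agrees("these", "boys"))  # True
```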

 

4) Valency (Subcategorization)

Different verbs need different numbers of arguments.

·        Intransitive = 1 argument

o   He slept.

·        Transitive = 2 arguments

o   She ate an apple.

Computers use valency to check whether a sentence is complete.
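One simple way to model this is a valency lexicon: each verb records how many arguments it needs. The entries below (including the ditransitive "give") are illustrative assumptions:

```python
# Sketch of a valency (subcategorization) lexicon.

VALENCY = {
    "sleep": 1,   # intransitive: He slept.
    "eat":   2,   # transitive:   She ate an apple.
    "give":  3,   # ditransitive: She gave him a book. (added example)
}

def is_complete(verb, arguments):
    """Check whether the verb received the number of arguments it needs."""
    return VALENCY.get(verb) == len(arguments)

print(is_complete("sleep", ["he"]))           # True
print(is_complete("eat", ["she"]))            # False: object is missing
print(is_complete("eat", ["she", "apple"]))   # True
```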

 

5) Embedding and Long-Distance Dependency

One sentence can be inside another sentence.

Example:

The girl that John visited left.

The embedded part that John visited depends on the main sentence.

Deep embedding is difficult for both humans and machines:

The man who said that the woman who knew the teacher who criticized the scholar left early…

Syntax tells computers how to track long-distance relations correctly.
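Embedding is naturally recursive, so a few lines of recursive Python can mimic it. The wording and function names here are assumed for illustration:

```python
# A minimal recursive sketch of embedding: each relative clause sits
# inside a noun phrase, as in 'the girl that John visited left'.

def relative_np(noun, clause):
    """Noun phrase containing an embedded clause: 'the N that CLAUSE'."""
    return f"the {noun} that {clause}"

# Depth 1:
s1 = relative_np("girl", "John visited") + " left"
print(s1)  # the girl that John visited left

# Depth 2: the embedded clause itself contains an embedded clause.
s2 = relative_np("man", relative_np("girl", "John visited") + " knew") + " left early"
print(s2)  # the man that the girl that John visited knew left early
```

Each extra level of nesting is easy for the program to build but increasingly hard for a human (or parser) to unwind, which is exactly the long-distance dependency problem.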

 

6) Conclusion

Syntax is the architecture behind meaningful sentences.

It explains:

  • how verbs control arguments
  • how phrases combine
  • how agreement keeps grammar correct
  • how ambiguity arises
  • how computers can resolve and understand sentences

Modern NLP prefers feature-based and dependency-based models. Without syntax, computers cannot understand or produce meaningful sentences—they can only list words.

Morphology in Computational Linguistics: Ruslan Mitkov's Oxford Handbook of Computational Linguistics



1) Introduction

Ruslan Mitkov, a professor of Computing and Communications at Lancaster University, has edited ‘The Oxford Handbook of Computational Linguistics’, which discusses how morphology, syntax, semantics, and pragmatics can be applied in NLP and Computational Linguistics.

Morphology is the study of the internal structure of words and the meaningful units that compose them. These units, called morphemes, may be roots, prefixes, suffixes, or grammatical markers that modify meaning or function.

1. Root (Base word):

·        teach → the core meaning is “to instruct.”

2. Prefix:

·        un + happy → unhappy (prefix un- adds the meaning “not”).

3. Suffix:

·        quick + -ly → quickly (suffix -ly changes an adjective into an adverb).

4. Grammatical marker (inflection):

·        walk + -ed → walked (suffix -ed marks past tense).

In computational linguistics, the knowledge of morphology becomes crucial because computers must not only process whole words but also understand how words are formed.

Computational morphology applies techniques of computer science, algorithms, linguistics, and artificial intelligence to automatically analyse word forms (break words into components) and generate them (construct surface words from grammatical features) in natural languages. Let’s take an example:

  A computational system takes the word “unhappiness” and automatically analyses it as:

·        un- (prefix meaning “not”)

·        happy (root)

·        -ness (suffix forming a noun)

   

   The same system can also generate a correct surface word. For example, given the features:

·        ROOT: happy

·        PREFIX: un

·        SUFFIX: ness

It will automatically construct the word “unhappiness”.
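The analyse/generate pair for this example can be sketched as toy Python code. The affix tables below are tiny illustrative stand-ins, not a real lexicon:

```python
# Toy analyser/generator for the 'unhappiness' example.

PREFIXES = {"un": "not"}
SUFFIXES = {"ness": "noun-forming"}
ROOTS = {"happy"}

def analyse(word):
    """Split a word into (prefix, root, suffix) if all pieces are known."""
    for p in PREFIXES:
        for s in SUFFIXES:
            if word.startswith(p) and word.endswith(s):
                stem = word[len(p):len(word) - len(s)]      # e.g. 'happi'
                # undo the y -> i spelling change: 'happi' -> 'happy'
                root = stem[:-1] + "y" if stem.endswith("i") else stem
                if root in ROOTS:
                    return (p, root, s)
    return None

def generate(prefix, root, suffix):
    """Build the surface word, applying the y -> i spelling change."""
    stem = root[:-1] + "i" if root.endswith("y") else root
    return prefix + stem + suffix

print(analyse("unhappiness"))            # ('un', 'happy', 'ness')
print(generate("un", "happy", "ness"))   # unhappiness
```

Note that the two functions are mirror images: analysis undoes exactly the spelling change that generation applies.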

 

This is essential for tasks like machine translation, spell checkers, search engines, speech recognition, document indexing, corpus annotation, and text-to-speech software.

Languages differ greatly in morphology. Isolating languages like Chinese use little affixation, whereas complex languages like Turkish or Finnish contain long words formed from many morphemes. Thus, computational systems must handle diverse patterns of word formation, making morphology a core study area in language technology.

2) Overview of Morphology

Morphology studies how different morphemes combine to form complex words. These morphemes can be:

Free morphemes (can stand alone): book, run, chair

Bound morphemes (cannot stand alone): -ing, -ed, un-, -s

Morphological processes include:

A. Inflection

Changes grammatical properties (tense, number, case) without changing category:

play → played, book → books

B. Derivation

Creates new words or categories:

happy → happiness, teach → teacher

C. Compounding

Joining two free morphemes:

blackboard, sunflower

Some languages add a prefix, suffix, infix, or circumfix (Arabic); others show zero morphology (sheep → sheep) or subtractive morphology (Spanish hermano → hermanita). The complexity of these processes requires computers to learn or model many rules for correct analysis and generation.

3) Structure & Ambiguity in Morphology

Ambiguity is one of the greatest challenges in both analysis and generation in natural language processing. Words can be morphologically unclear, meaning one surface form can have multiple analyses. For example:

·      Second (English) can function as a noun or an ordinal number

·      Okuma (Turkish) can mean reading, don’t read, or to my arrow, depending on morpheme boundaries.

Computational morphology needs to decide the correct meaning based on context. This requires identifying:

·        Correct root or stem

·        Proper affix boundaries

·        Grammatical features (tense, mood, case, etc.)

4) Computational Morphology (Very Simple Explanation)

Computational morphology is about teaching computers how to understand and create different word forms.

Morphological Analysis (Breaking a word)

The computer takes a full word (surface form) and breaks it into:

Root word, Grammatical information (features)

Example:

walked → walk + PAST

(“walked” is the surface form, “walk” is the root, “PAST” is the tense)

Morphological Generation (Making a word)

The computer starts with:

Root word, Features (tense, number, person, etc.)

…and creates the correct surface word.

Example:

walk + 3rd person + present → walks

To do this correctly, two things are needed:

A. Morphotactics (Order of morphemes)

These are rules about which pieces of words can join together and in what order.

Example:

In English, you can add -ed after a verb, but you cannot say edwalk.

B. Morphophonemics (also called Morphographemics)

These are the spelling or sound changes that happen when affixes attach.

Examples:

carry + ed → carried (y changes to i)

make + ing → making (drop the e)

In short:

A computational morphology system must understand:

·      Which parts can combine (morphotactics), and

·      How spelling/sound changes happen (morphophonemics).

Only then can a computer correctly break words apart or form new ones.
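The two spelling rules mentioned above (carry + ed → carried, make + ing → making) can be sketched as a small Python function. The rules are simplified assumptions and cover only these patterns:

```python
# Toy morphophonemic (spelling) rules applied when a suffix attaches.

VOWELS = "aeiou"

def attach(stem, suffix):
    # Rule 1: final 'y' after a consonant becomes 'i'
    #         (carry + ed -> carried), unless the suffix starts with 'i'.
    if stem.endswith("y") and len(stem) > 1 and stem[-2] not in VOWELS \
            and not suffix.startswith("i"):
        stem = stem[:-1] + "i"
    # Rule 2: drop silent 'e' before a vowel-initial suffix
    #         (make + ing -> making).
    elif stem.endswith("e") and suffix[0] in VOWELS:
        stem = stem[:-1]
    return stem + suffix

print(attach("carry", "ed"))   # carried
print(attach("make", "ing"))   # making
print(attach("walk", "ed"))    # walked (no rule fires)
```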

5) Finite-State Morphology

A Finite-State Transducer (FST) is a simple computer tool used to convert:

  • Lexical level (root + grammar features)
    walk + PAST

into

  • Surface level (the actual word)
    walked

Why FSTs are useful

  • Very fast
  • Can both analyse and generate words
  • Store rules in a small, compact way
  • Handle morpheme order (morphotactics) and
  • Handle spelling changes (morphographemics)
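A real FST toolkit is more involved, but the core idea can be sketched as a transition table. The table below is a toy illustration (states and symbols assumed), mapping the lexical level to the surface level:

```python
# Minimal finite-state transducer sketch for walk + PAST -> walked.

# Each transition: (state, lexical symbol) -> (next state, surface output)
TRANSITIONS = {
    (0, "walk"):  (1, "walk"),
    (1, "+PAST"): (2, "ed"),
    (1, "+3SG"):  (2, "s"),
}
FINAL_STATES = {2}

def transduce(lexical_symbols):
    """Map the lexical level to the surface level, or reject the input."""
    state, surface = 0, ""
    for sym in lexical_symbols:
        if (state, sym) not in TRANSITIONS:
            return None  # no path through the machine: form rejected
        state, output = TRANSITIONS[(state, sym)]
        surface += output
    return surface if state in FINAL_STATES else None

print(transduce(["walk", "+PAST"]))  # walked
print(transduce(["walk", "+3SG"]))   # walks
```

Because every transition pairs a lexical symbol with a surface string, the same table can be read in reverse for analysis (walked → walk + PAST), which is why FSTs can both analyse and generate.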

6) Handling Morphotactics (Allowed Word Building Rules)

Morphotactics = rules about which morphemes can join together.

Example:

·        Correct: dog → dogs

·        Incorrect: sheeps, boyses → these must be blocked

In FSTs:

  • Each word type (noun, verb, adjective) has its own small dictionary called a sub-lexicon.
  • These sub-lexicons say what is allowed:
    • nouns → can take plural
    • adjectives → cannot take plural

So morphotactics keeps word formation legal and grammatical.
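A sub-lexicon of this kind can be sketched as a small dictionary. The entries are illustrative assumptions; the point is that the lexicon, not a general rule, decides which forms are legal:

```python
# Toy sub-lexicon: each entry records whether (and how) the word
# may take the plural, so 'dogs' is accepted and 'sheeps' is blocked.

SUB_LEXICON = {
    "dog":   {"class": "noun", "plural": "dogs"},
    "sheep": {"class": "noun", "plural": "sheep"},   # zero plural
    "tall":  {"class": "adjective"},                 # no plural allowed
}

def pluralize(word):
    entry = SUB_LEXICON.get(word)
    if entry is None or "plural" not in entry:
        return None  # blocked: no legal plural form
    return entry["plural"]

print(pluralize("dog"))    # dogs
print(pluralize("sheep"))  # sheep (never 'sheeps')
print(pluralize("tall"))   # None: adjectives cannot take plural
```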

Conclusion

Computational morphology helps computers:

  • understand how words are built
  • know the meaning of different word parts
  • choose the correct word form
  • handle tasks like translation, speech, and text search

By using finite-state methods, rule systems, and modern machine-learning models, computational morphology keeps improving.
This makes language technology more accurate, faster, and better connected to real linguistic knowledge.

 

What is Computational Linguistics? An Introduction by Ralph Grishman


1. Introduction

Ralph Grishman, a professor of computer science at New York University, has written the book 'Computational Linguistics: An Introduction'; in its first chapter he discusses the basic nature and functions of Computational Linguistics.

  • Linguistics = the scientific study of how language works (how we speak, understand, and create meaning).
  • Computational = using computers, algorithms, and programs to solve problems.

When we combine these two, we get:

Computational Linguistics (CL)

It is the subject where computers are used to study, understand, and produce human language.

It mixes:

  • linguistics
  • computer science
  • artificial intelligence (AI)
  • machine learning
  • psychology
  • engineering

Examples you use daily:

  • Google Translate
  • Siri, Alexa, Google Assistant
  • Chatbots
  • Grammar checkers
  • Search engines

Computational linguistics tries to make computers read, write, listen, speak, and respond like humans.

2. Objectives of Computational Linguistics

Computational linguistics mainly wants to make computers understand natural language.

a) Machine Translation (MT)

Computers translating one language into another
(e.g., Hindi → English).
Earlier it was difficult, but tools like Google Translate are improving.

b) Information Retrieval (IR)

Finding the right information from large data
(e.g., Google Search).
Perfect accuracy is hard because language is complex.

c) Man–Machine Interfaces

Talking to computers in normal language
(e.g., chatbots, voice assistants).
Even if you speak imperfectly, the system tries to reply helpfully.

CL also builds tools to test grammar rules and understand how humans process language. It led to new fields like cognitive science and knowledge representation.

3. Computational vs. Theoretical Linguistics

Theoretical Linguistics

  • Studies grammar rules.
  • Makes theories about how language works.

Computational Linguistics

  • Tests these grammar rules in real computer programs.
  • Focuses on building working systems, not just theories.

A rule that works in theory may fail when a computer tries to use it.

4. Computational Linguistics as Engineering

CL is also engineering because it creates real tools.

Important methods:

  • Modularity: break language into small parts (sound, words, meaning).
  • Simple models first: easy to update and expand.
  • Understand paraphrases: many sentences can have the same meaning.
  • Focus on sentences: main unit of communication.
  • Translation through meaning: computers must convert natural language into a form they can understand.

It combines:
science + engineering + AI + linguistics
to build powerful language systems.

5. Main Structure of CL

Computational Linguistics

1.   Language Analysis

o   Sentence Analysis

§  Syntax (grammar)

§  Semantics (meaning)

o   Discourse Analysis (bigger text: paragraphs, conversation)

2.   Language Generation

o   Making the computer produce meaningful language.

6. Conclusion

Computational linguistics is essential today because it helps us talk to machines using normal language.

Applications include:

  • translators
  • search engines
  • chatbots
  • academic tools
  • writing assistants
  • speech-based AI (Alexa, Siri)

It helps us understand human language better and build smart systems that learn from data.

Computational linguistics is the future of how humans and machines communicate.


Points to Ponder

1. Natural Language Processing (NLP)

A part of computational linguistics that teaches computers to understand everyday language.
Used in:

  • grammar correction
  • spam filters
  • chatbot answers
  • summarizing text

2. Machine Translation

Automatic translation between languages.
Uses deep learning and neural networks to improve quality.

3. Speech Recognition

Converts spoken words into text or actions.
Used in:

  • Siri
  • Alexa
  • Google Assistant

Computers must understand accents, speed, and background noise.

4. Corpus Linguistics

Studying large collections of language data (corpora).
Used in:

  • dictionary making
  • language teaching
  • NLP tools
  • translation systems

It helps computers learn real-life language patterns.

5. AI and Linguistics

AI helps computers learn language patterns.
Linguistics gives rules of grammar and meaning.
Together they create:

  • chatbots
  • translation tools
  • predictive typing
  • voice assistants

 
