Character (computing)



In computing, a character is a unit of text that is treated as indivisible: a letter, a word space, a punctuation mark, a digit, a Chinese character, and so on. "Indivisible" here refers to how the text is represented in a computer's memory. A computer stores the word "ice" as three separate characters, "i", "c", and "e", but it does not store the dot of the "i" separately from its stem.
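As a rough illustration (a minimal sketch in Python, chosen here only as an example language), a string can be examined character by character:

 word = "ice"
 print(list(word))   # ['i', 'c', 'e'] -- three separate characters
 print(len(word))    # 3 -- the dot and stem of "i" are not counted separately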

As an example, this sentence contains 111 characters: 79 letters, 6 punctuation marks, 9 digits, and 17 spaces.
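The counts above can be checked mechanically; the following Python sketch assumes that only the marks actually appearing in the sentence (commas, the colon, and the full stop) are counted as punctuation:

 sentence = ("As an example, this sentence contains 111 characters: "
             "79 letters, 6 punctuation marks, 9 digits, and 17 spaces.")
 print(sum(c.isalpha() for c in sentence))   # 79 letters
 print(sum(c in ",.:" for c in sentence))    # 6 punctuation marks
 print(sum(c.isdigit() for c in sentence))   # 9 digits
 print(sentence.count(" "))                  # 17 spaces
 print(len(sentence))                        # 111 characters in total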

In the character encodings most commonly used for English text, such as ASCII, each character occupies one byte of memory.
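For example, encoding an English word in ASCII produces one byte per character (again a Python sketch; ASCII is used here as one common single-byte encoding):

 text = "ice"
 encoded = text.encode("ascii")
 print(encoded)        # b'ice'
 print(len(encoded))   # 3 bytes for 3 characters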

See also glyph.