Lex is a program designed to generate scanners, also known as tokenizers, which recognize lexical patterns in text. Lex is an acronym that stands for "lexical analyzer generator." It is intended primarily for Unix-based systems. The code for Lex was originally developed by Eric Schmidt and Mike Lesk.
Lex can perform simple transformations by itself, but its main purpose is to facilitate lexical analysis: the processing of character sequences, such as source code, to produce sequences of symbols called tokens for use as input to other programs, such as parsers. Lex is therefore typically used together with a parser generator. It is easy, for example, to interface Lex with Yacc, a program that generates parser code in the C programming language.
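As a sketch of how this works, a minimal Lex specification pairs regular expressions with C action code; the generated scanner stores each matched lexeme in yytext. The token names below (NUMBER, WORD) are illustrative, not part of Lex itself; yylex, yytext, and yywrap belong to the standard Lex interface.

```lex
%{
#include <stdio.h>
%}

%%
[0-9]+       { printf("NUMBER: %s\n", yytext); }  /* one or more digits   */
[A-Za-z]+    { printf("WORD: %s\n", yytext);   }  /* one or more letters  */
[ \t\n]      ;                                    /* skip whitespace      */
.            { printf("OTHER: %s\n", yytext);  }  /* any other character  */
%%

int main(void) {
    yylex();        /* scan standard input until EOF */
    return 0;
}

int yywrap(void) {  /* no further input files */
    return 1;
}
```

Processed with lex (or flex) and compiled with a C compiler, the resulting program reads text from standard input and prints one line per token; in a Lex/Yacc pairing, the actions would instead return token codes to the parser.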
The original Lex is proprietary, but open-source versions are available. These include Flex, an acronym for "fast lexical analyzer generator," a free reimplementation written independently of the original code, as well as the versions of Lex released as components of OpenSolaris and Plan 9.