A lexer takes an input character stream and converts it into tokens.
This can serve a variety of purposes: you could apply transformations to the lexemes for simple text processing and manipulation.
Or the stream of lexemes can be fed to a parser, which converts it into a parse tree.
If the goal is compilation, then lexical analysis is the first step. Think of it as the lower-level step that takes characters and converts them into tokens. The parser is a higher-level mechanism whose alphabet consists of tokens (produced by the lexer), from which it builds a parse tree.
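
As a concrete illustration, here is a minimal lexer sketch in Python for simple arithmetic expressions. The token names (NUMBER, IDENT, OP) and the tokenize() helper are illustrative choices for this sketch, not a standard API:

```python
import re

# Each token kind is paired with a regex; alternatives are tried in order.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),            # integer literals
    ("IDENT",  r"[A-Za-z_]\w*"),   # identifiers
    ("OP",     r"[+\-*/=()]"),     # single-character operators
    ("SKIP",   r"\s+"),            # whitespace, discarded
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    """Convert a character stream into a list of (kind, lexeme) tokens."""
    tokens = []
    pos = 0
    while pos < len(text):
        match = MASTER_RE.match(text, pos)
        if match is None:
            raise SyntaxError(f"Unexpected character {text[pos]!r} at position {pos}")
        if match.lastgroup != "SKIP":
            tokens.append((match.lastgroup, match.group()))
        pos = match.end()
    return tokens

print(tokenize("total = price * 3"))
# [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '*'), ('NUMBER', '3')]
```

A parser would then consume this token list as its alphabet, matching grammar rules against token kinds rather than raw characters.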
If the goal is text manipulation, then manipulation rules can be applied to the lexemes themselves.
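
For example, a simple manipulation rule might rewrite one kind of lexeme while leaving the rest untouched. This sketch reuses the hypothetical tokenize() helper above to uppercase every identifier:

```python
def rename_idents(text):
    """Apply a manipulation rule directly to the lexemes: uppercase identifiers."""
    pieces = []
    for kind, lexeme in tokenize(text):
        pieces.append(lexeme.upper() if kind == "IDENT" else lexeme)
    return " ".join(pieces)

print(rename_idents("total = price * 3"))  # TOTAL = PRICE * 3
```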