<div dir="auto">I'm just curious. What problems in kernel involves parsing?</div><div class="gmail_extra"><br><div class="gmail_quote">On Feb 2, 2018 5:01 AM, <<a href="mailto:valdis.kletnieks@vt.edu">valdis.kletnieks@vt.edu</a>> wrote:<br type="attribution"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">On Thu, 01 Feb 2018 11:37:26 -0500, Aruna Hewapathirane said:<br>
<br>
> Some things are not so obvious, like what could possibly be a *.y file or<br>
> *.tc file? If you type in find -name "*.y", in my case I see:<br>
><br>
> aruna@debian:~/linux-4.15$ find -name "*.y"<br>
<br>
> Now if I pass that to the 'file' command ...<br>
><br>
> aruna@debian:~/linux-4.15$ file `find -name "*.y"` // yes you need those<br>
> back ticks :)<br>
><br>
> ./drivers/scsi/aic7xxx/aicasm/aicasm_macro_gram.y: C source, ASCII text<br>
<br>
> So 'file' tells us these are C program files? Let's verify. If you 'cat'<br>
> any of these files you will see it is actual C code. Why do they have a<br>
> file extension of .y?<br>
<br>
Actually, if you look more closely at it, it's *not* C code. The 'file'<br>
command just makes its best guess, keying off things like '#include', etc.<br>
<br>
The 'tl;dr' answer is "The *.y files are input files for bison, and the *.l<br>
files are input files for flex".<br>
<br>
The more detailed explanation, with 50 years of computer history....<br>
<br>
[/usr/src/linux-next] head -20 tools/perf/util/expr.y<br>
/* Simple expression parser */<br>
%{<br>
#include "util.h"<br>
#include "util/debug.h"<br>
#define IN_EXPR_Y 1<br>
#include "expr.h"<br>
#include "smt.h"<br>
#include <string.h><br>
<br>
#define MAXIDLEN 256<br>
%}<br>
<br>
%pure-parser<br>
%parse-param { double *final_val }<br>
%parse-param { struct parse_ctx *ctx }<br>
%parse-param { const char **pp }<br>
%lex-param { const char **pp }<br>
<br>
%union {<br>
double num;<br>
<br>
That's got a bunch of % in unusual places for C, doesn't it? :)<br>
<br>
Let's hit the rewind button back five decades or so, to when tools for building<br>
programs were just being created. Everybody who wanted to write a compiler for<br>
a language, or to parse data that wasn't in strict 'ID is in columns 5-12'<br>
formatting, or to do a whole bunch of other stuff, had to write their own<br>
parser to do the parsing.<br>
<br>
For those who have never done it, writing lexical scanners and parsers by hand<br>
is a thankless job. I know from experience that the parse table for an LALR<br>
parser for Pascal ends up being essentially a spreadsheet with some 300 rows<br>
and 400 columns that you get to fill in by hand, one entry at a time - and<br>
getting a single entry wrong means you have a buggy compiler. (I took Compiler<br>
Design in college the last year we had to do it by hand.)<br>
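<br>
(To make "fill it in by hand" concrete, here is a toy C sketch of my own - *not*<br>
the actual Pascal table, and every name in it is made up for illustration - of a<br>
hand-filled transition table driving a trivial scanner. Every cell of<br>
next_state[][] was typed in by a human; mistype one and the scanner quietly<br>
misclassifies its input, which is exactly the tedium the generators remove.)<br>
<br>
#include <stdio.h><br>
#include <ctype.h><br>
<br>
enum state  { START, IN_IDENT, IN_NUMBER, BAD, NSTATES };<br>
enum cclass { LETTER, DIGIT, OTHER, NCLASSES };<br>
<br>
/* next_state[current state][class of next character] - filled in by hand */<br>
static const enum state next_state[NSTATES][NCLASSES] = {<br>
    /*             LETTER     DIGIT      OTHER */<br>
    /* START  */ { IN_IDENT,  IN_NUMBER, BAD },<br>
    /* IDENT  */ { IN_IDENT,  IN_IDENT,  BAD },<br>
    /* NUMBER */ { BAD,       IN_NUMBER, BAD },<br>
    /* BAD    */ { BAD,       BAD,       BAD },<br>
};<br>
<br>
static enum cclass classify(int c)<br>
{<br>
    if (isalpha(c) || c == '_')<br>
        return LETTER;<br>
    if (isdigit(c))<br>
        return DIGIT;<br>
    return OTHER;<br>
}<br>
<br>
int main(void)<br>
{<br>
    const char *samples[] = { "counter1", "12345", "12abc" };<br>
<br>
    for (int i = 0; i < 3; i++) {<br>
        enum state s = START;<br>
        const char *p;<br>
<br>
        /* run the finite state machine over each sample string */<br>
        for (p = samples[i]; *p; p++)<br>
            s = next_state[s][classify((unsigned char)*p)];<br>
<br>
        printf("%-10s -> %s\n", samples[i],<br>
               s == IN_IDENT  ? "identifier" :<br>
               s == IN_NUMBER ? "number" : "error");<br>
    }<br>
    return 0;<br>
}<br>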
<br>
The first few compiled languages (COBOL, FORTRAN, and a few others) also had to<br>
make do with hand-coded parsers. And then in 1958, Algol happened, and it<br>
spawned all sorts of languages - everything from C to PL/I to Pascal and<br>
probably 200+ others (pretty much every language that allows nested<br>
declarations and has begin/end tokens of some sort owes it to Algol). And the<br>
other thing about Algol was that it was a much "bigger" language than previous<br>
ones, so John Backus invented a meta-language called BNF to provide a formal<br>
specification of the syntax.<br>
<br>
(For those who are curious, an EBNF specification for Pascal syntax is here:<br>
<a href="http://www.fit.vutbr.cz/study/courses/APR/public/ebnf.html" rel="noreferrer" target="_blank">http://www.fit.vutbr.cz/study/courses/APR/public/ebnf.html</a>)<br>
<br>
The interesting thing about BNF is that it has these things called "production<br>
rules" which define what legal programs look like - and the test for "legal"<br>
can be done with a parser using a software/mathematical construct called a<br>
"finite state machine" (and the 3 of you who understand the difference between<br>
a context-sensitive grammar and a context-free grammar can stop giggling right<br>
now.. ;)<br>
<br>
So somebody had the bright idea that if you had a formal BNF specification, you<br>
could write a program that would read the BNF and spit out the C source for a<br>
parser skeleton based on a finite state machine. And thus were born two<br>
programs: 'lex' (a lexical-scanner generator - it spits out code that reads the<br>
source and says "Hey, that's the keyword 'struct'" or "we just saw a 'for'"),<br>
and 'yacc' (Yet Another Compiler Compiler), which did the higher-level "this is<br>
a legal function, but *that* right there is a messed-up 'if' statement with a<br>
syntax error" stuff. Oh, and generated output code, too.<br>
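<br>
(To make that split concrete, here is a complete toy grammar of my own - call it<br>
calc.y; it is *not* from the kernel tree - with a hand-written yylex() standing<br>
in for what flex would normally generate. The rules between the two %% markers<br>
are exactly the BNF-style production rules described above, with C actions<br>
attached.)<br>
<br>
/* calc.y - toy four-function calculator, one expression per line */<br>
%{<br>
#include <stdio.h><br>
#include <ctype.h><br>
<br>
int yylex(void);<br>
void yyerror(const char *msg);<br>
%}<br>
<br>
%token NUMBER<br>
%left '+' '-'<br>
%left '*' '/'<br>
<br>
%%<br>
<br>
input:    /* empty */<br>
        | input line<br>
        ;<br>
<br>
line:     '\n'<br>
        | expr '\n'        { printf("= %d\n", $1); }<br>
        ;<br>
<br>
expr:     NUMBER<br>
        | expr '+' expr    { $$ = $1 + $3; }<br>
        | expr '-' expr    { $$ = $1 - $3; }<br>
        | expr '*' expr    { $$ = $1 * $3; }<br>
        | expr '/' expr    { $$ = $1 / $3; }<br>
        ;<br>
<br>
%%<br>
<br>
/* Hand-written stand-in for a flex-generated scanner: read input, decide<br>
 * what the next token is, and hand it to the parser. */<br>
int yylex(void)<br>
{<br>
    int c = getchar();<br>
<br>
    while (c == ' ' || c == '\t')<br>
        c = getchar();<br>
<br>
    if (isdigit(c)) {<br>
        ungetc(c, stdin);<br>
        if (scanf("%d", &yylval) != 1)<br>
            return 0;<br>
        return NUMBER;          /* "hey, that's a number" */<br>
    }<br>
    if (c == EOF)<br>
        return 0;               /* tells yyparse() the input has ended */<br>
    return c;                   /* '+', '*', '\n', ... pass through as-is */<br>
}<br>
<br>
void yyerror(const char *msg)<br>
{<br>
    fprintf(stderr, "%s\n", msg);<br>
}<br>
<br>
int main(void)<br>
{<br>
    return yyparse();<br>
}<br>
<br>
Something like "bison calc.y && cc -o calc calc.tab.c" should turn that grammar<br>
into a working parser, which is the whole point: you write the rules, and bison<br>
writes the state machine.<br>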
<br>
Of course, that was decades ago, and eventually somebody wrote a faster 'lex' -<br>
and thus was born /usr/bin/flex. And yacc needed work too, so the improved<br>
version was, of course, called bison (think about it for a bit...).<br>
<br>
<br>
<br></blockquote></div></div>