/* Callgraph based interprocedural optimizations.
   Copyright (C) 2003, 2004, 2005, 2007 Free Software Foundation, Inc.
   Contributed by Jan Hubicka

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 3, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */

/* This module implements the main driver of the compilation process as well
   as a few basic interprocedural optimizers.

   The main purpose of this file is to act as an interface between the
   tree-based front ends and the back end (and middle end).

   The front end is supposed to use the following functionality:

    - cgraph_finalize_function

      This function is called once the front end has parsed the whole body of
      a function and it is certain that neither the function body nor the
      declaration will change.

      (There is one exception needed for implementing GCC extern inline
      functions.)

    - cgraph_varpool_finalize_variable

      This function has the same behavior as the above but is used for static
      variables.

    - cgraph_finalize_compilation_unit

      This function is called once the compilation unit is finalized and will
      no longer change.

      In unit-at-a-time mode the call-graph construction and local function
      analysis take place here.  Bodies of unreachable functions are released
      to conserve memory usage.

      ???  The compilation unit in this point of view should be the
      compilation unit as defined by the language - for instance the C front
      end allows multiple compilation units to be parsed at once and it
      should call this function each time parsing is done so we save memory.

    - cgraph_optimize

      In unit-at-a-time compilation the intraprocedural analysis takes place
      here.  In particular, static functions whose address is never taken are
      marked as local.  The back end can then use this information to modify
      calling conventions, do better inlining, or perform similar
      optimizations.

    - cgraph_assemble_pending_functions
    - cgraph_varpool_assemble_pending_variables

      In non-unit-at-a-time mode these functions can be used to force
      compilation of functions or variables that are known to be needed at a
      given stage of compilation.

    - cgraph_mark_needed_node
    - cgraph_varpool_mark_needed_node

      When a function or variable is referenced in some hidden way (for
      instance via assembly code and marked by the attribute "used"), the
      call-graph data structure must be updated accordingly by these
      functions.

    - analyze_expr callback

      This function is responsible for lowering tree nodes not understood by
      generic code into understandable ones, or alternatively for marking
      callgraph and varpool nodes referenced by them as needed.

      ??? On tree-ssa, genericizing should take place here and we would avoid
      the need for these hooks (replacing them by a genericizing hook).

    - expand_function callback

      This function is used to expand a function and pass it to the RTL back
      end.  The front end should not make any assumptions about when this
      function can be called.  In particular, cgraph_assemble_pending_functions,
      cgraph_varpool_assemble_pending_variables, cgraph_finalize_function,
      cgraph_varpool_finalize_function and cgraph_optimize can cause arbitrary
      previously finalized functions to be expanded.

    We implement two compilation modes.

      - unit-at-a-time:  In this mode analysis of all functions is deferred
        to cgraph_finalize_compilation_unit and expansion to cgraph_optimize.

        In cgraph_finalize_compilation_unit the reachable functions are
        analyzed.  During analysis the call-graph edges from reachable
        functions are constructed and their destinations are marked as
        reachable.  References to functions and variables are discovered too,
        and variables found to be needed are output to the assembly file.
        Via the mark_referenced call in assemble_variable, functions
        referenced by static variables are noticed too.

        The intraprocedural information is produced and its existence is
        indicated by global_info_ready.  Once this flag is set it is
        impossible to change a function from !reachable to reachable and thus
        assemble_variable no longer calls mark_referenced.

        Finally the call-graph is topologically sorted and all reachable
        functions that have not been completely inlined or are not external
        are output.

        ??? It is possible that a reference to a function or variable is
        optimized out.  We cannot deal with this nicely because the
        topological order is not suitable for it.  For tree-ssa we may
        consider another pass doing optimization and re-discovering reachable
        functions.

        ??? Reorganize the code so variables are output very last and only if
        they really have been referenced by the produced code, so we catch
        more cases where the reference has been optimized out.

      - non-unit-at-a-time

        All functions and variables are output as early as possible to
        conserve memory consumption.  This may or may not result in less
        memory used, but it is still needed for some legacy code that relies
        on a particular ordering of things output from the compiler.

        Varpool data structures are not used and variables are output
        directly.

        Functions are output early using a call to
        cgraph_assemble_pending_functions from cgraph_finalize_function.  The
        decision on whether a function is needed is made more conservative,
        so uninlinable static functions are needed too.  During the
        call-graph construction the edge destinations are not marked as
        reachable, and marking them is left entirely to assemble_variable.  */

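/* Illustrative sketch, not part of GCC: a front end using the interface
   described above would, roughly, finalize each function as soon as its
   body has been fully parsed and then hand the whole unit over:

       tree decl;
       FOR_EACH_PARSED_FNDECL (decl)        <- hypothetical front-end iterator
         cgraph_finalize_function (decl, false);

       cgraph_finalize_compilation_unit ();
       cgraph_optimize ();

   The iterator above is made up for illustration only; the three entry
   points are the ones documented in the comment above.  */
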
#include "config.h"
142
#include "system.h"
143
#include "coretypes.h"
144
#include "tm.h"
145
#include "tree.h"
146
#include "rtl.h"
147
#include "tree-flow.h"
148
#include "tree-inline.h"
149
#include "langhooks.h"
150
#include "pointer-set.h"
151
#include "toplev.h"
152
#include "flags.h"
153
#include "ggc.h"
154
#include "debug.h"
155
#include "target.h"
156
#include "cgraph.h"
157
#include "diagnostic.h"
158
#include "timevar.h"
159
#include "params.h"
160
#include "fibheap.h"
161
#include "c-common.h"
162
#include "intl.h"
163
#include "function.h"
164
#include "ipa-prop.h"
165
#include "tree-gimple.h"
166
#include "tree-pass.h"
167
#include "output.h"
168
 
169
static void cgraph_expand_all_functions (void);
170
static void cgraph_mark_functions_to_output (void);
171
static void cgraph_expand_function (struct cgraph_node *);
172
static tree record_reference (tree *, int *, void *);
173
static void cgraph_output_pending_asms (void);
174
static void cgraph_increase_alignment (void);
175
 
176
/* Lists all assembled variables to be sent to debugger output later on.  */
177
static GTY(()) struct cgraph_varpool_node *cgraph_varpool_assembled_nodes_queue;
178
 
179
/* Records tree nodes seen in record_reference.  Simply using
   walk_tree_without_duplicates doesn't guarantee each node is visited
   once because it gets a new htab upon each recursive call from
   record_reference itself.  */
static struct pointer_set_t *visited_nodes;
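/* Illustrative sketch, not part of GCC, of why one shared pointer set is
   used: walking two initializers with the same set visits a node reachable
   from both of them only once:

       struct pointer_set_t *pset = pointer_set_create ();
       walk_tree (&DECL_INITIAL (a), record_reference, NULL, pset);
       walk_tree (&DECL_INITIAL (b), record_reference, NULL, pset);
       pointer_set_destroy (pset);

   walk_tree_without_duplicates would instead allocate a fresh table for
   every (possibly recursive) call, defeating the purpose.  A and B stand
   for two arbitrary VAR_DECLs.  */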
 
185
static FILE *cgraph_dump_file;
186
 
187
/* Determine if function DECL is needed.  That is, visible to something
   either outside this translation unit, something magic in the system
   configury, or (if not doing unit-at-a-time) to something we haven't
   seen yet.  */
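/* Illustrative examples, not part of GCC, of declarations this predicate
   treats as needed:

       int main (void) { return 0; }                    MAIN_NAME_P and public
       void api_entry (void) { }                        public, not COMDAT or external
       static void from_asm (void) __attribute__ ((used));

   together with static constructors and destructors, whereas a plain static
   function that is only called directly may be deferred (and dropped
   entirely if it ends up inlined everywhere).  */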
 
192
static bool
193
decide_is_function_needed (struct cgraph_node *node, tree decl)
194
{
195
  tree origin;
196
  if (MAIN_NAME_P (DECL_NAME (decl))
197
      && TREE_PUBLIC (decl))
198
    {
199
      node->local.externally_visible = true;
200
      return true;
201
    }
202
 
203
  /* If the user told us it is used, then it must be so.  */
204
  if (node->local.externally_visible)
205
    return true;
206
 
207
  if (!flag_unit_at_a_time && lookup_attribute ("used", DECL_ATTRIBUTES (decl)))
208
    return true;
209
 
210
  /* ??? If the assembler name is set by hand, it is possible to assemble
211
     the name later after finalizing the function and the fact is noticed
212
     in assemble_name then.  This is arguably a bug.  */
213
  if (DECL_ASSEMBLER_NAME_SET_P (decl)
214
      && TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)))
215
    return true;
216
 
217
  /* If we decided it was needed before, but at the time we didn't have
218
     the body of the function available, then it's still needed.  We have
219
     to go back and re-check its dependencies now.  */
220
  if (node->needed)
221
    return true;
222
 
223
  /* Externally visible functions must be output.  The exception is
     COMDAT functions that must be output only when they are needed.

     When not optimizing, also output static functions (see PR24561),
     but don't do so for always_inline functions, functions declared
     inline, or nested functions.  These were optimized out in the
     original implementation and it is unclear whether we want to
     change the behavior here.  */
231
  if (((TREE_PUBLIC (decl)
232
        || (!optimize && !node->local.disregard_inline_limits
233
            && !DECL_DECLARED_INLINE_P (decl)
234
            && !node->origin))
235
      && !flag_whole_program)
236
      && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl))
237
    return true;
238
 
239
  /* Constructors and destructors are reachable from the runtime by
240
     some mechanism.  */
241
  if (DECL_STATIC_CONSTRUCTOR (decl) || DECL_STATIC_DESTRUCTOR (decl))
242
    return true;
243
 
244
  if (flag_unit_at_a_time)
245
    return false;
246
 
247
  /* If not doing unit at a time, then we'll only defer this function
     if it's marked for inlining.  Otherwise we want to emit it now.  */

  /* "extern inline" functions are never output locally.  */
  if (DECL_EXTERNAL (decl))
    return false;
  /* Nested functions of an extern inline function must not be emitted
     unless we inlined the origin.  */
255
  for (origin = decl_function_context (decl); origin;
256
       origin = decl_function_context (origin))
257
    if (DECL_EXTERNAL (origin))
258
      return false;
259
  /* We want to emit COMDAT functions only when absolutely necessary.  */
260
  if (DECL_COMDAT (decl))
261
    return false;
262
  if (!DECL_INLINE (decl)
263
      || (!node->local.disregard_inline_limits
264
          /* When declared inline, defer even the uninlinable functions.
265
             This allows them to be eliminated when unused.  */
266
          && !DECL_DECLARED_INLINE_P (decl)
267
          && (!node->local.inlinable || !cgraph_default_inline_p (node, NULL))))
268
    return true;
269
 
270
  return false;
271
}
272
 
273
/* Walk the decls we marked as necessary and see if they reference new
274
   variables or functions and add them into the worklists.  */
275
static bool
276
cgraph_varpool_analyze_pending_decls (void)
277
{
278
  bool changed = false;
279
  timevar_push (TV_CGRAPH);
280
 
281
  while (cgraph_varpool_first_unanalyzed_node)
282
    {
283
      tree decl = cgraph_varpool_first_unanalyzed_node->decl;
284
 
285
      cgraph_varpool_first_unanalyzed_node->analyzed = true;
286
 
287
      cgraph_varpool_first_unanalyzed_node = cgraph_varpool_first_unanalyzed_node->next_needed;
288
 
289
      /* Compute the alignment early so function body expanders are
290
         already informed about increased alignment.  */
291
      align_variable (decl, 0);
292
 
293
      if (DECL_INITIAL (decl))
294
        {
295
          visited_nodes = pointer_set_create ();
296
          walk_tree (&DECL_INITIAL (decl), record_reference, NULL, visited_nodes);
297
          pointer_set_destroy (visited_nodes);
298
          visited_nodes = NULL;
299
        }
300
      changed = true;
301
    }
302
  timevar_pop (TV_CGRAPH);
303
  return changed;
304
}
305
 
306
/* Optimization of function bodies might've rendered some variables
   unnecessary, so we want to avoid compiling them.

   This is done by pruning the queue and keeping only the variables that
   really appear needed (i.e. they are either externally visible or
   referenced by a compiled function).  Re-doing the reachability analysis
   on variables brings back the remaining variables referenced by these.  */
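/* Illustrative example, not part of GCC: assuming the dead test below is
   folded away during optimization, TABLE ends up unreferenced by the
   emitted code and this pruning keeps it from being assembled:

       static const int table[4] = { 1, 2, 3, 4 };
       int lookup (int i) { return 0 ? table[i] : i; }  */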
static void
314
cgraph_varpool_remove_unreferenced_decls (void)
315
{
316
  struct cgraph_varpool_node *next, *node = cgraph_varpool_nodes_queue;
317
 
318
  cgraph_varpool_reset_queue ();
319
 
320
  if (errorcount || sorrycount)
321
    return;
322
 
323
  while (node)
324
    {
325
      tree decl = node->decl;
326
      next = node->next_needed;
327
      node->needed = 0;
328
 
329
      if (node->finalized
330
          && ((DECL_ASSEMBLER_NAME_SET_P (decl)
331
               && TREE_SYMBOL_REFERENCED (DECL_ASSEMBLER_NAME (decl)))
332
              || node->force_output
333
              || decide_is_variable_needed (node, decl)
334
              /* ??? Cgraph does not yet rule the world with an iron hand,
335
                 and does not control the emission of debug information.
336
                 After a variable has its DECL_RTL set, we must assume that
337
                 it may be referenced by the debug information, and we can
338
                 no longer elide it.  */
339
              || DECL_RTL_SET_P (decl)))
340
        cgraph_varpool_mark_needed_node (node);
341
 
342
      node = next;
343
    }
344
  /* Make sure we mark alias targets as used targets.  */
345
  finish_aliases_1 ();
346
  cgraph_varpool_analyze_pending_decls ();
347
}
348
 
349
 
350
/* When not doing unit-at-a-time, output all functions enqueued.
   Return true when such functions were found.  */
352
 
353
bool
354
cgraph_assemble_pending_functions (void)
355
{
356
  bool output = false;
357
 
358
  if (flag_unit_at_a_time)
359
    return false;
360
 
361
  cgraph_output_pending_asms ();
362
 
363
  while (cgraph_nodes_queue)
364
    {
365
      struct cgraph_node *n = cgraph_nodes_queue;
366
 
367
      cgraph_nodes_queue = cgraph_nodes_queue->next_needed;
368
      n->next_needed = NULL;
369
      if (!n->global.inlined_to
370
          && !n->alias
371
          && !DECL_EXTERNAL (n->decl))
372
        {
373
          cgraph_expand_function (n);
374
          output = true;
375
        }
376
    }
377
 
378
  /* Process CGRAPH_EXPAND_QUEUE; these are functions created during
     the expansion process.  Note that this queue may grow as it's
     being processed, as the new functions may generate new ones.  */
381
  while (cgraph_expand_queue)
382
    {
383
      struct cgraph_node *n = cgraph_expand_queue;
384
      cgraph_expand_queue = cgraph_expand_queue->next_needed;
385
      n->next_needed = NULL;
386
      cgraph_finalize_function (n->decl, false);
387
      output = true;
388
    }
389
 
390
  return output;
391
}
392
 
393
 
394
/* As a GCC extension we allow redefinition of a function.  The
   semantics when the two bodies differ are not well defined.
   We replace the old body with the new body, so in unit-at-a-time mode
   we always use the new body, while in normal mode we may end up with
   the old body inlined into some functions and the new body expanded
   and inlined in others.

   ??? It may make more sense to use one body for inlining and the other
   body for expanding the function, but this is difficult to do.  */

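/* Illustrative example, not part of GCC, of the redefinition described
   above (GNU89 "extern inline" semantics):

       extern inline int twice (int x) { return 2 * x; }   first body
       int twice (int x) { return x + x; }                 redefinition

   After the second definition the new body replaces the old one here; in
   non-unit-at-a-time mode callers compiled before that point may already
   have inlined the first body.  */
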
static void
405
cgraph_reset_node (struct cgraph_node *node)
406
{
407
  /* If node->output is set, then this is a unit-at-a-time compilation
408
     and we have already begun whole-unit analysis.  This is *not*
409
     testing for whether we've already emitted the function.  That
410
     case can be sort-of legitimately seen with real function
411
     redefinition errors.  I would argue that the front end should
412
     never present us with such a case, but don't enforce that for now.  */
413
  gcc_assert (!node->output);
414
 
415
  /* Reset our data structures so we can analyze the function again.  */
416
  memset (&node->local, 0, sizeof (node->local));
417
  memset (&node->global, 0, sizeof (node->global));
418
  memset (&node->rtl, 0, sizeof (node->rtl));
419
  node->analyzed = false;
420
  node->local.redefined_extern_inline = true;
421
  node->local.finalized = false;
422
 
423
  if (!flag_unit_at_a_time)
424
    {
425
      struct cgraph_node *n, *next;
426
 
427
      for (n = cgraph_nodes; n; n = next)
428
        {
429
          next = n->next;
430
          if (n->global.inlined_to == node)
431
            cgraph_remove_node (n);
432
        }
433
    }
434
 
435
  cgraph_node_remove_callees (node);
436
 
437
  /* We may need to re-queue the node for assembling in case
     we already processed it and ignored it as not needed.  */
439
  if (node->reachable && !flag_unit_at_a_time)
440
    {
441
      struct cgraph_node *n;
442
 
443
      for (n = cgraph_nodes_queue; n; n = n->next_needed)
444
        if (n == node)
445
          break;
446
      if (!n)
447
        node->reachable = 0;
448
    }
449
}
450
 
451
static void
452
cgraph_lower_function (struct cgraph_node *node)
453
{
454
  if (node->lowered)
455
    return;
456
  tree_lowering_passes (node->decl);
457
  node->lowered = true;
458
}
459
 
460
/* DECL has been parsed.  Take it, queue it, compile it at the whim of the
461
   logic in effect.  If NESTED is true, then our caller cannot stand to have
462
   the garbage collector run at the moment.  We would need to either create
463
   a new GC context, or just not compile right now.  */
464
 
465
void
466
cgraph_finalize_function (tree decl, bool nested)
467
{
468
  struct cgraph_node *node = cgraph_node (decl);
469
 
470
  if (node->local.finalized)
471
    cgraph_reset_node (node);
472
 
473
  notice_global_symbol (decl);
474
  node->decl = decl;
475
  node->local.finalized = true;
476
  node->lowered = DECL_STRUCT_FUNCTION (decl)->cfg != NULL;
477
  if (node->nested)
478
    lower_nested_functions (decl);
479
  gcc_assert (!node->nested);
480
 
481
  /* If not unit at a time, then we need to create the call graph
482
     now, so that called functions can be queued and emitted now.  */
483
  if (!flag_unit_at_a_time)
484
    {
485
      cgraph_analyze_function (node);
486
      cgraph_decide_inlining_incrementally (node, false);
487
    }
488
 
489
  if (decide_is_function_needed (node, decl))
490
    cgraph_mark_needed_node (node);
491
 
492
  /* Since we reclaim unreachable nodes at the end of every language
493
     level unit, we need to be conservative about possible entry points
494
     there.  */
495
  if ((TREE_PUBLIC (decl) && !DECL_COMDAT (decl) && !DECL_EXTERNAL (decl)))
496
    cgraph_mark_reachable_node (node);
497
 
498
  /* If not unit at a time, go ahead and emit everything we've found
499
     to be reachable at this time.  */
500
  if (!nested)
501
    {
502
      if (!cgraph_assemble_pending_functions ())
503
        ggc_collect ();
504
    }
505
 
506
  /* If we've not yet emitted decl, tell the debug info about it.  */
507
  if (!TREE_ASM_WRITTEN (decl))
508
    (*debug_hooks->deferred_inline_function) (decl);
509
 
510
  /* Possibly warn about unused parameters.  */
511
  if (warn_unused_parameter)
512
    do_warn_unused_parameter (decl);
513
}
514
 
515
/* Walk the tree rooted at *TP and record all references to variables
   and functions.  Called via walk_tree.  */
516
static tree
517
record_reference (tree *tp, int *walk_subtrees, void *data)
518
{
519
  tree t = *tp;
520
 
521
  switch (TREE_CODE (t))
522
    {
523
    case VAR_DECL:
524
      /* ??? Really, we should mark this decl as *potentially* referenced
525
         by this function and re-examine whether the decl is actually used
526
         after rtl has been generated.  */
527
      if (TREE_STATIC (t) || DECL_EXTERNAL (t))
528
        {
529
          cgraph_varpool_mark_needed_node (cgraph_varpool_node (t));
530
          if (lang_hooks.callgraph.analyze_expr)
531
            return lang_hooks.callgraph.analyze_expr (tp, walk_subtrees,
532
                                                      data);
533
        }
534
      break;
535
 
536
    case FDESC_EXPR:
537
    case ADDR_EXPR:
538
      if (flag_unit_at_a_time)
539
        {
540
          /* Record dereferences to the functions.  This makes the
541
             functions reachable unconditionally.  */
542
          tree decl = TREE_OPERAND (*tp, 0);
543
          if (TREE_CODE (decl) == FUNCTION_DECL)
544
            cgraph_mark_needed_node (cgraph_node (decl));
545
        }
546
      break;
547
 
548
    default:
549
      /* Save some cycles by not walking types and declarations, as we
         won't find anything useful there anyway.  */
551
      if (IS_TYPE_OR_DECL_P (*tp))
552
        {
553
          *walk_subtrees = 0;
554
          break;
555
        }
556
 
557
      if ((unsigned int) TREE_CODE (t) >= LAST_AND_UNUSED_TREE_CODE)
558
        return lang_hooks.callgraph.analyze_expr (tp, walk_subtrees, data);
559
      break;
560
    }
561
 
562
  return NULL;
563
}
564
 
565
/* Create cgraph edges for function calls inside BODY from NODE.  */
566
 
567
static void
568
cgraph_create_edges (struct cgraph_node *node, tree body)
569
{
570
  basic_block bb;
571
 
572
  struct function *this_cfun = DECL_STRUCT_FUNCTION (body);
573
  block_stmt_iterator bsi;
574
  tree step;
575
  visited_nodes = pointer_set_create ();
576
 
577
  /* Reach the trees by walking over the CFG, and note the
578
     enclosing basic-blocks in the call edges.  */
579
  FOR_EACH_BB_FN (bb, this_cfun)
580
    for (bsi = bsi_start (bb); !bsi_end_p (bsi); bsi_next (&bsi))
581
      {
582
        tree stmt = bsi_stmt (bsi);
583
        tree call = get_call_expr_in (stmt);
584
        tree decl;
585
 
586
        if (call && (decl = get_callee_fndecl (call)))
587
          {
588
            cgraph_create_edge (node, cgraph_node (decl), stmt,
589
                                bb->count,
590
                                bb->loop_depth);
591
            walk_tree (&TREE_OPERAND (call, 1),
592
                       record_reference, node, visited_nodes);
593
            if (TREE_CODE (stmt) == MODIFY_EXPR)
594
              walk_tree (&TREE_OPERAND (stmt, 0),
595
                         record_reference, node, visited_nodes);
596
          }
597
        else
598
          walk_tree (bsi_stmt_ptr (bsi), record_reference, node, visited_nodes);
599
      }
600
 
601
  /* Look for initializers of constant variables and private statics.  */
602
  for (step = DECL_STRUCT_FUNCTION (body)->unexpanded_var_list;
603
       step;
604
       step = TREE_CHAIN (step))
605
    {
606
      tree decl = TREE_VALUE (step);
607
      if (TREE_CODE (decl) == VAR_DECL
608
          && (TREE_STATIC (decl) && !DECL_EXTERNAL (decl))
609
          && flag_unit_at_a_time)
610
        cgraph_varpool_finalize_decl (decl);
611
      else if (TREE_CODE (decl) == VAR_DECL && DECL_INITIAL (decl))
612
        walk_tree (&DECL_INITIAL (decl), record_reference, node, visited_nodes);
613
    }
614
 
615
  pointer_set_destroy (visited_nodes);
616
  visited_nodes = NULL;
617
}
618
 
619
/* Give the initial reasons why inlining would fail.  These get either
   NULLified or, usually, overwritten by a more precise reason later.  */
622
static void
623
initialize_inline_failed (struct cgraph_node *node)
624
{
625
  struct cgraph_edge *e;
626
 
627
  for (e = node->callers; e; e = e->next_caller)
628
    {
629
      gcc_assert (!e->callee->global.inlined_to);
630
      gcc_assert (e->inline_failed);
631
      if (node->local.redefined_extern_inline)
632
        e->inline_failed = N_("redefined extern inline functions are not "
633
                           "considered for inlining");
634
      else if (!node->local.inlinable)
635
        e->inline_failed = N_("function not inlinable");
636
      else
637
        e->inline_failed = N_("function not considered for inlining");
638
    }
639
}
640
 
641
/* Rebuild the call edges from the current function after passes that are
   not aware of cgraph updating.  */
643
static unsigned int
644
rebuild_cgraph_edges (void)
645
{
646
  basic_block bb;
647
  struct cgraph_node *node = cgraph_node (current_function_decl);
648
  block_stmt_iterator bsi;
649
 
650
  cgraph_node_remove_callees (node);
651
 
652
  node->count = ENTRY_BLOCK_PTR->count;
653
 
654
  FOR_EACH_BB (bb)
655
    for (bsi = bsi_start (bb); !bsi_end_p (bsi); bsi_next (&bsi))
656
      {
657
        tree stmt = bsi_stmt (bsi);
658
        tree call = get_call_expr_in (stmt);
659
        tree decl;
660
 
661
        if (call && (decl = get_callee_fndecl (call)))
662
          cgraph_create_edge (node, cgraph_node (decl), stmt,
663
                              bb->count,
664
                              bb->loop_depth);
665
      }
666
  initialize_inline_failed (node);
667
  gcc_assert (!node->global.inlined_to);
668
  return 0;
669
}
670
 
671
struct tree_opt_pass pass_rebuild_cgraph_edges =
672
{
673
  NULL,                                 /* name */
674
  NULL,                                 /* gate */
675
  rebuild_cgraph_edges,                 /* execute */
676
  NULL,                                 /* sub */
677
  NULL,                                 /* next */
678
  0,                                     /* static_pass_number */
679
  0,                                     /* tv_id */
680
  PROP_cfg,                             /* properties_required */
681
  0,                                     /* properties_provided */
682
  0,                                     /* properties_destroyed */
683
  0,                                     /* todo_flags_start */
684
  0,                                     /* todo_flags_finish */
685
 
686
};
687
 
688
/* Verify cgraph nodes of given cgraph node.  */
689
void
690
verify_cgraph_node (struct cgraph_node *node)
691
{
692
  struct cgraph_edge *e;
693
  struct cgraph_node *main_clone;
694
  struct function *this_cfun = DECL_STRUCT_FUNCTION (node->decl);
695
  basic_block this_block;
696
  block_stmt_iterator bsi;
697
  bool error_found = false;
698
 
699
  if (errorcount || sorrycount)
700
    return;
701
 
702
  timevar_push (TV_CGRAPH_VERIFY);
703
  for (e = node->callees; e; e = e->next_callee)
704
    if (e->aux)
705
      {
706
        error ("aux field set for edge %s->%s",
707
               cgraph_node_name (e->caller), cgraph_node_name (e->callee));
708
        error_found = true;
709
      }
710
  if (node->count < 0)
711
    {
712
      error ("Execution count is negative");
713
      error_found = true;
714
    }
715
  for (e = node->callers; e; e = e->next_caller)
716
    {
717
      if (e->count < 0)
718
        {
719
          error ("caller edge count is negative");
720
          error_found = true;
721
        }
722
      if (!e->inline_failed)
723
        {
724
          if (node->global.inlined_to
725
              != (e->caller->global.inlined_to
726
                  ? e->caller->global.inlined_to : e->caller))
727
            {
728
              error ("inlined_to pointer is wrong");
729
              error_found = true;
730
            }
731
          if (node->callers->next_caller)
732
            {
733
              error ("multiple inline callers");
734
              error_found = true;
735
            }
736
        }
737
      else
738
        if (node->global.inlined_to)
739
          {
740
            error ("inlined_to pointer set for noninline callers");
741
            error_found = true;
742
          }
743
    }
744
  if (!node->callers && node->global.inlined_to)
745
    {
746
      error ("inlined_to pointer is set but no predecessors found");
747
      error_found = true;
748
    }
749
  if (node->global.inlined_to == node)
750
    {
751
      error ("inlined_to pointer refers to itself");
752
      error_found = true;
753
    }
754
 
755
  for (main_clone = cgraph_node (node->decl); main_clone;
756
       main_clone = main_clone->next_clone)
757
    if (main_clone == node)
758
      break;
759
  if (!cgraph_node (node->decl))
760
    {
761
      error ("node not found in cgraph_hash");
762
      error_found = true;
763
    }
764
 
765
  if (node->analyzed
766
      && DECL_SAVED_TREE (node->decl) && !TREE_ASM_WRITTEN (node->decl)
767
      && (!DECL_EXTERNAL (node->decl) || node->global.inlined_to))
768
    {
769
      if (this_cfun->cfg)
770
        {
771
          /* The nodes we're interested in are never shared, so walk
772
             the tree ignoring duplicates.  */
773
          visited_nodes = pointer_set_create ();
774
          /* Reach the trees by walking over the CFG, and note the
775
             enclosing basic-blocks in the call edges.  */
776
          FOR_EACH_BB_FN (this_block, this_cfun)
777
            for (bsi = bsi_start (this_block); !bsi_end_p (bsi); bsi_next (&bsi))
778
              {
779
                tree stmt = bsi_stmt (bsi);
780
                tree call = get_call_expr_in (stmt);
781
                tree decl;
782
                if (call && (decl = get_callee_fndecl (call)))
783
                  {
784
                    struct cgraph_edge *e = cgraph_edge (node, stmt);
785
                    if (e)
786
                      {
787
                        if (e->aux)
788
                          {
789
                            error ("shared call_stmt:");
790
                            debug_generic_stmt (stmt);
791
                            error_found = true;
792
                          }
793
                        if (e->callee->decl != cgraph_node (decl)->decl
794
                            && e->inline_failed)
795
                          {
796
                            error ("edge points to wrong declaration:");
797
                            debug_tree (e->callee->decl);
798
                            fprintf (stderr," Instead of:");
799
                            debug_tree (decl);
800
                          }
801
                        e->aux = (void *)1;
802
                      }
803
                    else
804
                      {
805
                        error ("missing callgraph edge for call stmt:");
806
                        debug_generic_stmt (stmt);
807
                        error_found = true;
808
                      }
809
                  }
810
              }
811
          pointer_set_destroy (visited_nodes);
812
          visited_nodes = NULL;
813
        }
814
      else
815
        /* No CFG available?!  */
816
        gcc_unreachable ();
817
 
818
      for (e = node->callees; e; e = e->next_callee)
819
        {
820
          if (!e->aux)
821
            {
822
              error ("edge %s->%s has no corresponding call_stmt",
823
                     cgraph_node_name (e->caller),
824
                     cgraph_node_name (e->callee));
825
              debug_generic_stmt (e->call_stmt);
826
              error_found = true;
827
            }
828
          e->aux = 0;
829
        }
830
    }
831
  if (error_found)
832
    {
833
      dump_cgraph_node (stderr, node);
834
      internal_error ("verify_cgraph_node failed");
835
    }
836
  timevar_pop (TV_CGRAPH_VERIFY);
837
}
838
 
839
/* Verify whole cgraph structure.  */
840
void
841
verify_cgraph (void)
842
{
843
  struct cgraph_node *node;
844
 
845
  if (sorrycount || errorcount)
846
    return;
847
 
848
  for (node = cgraph_nodes; node; node = node->next)
849
    verify_cgraph_node (node);
850
}
851
 
852
/* Output one variable, if necessary.  Return whether we output it.  */
853
static bool
854
cgraph_varpool_assemble_decl (struct cgraph_varpool_node *node)
855
{
856
  tree decl = node->decl;
857
 
858
  if (!TREE_ASM_WRITTEN (decl)
859
      && !node->alias
860
      && !DECL_EXTERNAL (decl)
861
      && (TREE_CODE (decl) != VAR_DECL || !DECL_HAS_VALUE_EXPR_P (decl)))
862
    {
863
      assemble_variable (decl, 0, 1, 0);
864
      return TREE_ASM_WRITTEN (decl);
865
    }
866
 
867
  return false;
868
}
869
 
870
/* Output all variables enqueued to be assembled.  */
871
bool
872
cgraph_varpool_assemble_pending_decls (void)
873
{
874
  bool changed = false;
875
 
876
  if (errorcount || sorrycount)
877
    return false;
878
 
879
  /* EH might mark decls as needed during expansion.  This should be safe
     since we don't create references to new functions, but it should not
     be used elsewhere.  */
882
  cgraph_varpool_analyze_pending_decls ();
883
 
884
  while (cgraph_varpool_nodes_queue)
885
    {
886
      struct cgraph_varpool_node *node = cgraph_varpool_nodes_queue;
887
 
888
      cgraph_varpool_nodes_queue = cgraph_varpool_nodes_queue->next_needed;
889
      if (cgraph_varpool_assemble_decl (node))
890
        {
891
          changed = true;
892
          node->next_needed = cgraph_varpool_assembled_nodes_queue;
893
          cgraph_varpool_assembled_nodes_queue = node;
894
          node->finalized = 1;
895
        }
896
      else
897
        node->next_needed = NULL;
898
    }
899
  /* cgraph_varpool_nodes_queue is now empty, clear the pointer to the last
900
     element in the queue.  */
901
  cgraph_varpool_last_needed_node = NULL;
902
  return changed;
903
}
904
/* Output debug information for all variables that have been assembled.  */
905
static void
906
cgraph_varpool_output_debug_info (void)
907
{
908
  timevar_push (TV_SYMOUT);
909
  if (errorcount == 0 && sorrycount == 0)
910
    while (cgraph_varpool_assembled_nodes_queue)
911
      {
912
        struct cgraph_varpool_node *node = cgraph_varpool_assembled_nodes_queue;
913
 
914
        /* Local static variables are never seen by check_global_declarations
915
           so we need to output debug info by hand.  */
916
        if (DECL_CONTEXT (node->decl)
917
            && (TREE_CODE (DECL_CONTEXT (node->decl)) == BLOCK
918
                || TREE_CODE (DECL_CONTEXT (node->decl)) == FUNCTION_DECL)
919
            && errorcount == 0 && sorrycount == 0)
920
             (*debug_hooks->global_decl) (node->decl);
921
        cgraph_varpool_assembled_nodes_queue = node->next_needed;
922
        node->next_needed = 0;
923
      }
924
  timevar_pop (TV_SYMOUT);
925
}
926
 
927
/* Output all asm statements we have stored up to be output.  */
928
 
929
static void
930
cgraph_output_pending_asms (void)
931
{
932
  struct cgraph_asm_node *can;
933
 
934
  if (errorcount || sorrycount)
935
    return;
936
 
937
  for (can = cgraph_asm_nodes; can; can = can->next)
938
    assemble_asm (can->asm_str);
939
  cgraph_asm_nodes = NULL;
940
}
941
 
942
/* Analyze the function scheduled to be output.  */
943
void
944
cgraph_analyze_function (struct cgraph_node *node)
945
{
946
  tree decl = node->decl;
947
 
948
  current_function_decl = decl;
949
  push_cfun (DECL_STRUCT_FUNCTION (decl));
950
  cgraph_lower_function (node);
951
 
952
  /* First kill forward declaration so reverse inlining works properly.  */
953
  cgraph_create_edges (node, decl);
954
 
955
  node->local.inlinable = tree_inlinable_function_p (decl);
956
  if (!flag_unit_at_a_time)
957
    node->local.self_insns = estimate_num_insns (decl);
958
  if (node->local.inlinable)
959
    node->local.disregard_inline_limits
960
      = lang_hooks.tree_inlining.disregard_inline_limits (decl);
961
  initialize_inline_failed (node);
962
  if (flag_really_no_inline && !node->local.disregard_inline_limits)
963
    node->local.inlinable = 0;
964
  /* Inlining characteristics are maintained by the cgraph_mark_inline.  */
965
  node->global.insns = node->local.self_insns;
966
 
967
  node->analyzed = true;
968
  pop_cfun ();
969
  current_function_decl = NULL;
970
}
971
 
972
/* Look for externally_visible and used attributes and mark cgraph nodes
973
   accordingly.
974
 
975
   We cannot mark the nodes at the point the attributes are processed (in
976
   handle_*_attribute) because the copy of the declarations available at that
977
   point may not be canonical.  For example, in:
978
 
979
    void f();
980
    void f() __attribute__((used));
981
 
982
   the declaration we see in handle_used_attribute will be the second
983
   declaration -- but the front end will subsequently merge that declaration
984
   with the original declaration and discard the second declaration.
985
 
986
   Furthermore, we can't mark these nodes in cgraph_finalize_function because:
987
 
988
    void f() {}
989
    void f() __attribute__((externally_visible));
990
 
991
   is valid.
992
 
993
   So, we walk the nodes at the end of the translation unit, applying the
994
   attributes at that point.  */
995
 
996
static void
997
process_function_and_variable_attributes (struct cgraph_node *first,
998
                                          struct cgraph_varpool_node *first_var)
999
{
1000
  struct cgraph_node *node;
1001
  struct cgraph_varpool_node *vnode;
1002
 
1003
  for (node = cgraph_nodes; node != first; node = node->next)
1004
    {
1005
      tree decl = node->decl;
1006
      if (lookup_attribute ("used", DECL_ATTRIBUTES (decl)))
1007
        {
1008
          mark_decl_referenced (decl);
1009
          if (node->local.finalized)
1010
             cgraph_mark_needed_node (node);
1011
        }
1012
      if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
1013
        {
1014
          if (! TREE_PUBLIC (node->decl))
1015
            warning (OPT_Wattributes,
                     "%J%<externally_visible%> attribute has effect only on public objects",
1017
                     node->decl);
1018
          else
1019
            {
1020
              if (node->local.finalized)
1021
                cgraph_mark_needed_node (node);
1022
              node->local.externally_visible = true;
1023
            }
1024
        }
1025
    }
1026
  for (vnode = cgraph_varpool_nodes; vnode != first_var; vnode = vnode->next)
1027
    {
1028
      tree decl = vnode->decl;
1029
      if (lookup_attribute ("used", DECL_ATTRIBUTES (decl)))
1030
        {
1031
          mark_decl_referenced (decl);
1032
          if (vnode->finalized)
1033
            cgraph_varpool_mark_needed_node (vnode);
1034
        }
1035
      if (lookup_attribute ("externally_visible", DECL_ATTRIBUTES (decl)))
1036
        {
1037
          if (! TREE_PUBLIC (vnode->decl))
1038
            warning (OPT_Wattributes,
                     "%J%<externally_visible%> attribute has effect only on public objects",
1040
                     vnode->decl);
1041
          else
1042
            {
1043
              if (vnode->finalized)
1044
                cgraph_varpool_mark_needed_node (vnode);
1045
              vnode->externally_visible = true;
1046
            }
1047
        }
1048
    }
1049
}
1050
 
1051
/* Analyze the whole compilation unit once it is parsed completely.  */
1052
 
1053
void
1054
cgraph_finalize_compilation_unit (void)
1055
{
1056
  struct cgraph_node *node, *next;
1057
  /* Keep track of already processed nodes when called multiple times for
1058
     intermodule optimization.  */
1059
  static struct cgraph_node *first_analyzed;
1060
  struct cgraph_node *first_processed = first_analyzed;
1061
  static struct cgraph_varpool_node *first_analyzed_var;
1062
 
1063
  if (errorcount || sorrycount)
1064
    return;
1065
 
1066
  finish_aliases_1 ();
1067
 
1068
  if (!flag_unit_at_a_time)
1069
    {
1070
      cgraph_output_pending_asms ();
1071
      cgraph_assemble_pending_functions ();
1072
      cgraph_varpool_output_debug_info ();
1073
      return;
1074
    }
1075
 
1076
  if (!quiet_flag)
1077
    {
1078
      fprintf (stderr, "\nAnalyzing compilation unit");
1079
      fflush (stderr);
1080
    }
1081
 
1082
  timevar_push (TV_CGRAPH);
1083
  process_function_and_variable_attributes (first_processed,
1084
                                            first_analyzed_var);
1085
  first_processed = cgraph_nodes;
1086
  first_analyzed_var = cgraph_varpool_nodes;
1087
  cgraph_varpool_analyze_pending_decls ();
1088
  if (cgraph_dump_file)
1089
    {
1090
      fprintf (cgraph_dump_file, "Initial entry points:");
1091
      for (node = cgraph_nodes; node != first_analyzed; node = node->next)
1092
        if (node->needed && DECL_SAVED_TREE (node->decl))
1093
          fprintf (cgraph_dump_file, " %s", cgraph_node_name (node));
1094
      fprintf (cgraph_dump_file, "\n");
1095
    }
1096
 
1097
  /* Propagate reachability flag and lower representation of all reachable
1098
     functions.  In the future, lowering will introduce new functions and
1099
     new entry points on the way (by template instantiation and virtual
1100
     method table generation for instance).  */
1101
  while (cgraph_nodes_queue)
1102
    {
1103
      struct cgraph_edge *edge;
1104
      tree decl = cgraph_nodes_queue->decl;
1105
 
1106
      node = cgraph_nodes_queue;
1107
      cgraph_nodes_queue = cgraph_nodes_queue->next_needed;
1108
      node->next_needed = NULL;
1109
 
1110
      /* ??? It is possible to create an extern inline function and later
         use the weak alias attribute to kill its body.  See
         gcc.c-torture/compile/20011119-1.c  */
1113
      if (!DECL_SAVED_TREE (decl))
1114
        {
1115
          cgraph_reset_node (node);
1116
          continue;
1117
        }
1118
 
1119
      gcc_assert (!node->analyzed && node->reachable);
1120
      gcc_assert (DECL_SAVED_TREE (decl));
1121
 
1122
      cgraph_analyze_function (node);
1123
 
1124
      for (edge = node->callees; edge; edge = edge->next_callee)
1125
        if (!edge->callee->reachable)
1126
          cgraph_mark_reachable_node (edge->callee);
1127
 
1128
      /* We finalize local static variables while constructing the callgraph
         edges.  Process their attributes too.  */
1130
      process_function_and_variable_attributes (first_processed,
1131
                                                first_analyzed_var);
1132
      first_processed = cgraph_nodes;
1133
      first_analyzed_var = cgraph_varpool_nodes;
1134
      cgraph_varpool_analyze_pending_decls ();
1135
    }
1136
 
1137
  /* Collect entry points to the unit.  */
1138
  if (cgraph_dump_file)
1139
    {
1140
      fprintf (cgraph_dump_file, "Unit entry points:");
1141
      for (node = cgraph_nodes; node != first_analyzed; node = node->next)
1142
        if (node->needed && DECL_SAVED_TREE (node->decl))
1143
          fprintf (cgraph_dump_file, " %s", cgraph_node_name (node));
1144
      fprintf (cgraph_dump_file, "\n\nInitial ");
1145
      dump_cgraph (cgraph_dump_file);
1146
    }
1147
 
1148
  if (cgraph_dump_file)
1149
    fprintf (cgraph_dump_file, "\nReclaiming functions:");
1150
 
1151
  for (node = cgraph_nodes; node != first_analyzed; node = next)
1152
    {
1153
      tree decl = node->decl;
1154
      next = node->next;
1155
 
1156
      if (node->local.finalized && !DECL_SAVED_TREE (decl))
1157
        cgraph_reset_node (node);
1158
 
1159
      if (!node->reachable && DECL_SAVED_TREE (decl))
1160
        {
1161
          if (cgraph_dump_file)
1162
            fprintf (cgraph_dump_file, " %s", cgraph_node_name (node));
1163
          cgraph_remove_node (node);
1164
          continue;
1165
        }
1166
      else
1167
        node->next_needed = NULL;
1168
      gcc_assert (!node->local.finalized || DECL_SAVED_TREE (decl));
1169
      gcc_assert (node->analyzed == node->local.finalized);
1170
    }
1171
  if (cgraph_dump_file)
1172
    {
1173
      fprintf (cgraph_dump_file, "\n\nReclaimed ");
1174
      dump_cgraph (cgraph_dump_file);
1175
    }
1176
  first_analyzed = cgraph_nodes;
1177
  ggc_collect ();
1178
  timevar_pop (TV_CGRAPH);
1179
}
1180
/* Figure out what functions we want to assemble.  */
1181
 
1182
static void
1183
cgraph_mark_functions_to_output (void)
1184
{
1185
  struct cgraph_node *node;
1186
 
1187
  for (node = cgraph_nodes; node; node = node->next)
1188
    {
1189
      tree decl = node->decl;
1190
      struct cgraph_edge *e;
1191
 
1192
      gcc_assert (!node->output);
1193
 
1194
      for (e = node->callers; e; e = e->next_caller)
1195
        if (e->inline_failed)
1196
          break;
1197
 
1198
      /* We need to output all local functions that are used and not
1199
         always inlined, as well as those that are reachable from
1200
         outside the current compilation unit.  */
1201
      if (DECL_SAVED_TREE (decl)
1202
          && !node->global.inlined_to
1203
          && (node->needed
1204
              || (e && node->reachable))
1205
          && !TREE_ASM_WRITTEN (decl)
1206
          && !DECL_EXTERNAL (decl))
1207
        node->output = 1;
1208
      else
1209
        {
1210
          /* We should've reclaimed all functions that are not needed.  */
1211
#ifdef ENABLE_CHECKING
1212
          if (!node->global.inlined_to && DECL_SAVED_TREE (decl)
1213
              && !DECL_EXTERNAL (decl))
1214
            {
1215
              dump_cgraph_node (stderr, node);
1216
              internal_error ("failed to reclaim unneeded function");
1217
            }
1218
#endif
1219
          gcc_assert (node->global.inlined_to || !DECL_SAVED_TREE (decl)
1220
                      || DECL_EXTERNAL (decl));
1221
 
1222
        }
1223
 
1224
    }
1225
}
1226
 
1227
/* Expand function specified by NODE.  */
1228
 
1229
static void
1230
cgraph_expand_function (struct cgraph_node *node)
1231
{
1232
  tree decl = node->decl;
1233
 
1234
  /* We ought to not compile any inline clones.  */
1235
  gcc_assert (!node->global.inlined_to);
1236
 
1237
  if (flag_unit_at_a_time)
1238
    announce_function (decl);
1239
 
1240
  cgraph_lower_function (node);
1241
 
1242
  /* Generate RTL for the body of DECL.  */
1243
  lang_hooks.callgraph.expand_function (decl);
1244
 
1245
  /* Make sure that BE didn't give up on compiling.  */
1246
  /* ??? Can happen with nested function of extern inline.  */
1247
  gcc_assert (TREE_ASM_WRITTEN (node->decl));
1248
 
1249
  current_function_decl = NULL;
1250
  if (!cgraph_preserve_function_body_p (node->decl))
1251
    {
1252
      DECL_SAVED_TREE (node->decl) = NULL;
1253
      DECL_STRUCT_FUNCTION (node->decl) = NULL;
1254
      DECL_INITIAL (node->decl) = error_mark_node;
1255
      /* Eliminate all call edges.  This is important so the call_expr no longer
1256
         points to the dead function body.  */
1257
      cgraph_node_remove_callees (node);
1258
    }
1259
 
1260
  cgraph_function_flags_ready = true;
1261
}
1262
 
1263
/* Return true when the call represented by edge E can be inlined;
   *REASON is set to the recorded reason why inlining failed, if any.  */
1264
 
1265
bool
1266
cgraph_inline_p (struct cgraph_edge *e, const char **reason)
1267
{
1268
  *reason = e->inline_failed;
1269
  return !e->inline_failed;
1270
}
1271
 
1272
 
1273
 
1274
/* Expand all functions that must be output.

   Attempt to topologically sort the nodes so that a function is output
   when all the functions it calls have already been assembled, allowing
   data to be propagated across the callgraph.  Use a stack to get a
   smaller distance between a function and its callees (later we may
   choose to use a more sophisticated algorithm for function reordering;
   we will likely want to use subsections to make the output functions
   appear in top-down order).  */
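/* Illustrative example, not part of GCC: for

       static int helper (int x) { return x * 2; }
       int caller (void) { return helper (21); }

   the postorder walk below emits HELPER before CALLER, so whatever was
   learned while expanding HELPER (its final size, for instance) is already
   available when CALLER is expanded.  */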
 
1284
static void
1285
cgraph_expand_all_functions (void)
1286
{
1287
  struct cgraph_node *node;
1288
  struct cgraph_node **order = XCNEWVEC (struct cgraph_node *, cgraph_n_nodes);
1289
  int order_pos = 0, new_order_pos = 0;
1290
  int i;
1291
 
1292
  order_pos = cgraph_postorder (order);
1293
  gcc_assert (order_pos == cgraph_n_nodes);
1294
 
1295
  /* Garbage collector may remove inline clones we eliminate during
1296
     optimization.  So we must be sure to not reference them.  */
1297
  for (i = 0; i < order_pos; i++)
1298
    if (order[i]->output)
1299
      order[new_order_pos++] = order[i];
1300
 
1301
  for (i = new_order_pos - 1; i >= 0; i--)
1302
    {
1303
      node = order[i];
1304
      if (node->output)
1305
        {
1306
          gcc_assert (node->reachable);
1307
          node->output = 0;
1308
          cgraph_expand_function (node);
1309
        }
1310
    }
1311
 
1312
  free (order);
1313
 
1314
  /* Process CGRAPH_EXPAND_QUEUE; these are functions created during
     the expansion process.  Note that this queue may grow as it's
     being processed, as the new functions may generate new ones.  */
1317
  while (cgraph_expand_queue)
1318
    {
1319
      node = cgraph_expand_queue;
1320
      cgraph_expand_queue = cgraph_expand_queue->next_needed;
1321
      node->next_needed = NULL;
1322
      node->output = 0;
1323
      node->lowered = DECL_STRUCT_FUNCTION (node->decl)->cfg != NULL;
1324
      cgraph_expand_function (node);
1325
    }
1326
}
1327
 
1328
/* This is used to sort the node types by the cgraph order number.  */
1329
 
1330
struct cgraph_order_sort
1331
{
1332
  enum { ORDER_UNDEFINED = 0, ORDER_FUNCTION, ORDER_VAR, ORDER_ASM } kind;
1333
  union
1334
  {
1335
    struct cgraph_node *f;
1336
    struct cgraph_varpool_node *v;
1337
    struct cgraph_asm_node *a;
1338
  } u;
1339
};
1340
 
1341
/* Output all functions, variables, and asm statements in the order
   according to their order fields, which is the order in which they
   appeared in the file.  This implements -fno-toplevel-reorder.  In
   this mode we may output functions and variables which don't really
   need to be output.  */
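/* Illustrative example, not part of GCC: with -fno-toplevel-reorder the
   three top-level entities below are emitted in exactly this order, which
   matters when the asm statement switches sections or defines a symbol the
   neighbouring definitions rely on:

       int before_marker;
       asm (".globl marker; marker:");
       int after_marker;                  */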
 
1347
static void
1348
cgraph_output_in_order (void)
1349
{
1350
  int max;
1351
  size_t size;
1352
  struct cgraph_order_sort *nodes;
1353
  int i;
1354
  struct cgraph_node *pf;
1355
  struct cgraph_varpool_node *pv;
1356
  struct cgraph_asm_node *pa;
1357
 
1358
  max = cgraph_order;
1359
  size = max * sizeof (struct cgraph_order_sort);
1360
  nodes = (struct cgraph_order_sort *) alloca (size);
1361
  memset (nodes, 0, size);
1362
 
1363
  cgraph_varpool_analyze_pending_decls ();
1364
 
1365
  for (pf = cgraph_nodes; pf; pf = pf->next)
1366
    {
1367
      if (pf->output)
1368
        {
1369
          i = pf->order;
1370
          gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1371
          nodes[i].kind = ORDER_FUNCTION;
1372
          nodes[i].u.f = pf;
1373
        }
1374
    }
1375
 
1376
  for (pv = cgraph_varpool_nodes_queue; pv; pv = pv->next_needed)
1377
    {
1378
      i = pv->order;
1379
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1380
      nodes[i].kind = ORDER_VAR;
1381
      nodes[i].u.v = pv;
1382
    }
1383
 
1384
  for (pa = cgraph_asm_nodes; pa; pa = pa->next)
1385
    {
1386
      i = pa->order;
1387
      gcc_assert (nodes[i].kind == ORDER_UNDEFINED);
1388
      nodes[i].kind = ORDER_ASM;
1389
      nodes[i].u.a = pa;
1390
    }
1391
 
1392
  for (i = 0; i < max; ++i)
1393
    {
1394
      switch (nodes[i].kind)
1395
        {
1396
        case ORDER_FUNCTION:
1397
          nodes[i].u.f->output = 0;
1398
          cgraph_expand_function (nodes[i].u.f);
1399
          break;
1400
 
1401
        case ORDER_VAR:
1402
          cgraph_varpool_assemble_decl (nodes[i].u.v);
1403
          break;
1404
 
1405
        case ORDER_ASM:
1406
          assemble_asm (nodes[i].u.a->asm_str);
1407
          break;
1408
 
1409
        case ORDER_UNDEFINED:
1410
          break;
1411
 
1412
        default:
1413
          gcc_unreachable ();
1414
        }
1415
    }
1416
 
1417
  cgraph_asm_nodes = NULL;
1418
}
1419
 
1420
/* Mark visibility of all functions.

   A local function is one whose calls can occur only in the current
   compilation unit and all of whose calls are explicit, so we can change
   its calling convention.  We simply mark all static functions whose
   address is not taken as local.

   We also change the TREE_PUBLIC flag of all declarations that are public
   from the language point of view but whose default we want to override,
   via visibilities, for the back end's point of view.  */
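/* Illustrative example, not part of GCC: in

       static int add_direct (int x) { return x + 1; }
       static int add_indirect (int x) { return x + 2; }
       int (*hook) (int) = add_indirect;
       int use (int x) { return add_direct (x) + hook (x); }

   ADD_DIRECT is static and only called directly, so it is marked local and
   its calling convention may be changed; ADD_INDIRECT has its address
   taken, so it is not.  */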
 
1431
static void
1432
cgraph_function_and_variable_visibility (void)
1433
{
1434
  struct cgraph_node *node;
1435
  struct cgraph_varpool_node *vnode;
1436
 
1437
  for (node = cgraph_nodes; node; node = node->next)
1438
    {
1439
      if (node->reachable
1440
          && (DECL_COMDAT (node->decl)
1441
              || (!flag_whole_program
1442
                  && TREE_PUBLIC (node->decl) && !DECL_EXTERNAL (node->decl))))
1443
        node->local.externally_visible = true;
1444
      if (!node->local.externally_visible && node->analyzed
1445
          && !DECL_EXTERNAL (node->decl))
1446
        {
1447
          gcc_assert (flag_whole_program || !TREE_PUBLIC (node->decl));
1448
          TREE_PUBLIC (node->decl) = 0;
1449
        }
1450
      node->local.local = (!node->needed
1451
                           && node->analyzed
1452
                           && !DECL_EXTERNAL (node->decl)
1453
                           && !node->local.externally_visible);
1454
    }
1455
  for (vnode = cgraph_varpool_nodes_queue; vnode; vnode = vnode->next_needed)
1456
    {
1457
      if (vnode->needed
1458
          && !flag_whole_program
1459
          && (DECL_COMDAT (vnode->decl) || TREE_PUBLIC (vnode->decl)))
1460
        vnode->externally_visible = 1;
1461
      if (!vnode->externally_visible)
1462
        {
1463
          gcc_assert (flag_whole_program || !TREE_PUBLIC (vnode->decl));
1464
          TREE_PUBLIC (vnode->decl) = 0;
1465
        }
1466
     gcc_assert (TREE_STATIC (vnode->decl));
1467
    }
1468
 
1469
  /* Because we have to be conservative on the boundaries of source
     level units, it is possible that we marked some functions as
     reachable just because they might be used later via external
     linkage, but after making them local they are really unreachable
     now.  */
1474
  cgraph_remove_unreachable_nodes (true, cgraph_dump_file);
1475
 
1476
  if (cgraph_dump_file)
1477
    {
1478
      fprintf (cgraph_dump_file, "\nMarking local functions:");
1479
      for (node = cgraph_nodes; node; node = node->next)
1480
        if (node->local.local)
1481
          fprintf (cgraph_dump_file, " %s", cgraph_node_name (node));
1482
      fprintf (cgraph_dump_file, "\n\n");
1483
      fprintf (cgraph_dump_file, "\nMarking externally visible functions:");
1484
      for (node = cgraph_nodes; node; node = node->next)
1485
        if (node->local.externally_visible)
1486
          fprintf (cgraph_dump_file, " %s", cgraph_node_name (node));
1487
      fprintf (cgraph_dump_file, "\n\n");
1488
    }
1489
  cgraph_function_flags_ready = true;
1490
}
1491
 
1492
/* Return true when function body of DECL still needs to be kept around
1493
   for later re-use.  */
1494
bool
1495
cgraph_preserve_function_body_p (tree decl)
1496
{
1497
  struct cgraph_node *node;
1498
  if (!cgraph_global_info_ready)
1499
    return (flag_really_no_inline
1500
            ? lang_hooks.tree_inlining.disregard_inline_limits (decl)
1501
            : DECL_INLINE (decl));
1502
  /* Look if there is any clone around.  */
1503
  for (node = cgraph_node (decl); node; node = node->next_clone)
1504
    if (node->global.inlined_to)
1505
      return true;
1506
  return false;
1507
}
1508
 
1509
static void
1510
ipa_passes (void)
1511
{
1512
  cfun = NULL;
1513
  tree_register_cfg_hooks ();
1514
  bitmap_obstack_initialize (NULL);
1515
  execute_ipa_pass_list (all_ipa_passes);
1516
  bitmap_obstack_release (NULL);
1517
}
1518
 
/* Perform simple optimizations based on the callgraph.  */

void
cgraph_optimize (void)
{
  if (errorcount || sorrycount)
    return;

#ifdef ENABLE_CHECKING
  verify_cgraph ();
#endif
  if (!flag_unit_at_a_time)
    {
      cgraph_output_pending_asms ();
      cgraph_varpool_assemble_pending_decls ();
      cgraph_varpool_output_debug_info ();
      return;
    }

  process_pending_assemble_externals ();

  /* The front end may output common variables after the unit has been
     finalized.  It is safe to deal with them here as they are always
     zero initialized.  */
  cgraph_varpool_analyze_pending_decls ();

  timevar_push (TV_CGRAPHOPT);
  if (!quiet_flag)
    fprintf (stderr, "Performing interprocedural optimizations\n");

  cgraph_function_and_variable_visibility ();
  if (cgraph_dump_file)
    {
      fprintf (cgraph_dump_file, "Marked ");
      dump_cgraph (cgraph_dump_file);
    }

  /* Don't run the IPA passes if there were any errors or sorry messages.  */
  if (errorcount == 0 && sorrycount == 0)
    ipa_passes ();

  /* This pass removes bodies of extern inline functions we never inlined.
     Do it late so that other IPA passes see what is really going on.  */
  cgraph_remove_unreachable_nodes (false, dump_file);
  cgraph_increase_alignment ();
  cgraph_global_info_ready = true;
  if (cgraph_dump_file)
    {
      fprintf (cgraph_dump_file, "Optimized ");
      dump_cgraph (cgraph_dump_file);
      dump_varpool (cgraph_dump_file);
    }
  timevar_pop (TV_CGRAPHOPT);

  /* Output everything.  */
  if (!quiet_flag)
    fprintf (stderr, "Assembling functions:\n");
#ifdef ENABLE_CHECKING
  verify_cgraph ();
#endif

  cgraph_mark_functions_to_output ();

  if (!flag_toplevel_reorder)
    cgraph_output_in_order ();
  else
    {
      cgraph_output_pending_asms ();

      cgraph_expand_all_functions ();
      cgraph_varpool_remove_unreferenced_decls ();

      cgraph_varpool_assemble_pending_decls ();
      cgraph_varpool_output_debug_info ();
    }

  if (cgraph_dump_file)
    {
      fprintf (cgraph_dump_file, "\nFinal ");
      dump_cgraph (cgraph_dump_file);
    }
#ifdef ENABLE_CHECKING
  verify_cgraph ();
  /* Double check that all inline clones are gone and that all
     function bodies have been released from memory.  */
  if (flag_unit_at_a_time
      && !(sorrycount || errorcount))
    {
      struct cgraph_node *node;
      bool error_found = false;

      for (node = cgraph_nodes; node; node = node->next)
        if (node->analyzed
            && (node->global.inlined_to
                || DECL_SAVED_TREE (node->decl)))
          {
            error_found = true;
            dump_cgraph_node (stderr, node);
          }
      if (error_found)
        internal_error ("nodes with no released memory found");
    }
#endif
}
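
/* For example, with -fno-toplevel-reorder a unit such as

       int a = 1;
       asm ("# marker");
       int f (void) { return a; }

   is emitted by cgraph_output_in_order () in its original source order,
   whereas the default path above emits pending asms, then all functions,
   and finally the remaining variables in separate batches.  */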
 
/* Increase alignment of global arrays to improve vectorization potential.
   TODO:
   - Consider also structs that have an array field.
   - Use ipa analysis to prune arrays that can't be vectorized?
     This should involve global alignment analysis and in the future also
     array padding.  */

static void
cgraph_increase_alignment (void)
{
  if (flag_section_anchors && flag_tree_vectorize)
    {
      struct cgraph_varpool_node *vnode;

      /* Increase the alignment of all global arrays for vectorization.  */
      for (vnode = cgraph_varpool_nodes_queue;
           vnode;
           vnode = vnode->next_needed)
        {
          tree vectype, decl = vnode->decl;
          unsigned int alignment;

          if (TREE_CODE (TREE_TYPE (decl)) != ARRAY_TYPE)
            continue;
          vectype = get_vectype_for_scalar_type (TREE_TYPE (TREE_TYPE (decl)));
          if (!vectype)
            continue;
          alignment = TYPE_ALIGN (vectype);
          if (DECL_ALIGN (decl) >= alignment)
            continue;

          if (vect_can_force_dr_alignment_p (decl, alignment))
            {
              DECL_ALIGN (decl) = TYPE_ALIGN (vectype);
              DECL_USER_ALIGN (decl) = 1;
              if (cgraph_dump_file)
                {
                  fprintf (cgraph_dump_file, "Increasing alignment of decl: ");
                  print_generic_expr (cgraph_dump_file, decl, TDF_SLIM);
                }
            }
        }
    }
}
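
/* For example, on a target whose vector type for `float' is 16 bytes
   wide, compiling with -ftree-vectorize -fsection-anchors promotes a
   global such as

       float samples[1024];

   from the element type's default alignment to a DECL_ALIGN of 128 bits
   (and sets DECL_USER_ALIGN), so that later vectorization of loops over
   `samples' can rely on aligned accesses.  */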
 
/* Generate and emit a static constructor or destructor.  WHICH must be
   one of 'I' or 'D'.  BODY should be a STATEMENT_LIST containing
   GENERIC statements.  */

void
cgraph_build_static_cdtor (char which, tree body, int priority)
{
  static int counter = 0;
  char which_buf[16];
  tree decl, name, resdecl;

  sprintf (which_buf, "%c_%d", which, counter++);
  name = get_file_function_name_long (which_buf);

  decl = build_decl (FUNCTION_DECL, name,
                     build_function_type (void_type_node, void_list_node));
  current_function_decl = decl;

  resdecl = build_decl (RESULT_DECL, NULL_TREE, void_type_node);
  DECL_ARTIFICIAL (resdecl) = 1;
  DECL_IGNORED_P (resdecl) = 1;
  DECL_RESULT (decl) = resdecl;

  allocate_struct_function (decl);

  TREE_STATIC (decl) = 1;
  TREE_USED (decl) = 1;
  DECL_ARTIFICIAL (decl) = 1;
  DECL_IGNORED_P (decl) = 1;
  DECL_NO_INSTRUMENT_FUNCTION_ENTRY_EXIT (decl) = 1;
  DECL_SAVED_TREE (decl) = body;
  TREE_PUBLIC (decl) = ! targetm.have_ctors_dtors;
  DECL_UNINLINABLE (decl) = 1;

  DECL_INITIAL (decl) = make_node (BLOCK);
  TREE_USED (DECL_INITIAL (decl)) = 1;

  DECL_SOURCE_LOCATION (decl) = input_location;
  cfun->function_end_locus = input_location;

  switch (which)
    {
    case 'I':
      DECL_STATIC_CONSTRUCTOR (decl) = 1;
      break;
    case 'D':
      DECL_STATIC_DESTRUCTOR (decl) = 1;
      break;
    default:
      gcc_unreachable ();
    }

  gimplify_function_tree (decl);

  /* ??? We will get called LATE in the compilation process.  */
  if (cgraph_global_info_ready)
    {
      tree_lowering_passes (decl);
      tree_rest_of_compilation (decl);
    }
  else
    cgraph_finalize_function (decl, 0);

  if (targetm.have_ctors_dtors)
    {
      void (*fn) (rtx, int);

      if (which == 'I')
        fn = targetm.asm_out.constructor;
      else
        fn = targetm.asm_out.destructor;
      fn (XEXP (DECL_RTL (decl), 0), priority);
    }
}
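
/* A minimal usage sketch: build a one-statement GENERIC body that calls an
   already-created FUNCTION_DECL (the hypothetical `my_init_fn' below) and
   emit it as a static constructor with the default priority; the helper
   name is illustrative only.  */
#if 0
static void
emit_registration_ctor (tree my_init_fn)
{
  tree body = alloc_stmt_list ();

  /* Append `my_init_fn ();' to the statement list...  */
  append_to_statement_list (build_function_call_expr (my_init_fn, NULL_TREE),
                            &body);
  /* ... and let the callgraph code wrap it into a constructor.  */
  cgraph_build_static_cdtor ('I', body, DEFAULT_INIT_PRIORITY);
}
#endif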
 
void
init_cgraph (void)
{
  cgraph_dump_file = dump_begin (TDI_cgraph, NULL);
}
 
/* The edges representing the callers of the NEW_VERSION node were
   fixed up by cgraph_function_versioning (); now the call expressions
   in their respective statements must be updated to call NEW_VERSION.  */

static void
update_call_expr (struct cgraph_node *new_version)
{
  struct cgraph_edge *e;

  gcc_assert (new_version);
  for (e = new_version->callers; e; e = e->next_caller)
    /* Update the call expression on each caller edge: operand 0 of the
       CALL_EXPR is an ADDR_EXPR whose operand 0 is the callee
       FUNCTION_DECL, so point that at the new version's decl.  */
    TREE_OPERAND (TREE_OPERAND (get_call_expr_in (e->call_stmt), 0), 0) = new_version->decl;
}
 
/* Create a new cgraph node which is the new version of
   OLD_VERSION node.  REDIRECT_CALLERS holds the caller
   edges which should be redirected to point to
   NEW_VERSION.  All the callee edges of OLD_VERSION
   are cloned to the new version node.  Return the new
   version node.  */

static struct cgraph_node *
cgraph_copy_node_for_versioning (struct cgraph_node *old_version,
                                 tree new_decl,
                                 VEC(cgraph_edge_p,heap) *redirect_callers)
{
  struct cgraph_node *new_version;
  struct cgraph_edge *e, *new_e;
  struct cgraph_edge *next_callee;
  unsigned i;

  gcc_assert (old_version);

  new_version = cgraph_node (new_decl);

  new_version->analyzed = true;
  new_version->local = old_version->local;
  new_version->global = old_version->global;
  new_version->rtl = old_version->rtl;
  new_version->reachable = true;
  new_version->count = old_version->count;

  /* Clone the old node callees.  Recursive calls are
     also cloned.  */
  for (e = old_version->callees; e; e = e->next_callee)
    {
      new_e = cgraph_clone_edge (e, new_version, e->call_stmt, 0, e->loop_nest, true);
      new_e->count = e->count;
    }
  /* Fix recursive calls.
     If OLD_VERSION has a recursive call after the
     previous edge cloning, the new version will have an edge
     pointing to the old version, which is wrong;
     redirect it to point to the new version.  */
  for (e = new_version->callees; e; e = next_callee)
    {
      next_callee = e->next_callee;
      if (e->callee == old_version)
        cgraph_redirect_edge_callee (e, new_version);

      if (!next_callee)
        break;
    }
  for (i = 0; VEC_iterate (cgraph_edge_p, redirect_callers, i, e); i++)
    {
      /* Redirect calls to the old version node to point to its new
         version.  */
      cgraph_redirect_edge_callee (e, new_version);
    }

  return new_version;
}
 
/* Perform function versioning.
   Function versioning includes copying of the tree and
   a callgraph update (creating a new cgraph node and updating
   its callees and callers).

   REDIRECT_CALLERS is a vector of the caller edges to be redirected
   to the new version.

   TREE_MAP is a mapping of tree nodes we want to replace with
   new ones (according to results of prior analysis).
   OLD_VERSION_NODE is the node that is versioned.
   It returns the new version's cgraph node.  */

struct cgraph_node *
cgraph_function_versioning (struct cgraph_node *old_version_node,
                            VEC(cgraph_edge_p,heap) *redirect_callers,
                            varray_type tree_map)
{
  tree old_decl = old_version_node->decl;
  struct cgraph_node *new_version_node = NULL;
  tree new_decl;

  if (!tree_versionable_function_p (old_decl))
    return NULL;

  /* Make a new FUNCTION_DECL tree node for the
     new version.  */
  new_decl = copy_node (old_decl);

  /* Create the new version's call-graph node
     and update the edges of the new node.  */
  new_version_node =
    cgraph_copy_node_for_versioning (old_version_node, new_decl,
                                     redirect_callers);

  /* Copy the OLD_VERSION_NODE function tree to the new version.  */
  tree_function_versioning (old_decl, new_decl, tree_map, false);
  /* Update the call_expr on the edges to call the new version node.  */
  update_call_expr (new_version_node);

  /* Update the new version's properties.
     Make the new version visible only within this translation unit.
     ??? We cannot use COMDAT linkage because there is no
     ABI support for this.  */
  DECL_EXTERNAL (new_version_node->decl) = 0;
  DECL_ONE_ONLY (new_version_node->decl) = 0;
  TREE_PUBLIC (new_version_node->decl) = 0;
  DECL_COMDAT (new_version_node->decl) = 0;
  new_version_node->local.externally_visible = 0;
  new_version_node->local.local = 1;
  new_version_node->lowered = true;
  return new_version_node;
}
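
/* A minimal usage sketch: make a private copy of NODE and redirect all of
   its current callers to that copy, passing a NULL TREE_MAP so the body is
   copied unchanged.  The helper name is illustrative only.  */
#if 0
static struct cgraph_node *
version_for_all_callers (struct cgraph_node *node)
{
  VEC(cgraph_edge_p,heap) *redirect_callers = NULL;
  struct cgraph_edge *e;
  struct cgraph_node *copy;

  /* Collect the caller edges first; cgraph_function_versioning will
     redirect each of them to the new node.  */
  for (e = node->callers; e; e = e->next_caller)
    VEC_safe_push (cgraph_edge_p, heap, redirect_callers, e);

  /* May return NULL if NODE is not versionable.  */
  copy = cgraph_function_versioning (node, redirect_callers, NULL);
  VEC_free (cgraph_edge_p, heap, redirect_callers);
  return copy;
}
#endif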
 
/* Produce a separate function body for inline clones so the offline copy
   can be modified without affecting them.  */
struct cgraph_node *
save_inline_function_body (struct cgraph_node *node)
{
  struct cgraph_node *first_clone;

  gcc_assert (node == cgraph_node (node->decl));

  cgraph_lower_function (node);

  /* In non-unit-at-a-time mode we construct a full-fledged clone that we
     never output to the assembly file.  This clone is pointed to by the
     inline_decl of the original function, and the inlining infrastructure
     knows how to deal with it.  */
  if (!flag_unit_at_a_time)
    {
      struct cgraph_edge *e;

      first_clone = cgraph_clone_node (node, node->count, 0, false);
      first_clone->needed = 0;
      first_clone->reachable = 1;
      /* Recursively clone all bodies.  */
      for (e = first_clone->callees; e; e = e->next_callee)
        if (!e->inline_failed)
          cgraph_clone_inlined_nodes (e, true, false);
    }
  else
    first_clone = node->next_clone;

  first_clone->decl = copy_node (node->decl);
  node->next_clone = NULL;
  if (!flag_unit_at_a_time)
    node->inline_decl = first_clone->decl;
  first_clone->prev_clone = NULL;
  cgraph_insert_node_to_hashtable (first_clone);
  gcc_assert (first_clone == cgraph_node (first_clone->decl));

  /* Copy the function body of NODE to the new clone's decl.  */
  tree_function_versioning (node->decl, first_clone->decl, NULL, true);

  DECL_EXTERNAL (first_clone->decl) = 0;
  DECL_ONE_ONLY (first_clone->decl) = 0;
  TREE_PUBLIC (first_clone->decl) = 0;
  DECL_COMDAT (first_clone->decl) = 0;

  for (node = first_clone->next_clone; node; node = node->next_clone)
    node->decl = first_clone->decl;
#ifdef ENABLE_CHECKING
  verify_cgraph_node (first_clone);
#endif
  return first_clone;
}

#include "gt-cgraphunit.h"