/* Conditional constant propagation pass for the GNU compiler.
   Copyright (C) 2000, 2001, 2002, 2003, 2004, 2005, 2006, 2007
   Free Software Foundation, Inc.
   Adapted from original RTL SSA-CCP by Daniel Berlin <dberlin@dberlin.org>
   Adapted to GIMPLE trees by Diego Novillo <dnovillo@redhat.com>

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 3, or (at your option) any
later version.

GCC is distributed in the hope that it will be useful, but WITHOUT
ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING3.  If not see
<http://www.gnu.org/licenses/>.  */

/* Conditional constant propagation (CCP) is based on the SSA
   propagation engine (tree-ssa-propagate.c).  Constant assignments of
   the form VAR = CST are propagated from the assignments into uses of
   VAR, which in turn may generate new constants.  The simulation uses
   a four level lattice to keep track of constant values associated
   with SSA names.  Given an SSA name V_i, it may take one of the
   following values:

        UNINITIALIZED   ->  This is the default starting value.  V_i
                            has not been processed yet.

        UNDEFINED       ->  V_i is a local variable whose definition
                            has not been processed yet.  Therefore we
                            don't yet know if its value is a constant
                            or not.

        CONSTANT        ->  V_i has been found to hold a constant
                            value C.

        VARYING         ->  V_i cannot take a constant value, or if it
                            does, it is not possible to determine it
                            at compile time.

   The core of SSA-CCP is in ccp_visit_stmt and ccp_visit_phi_node:

   1- In ccp_visit_stmt, we are interested in assignments whose RHS
      evaluates into a constant and conditional jumps whose predicate
      evaluates into a boolean true or false.  When an assignment of
      the form V_i = CONST is found, V_i's lattice value is set to
      CONSTANT and CONST is associated with it.  This causes the
      propagation engine to add all the SSA edges coming out of the
      assignment into the worklists, so that statements that use V_i
      can be visited.

      If the statement is a conditional with a constant predicate, we
      mark the outgoing edges as executable or not executable
      depending on the predicate's value.  This is then used when
      visiting PHI nodes to know when a PHI argument can be ignored.


   2- In ccp_visit_phi_node, if all the PHI arguments evaluate to the
      same constant C, then the LHS of the PHI is set to C.  This
      evaluation is known as the "meet operation".  Since one of the
      goals of this evaluation is to optimistically return constant
      values as often as possible, it uses two main short cuts:

      - If an argument is flowing in through a non-executable edge, it
        is ignored.  This is useful in cases like this:

                        if (PRED)
                          a_9 = 3;
                        else
                          a_10 = 100;
                        a_11 = PHI (a_9, a_10)

        If PRED is known to always evaluate to false, then we can
        assume that a_11 will always take its value from a_10, meaning
        that instead of considering it VARYING (a_9 and a_10 have
        different values), we can consider it CONSTANT 100.

      - If an argument has an UNDEFINED value, then it does not affect
        the outcome of the meet operation.  If a variable V_i has an
        UNDEFINED value, it means that either its defining statement
        hasn't been visited yet or V_i has no defining statement, in
        which case the original symbol 'V' is being used
        uninitialized.  Since 'V' is a local variable, the compiler
        may assume any initial value for it.


   After propagation, every variable V_i that ends up with a lattice
   value of CONSTANT will have the associated constant value in the
   array CONST_VAL[i].VALUE.  That is fed into substitute_and_fold for
   final substitution and folding.


   Constant propagation in stores and loads (STORE-CCP)
   ----------------------------------------------------

   While CCP has all the logic to propagate constants in GIMPLE
   registers, it is missing the ability to associate constants with
   stores and loads (i.e., pointer dereferences, structures and
   global/aliased variables).  We don't keep loads and stores in
   SSA, but we do build a factored use-def web for them (in the
   virtual operands).

   For instance, consider the following code fragment:

          struct A a;
          const int B = 42;

          void foo (int i)
          {
            if (i > 10)
              a.a = 42;
            else
              {
                a.b = 21;
                a.a = a.b + 21;
              }

            if (a.a != B)
              never_executed ();
          }

   We should be able to deduce that the predicate 'a.a != B' is always
   false.  To achieve this, we associate constant values with the SSA
   names in the V_MAY_DEF and V_MUST_DEF operands for each store.
   Additionally, since we also glob partial loads/stores with the base
   symbol, we also keep track of the memory reference where the
   constant value was stored (in the MEM_REF field of PROP_VALUE_T).
   For instance,

        # a_5 = V_MAY_DEF <a_4>
        a.a = 2;

        # VUSE <a_5>
        x_3 = a.b;

   In the example above, CCP will associate value '2' with 'a_5', but
   it would be wrong to replace the load from 'a.b' with '2', because
   '2' had been stored into a.a.

   To support STORE-CCP, it is necessary to add a new value to the
   constant propagation lattice.  When evaluating a load for a memory
   reference we can no longer assume a value of UNDEFINED if we
   haven't seen a preceding store to the same memory location.
   Consider, for instance global variables:

        int A;

        foo (int i)
        {
          if (i_3 > 10)
            A_4 = 3;
          # A_5 = PHI (A_4, A_2);

          # VUSE <A_5>
          A.0_6 = A;

          return A.0_6;
        }

   The value of A_2 cannot be assumed to be UNDEFINED, as it may have
   been defined outside of foo.  If we were to assume it UNDEFINED, we
   would erroneously optimize the above into 'return 3;'.  Therefore,
   when doing STORE-CCP, we introduce a fifth lattice value
   (UNKNOWN_VAL), which overrides any other value when computing the
   meet operation in PHI nodes.

   Though STORE-CCP is not too expensive, it does have to do more work
   than regular CCP, so it is only enabled at -O2.  Both regular CCP
   and STORE-CCP use the exact same algorithm.  The only distinction
   is that when doing STORE-CCP, the boolean variable DO_STORE_CCP is
   set to true.  This affects the evaluation of statements and PHI
   nodes.

   References:

     Constant propagation with conditional branches,
     Wegman and Zadeck, ACM TOPLAS 13(2):181-210.

     Building an Optimizing Compiler,
     Robert Morgan, Butterworth-Heinemann, 1998, Section 8.9.

     Advanced Compiler Design and Implementation,
     Steven Muchnick, Morgan Kaufmann, 1997, Section 12.6  */

#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "tm.h"
#include "tree.h"
#include "flags.h"
#include "rtl.h"
#include "tm_p.h"
#include "ggc.h"
#include "basic-block.h"
#include "output.h"
#include "expr.h"
#include "function.h"
#include "diagnostic.h"
#include "timevar.h"
#include "tree-dump.h"
#include "tree-flow.h"
#include "tree-pass.h"
#include "tree-ssa-propagate.h"
#include "langhooks.h"
#include "target.h"
#include "toplev.h"


/* Possible lattice values.  */
typedef enum
{
  UNINITIALIZED = 0,
  UNDEFINED,
  UNKNOWN_VAL,
  CONSTANT,
  VARYING
} ccp_lattice_t;
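
/* Note that the declaration order of these values is significant:
   propagation may only move a name down this list (e.g. UNDEFINED ->
   CONSTANT -> VARYING), and set_lattice_value below enforces this by
   asserting old_val->lattice_val <= new_val.lattice_val on the numeric
   enum values (with one STORE-CCP exception, CONSTANT -> UNKNOWN_VAL,
   documented there).  */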

/* Array of propagated constant values.  After propagation,
   CONST_VAL[I].VALUE holds the constant value for SSA_NAME(I).  If
   the constant is held in an SSA name representing a memory store
   (i.e., a V_MAY_DEF or V_MUST_DEF), CONST_VAL[I].MEM_REF will
   contain the actual memory reference used to store (i.e., the LHS of
   the assignment doing the store).  */
static prop_value_t *const_val;

/* True if we are also propagating constants in stores and loads.  */
static bool do_store_ccp;

/* Dump constant propagation value VAL to file OUTF prefixed by PREFIX.  */

static void
dump_lattice_value (FILE *outf, const char *prefix, prop_value_t val)
{
  switch (val.lattice_val)
    {
    case UNINITIALIZED:
      fprintf (outf, "%sUNINITIALIZED", prefix);
      break;
    case UNDEFINED:
      fprintf (outf, "%sUNDEFINED", prefix);
      break;
    case VARYING:
      fprintf (outf, "%sVARYING", prefix);
      break;
    case UNKNOWN_VAL:
      fprintf (outf, "%sUNKNOWN_VAL", prefix);
      break;
    case CONSTANT:
      fprintf (outf, "%sCONSTANT ", prefix);
      print_generic_expr (outf, val.value, dump_flags);
      break;
    default:
      gcc_unreachable ();
    }
}


/* Print lattice value VAL to stderr.  */

void debug_lattice_value (prop_value_t val);

void
debug_lattice_value (prop_value_t val)
{
  dump_lattice_value (stderr, "", val);
  fprintf (stderr, "\n");
}


/* The regular is_gimple_min_invariant does a shallow test of the object.
   It assumes that full gimplification has happened, or will happen on the
   object.  For a value coming from DECL_INITIAL, this is not true, so we
   have to be more strict ourselves.  */

static bool
ccp_decl_initial_min_invariant (tree t)
{
  if (!is_gimple_min_invariant (t))
    return false;
  if (TREE_CODE (t) == ADDR_EXPR)
    {
      /* Inline and unroll is_gimple_addressable.  */
      while (1)
        {
          t = TREE_OPERAND (t, 0);
          if (is_gimple_id (t))
            return true;
          if (!handled_component_p (t))
            return false;
        }
    }
  return true;
}


/* Compute a default value for variable VAR and store it in the
   CONST_VAL array.  The following rules are used to get default
   values:

   1- Global and static variables that are declared constant are
      considered CONSTANT.

   2- Any other value is considered UNDEFINED.  This is useful when
      considering PHI nodes.  PHI arguments that are undefined do not
      change the constant value of the PHI node, which allows for more
      constants to be propagated.

   3- If SSA_NAME_VALUE is set and it is a constant, its value is
      used.

   4- Variables defined by statements other than assignments and PHI
      nodes are considered VARYING.

   5- Variables that are not GIMPLE registers are considered
      UNKNOWN_VAL, which is really a stronger version of UNDEFINED.
      It's used to avoid the short circuit evaluation implied by
      UNDEFINED in ccp_lattice_meet.  */

static prop_value_t
get_default_value (tree var)
{
  tree sym = SSA_NAME_VAR (var);
  prop_value_t val = { UNINITIALIZED, NULL_TREE, NULL_TREE };

  if (!do_store_ccp && !is_gimple_reg (var))
    {
      /* Short circuit for regular CCP.  We are not interested in any
         non-register when DO_STORE_CCP is false.  */
      val.lattice_val = VARYING;
    }
  else if (SSA_NAME_VALUE (var)
           && is_gimple_min_invariant (SSA_NAME_VALUE (var)))
    {
      val.lattice_val = CONSTANT;
      val.value = SSA_NAME_VALUE (var);
    }
  else if (TREE_STATIC (sym)
           && TREE_READONLY (sym)
           && !MTAG_P (sym)
           && DECL_INITIAL (sym)
           && ccp_decl_initial_min_invariant (DECL_INITIAL (sym)))
    {
      /* Globals and static variables declared 'const' take their
         initial value.  */
      val.lattice_val = CONSTANT;
      val.value = DECL_INITIAL (sym);
      val.mem_ref = sym;
    }
  else
    {
      tree stmt = SSA_NAME_DEF_STMT (var);

      if (IS_EMPTY_STMT (stmt))
        {
          /* Variables defined by an empty statement are those used
             before being initialized.  If VAR is a local variable, we
             can assume initially that it is UNDEFINED.  If we are
             doing STORE-CCP, function arguments and non-register
             variables are initially UNKNOWN_VAL, because we cannot
             discard the value incoming from outside of this function
             (see ccp_lattice_meet for details).  */
          if (is_gimple_reg (sym) && TREE_CODE (sym) != PARM_DECL)
            val.lattice_val = UNDEFINED;
          else if (do_store_ccp)
            val.lattice_val = UNKNOWN_VAL;
          else
            val.lattice_val = VARYING;
        }
      else if (TREE_CODE (stmt) == MODIFY_EXPR
               || TREE_CODE (stmt) == PHI_NODE)
        {
          /* Any other variable defined by an assignment or a PHI node
             is considered UNDEFINED (or UNKNOWN_VAL if VAR is not a
             GIMPLE register).  */
          val.lattice_val = is_gimple_reg (sym) ? UNDEFINED : UNKNOWN_VAL;
        }
      else
        {
          /* Otherwise, VAR will never take on a constant value.  */
          val.lattice_val = VARYING;
        }
    }

  return val;
}


/* Get the constant value associated with variable VAR.  If
   MAY_USE_DEFAULT_P is true, call get_default_value on variables that
   have the lattice value UNINITIALIZED.  */

static prop_value_t *
get_value (tree var, bool may_use_default_p)
{
  prop_value_t *val = &const_val[SSA_NAME_VERSION (var)];
  if (may_use_default_p && val->lattice_val == UNINITIALIZED)
    *val = get_default_value (var);

  return val;
}


/* Set the value for variable VAR to NEW_VAL.  Return true if the new
   value is different from VAR's previous value.  */

static bool
set_lattice_value (tree var, prop_value_t new_val)
{
  prop_value_t *old_val = get_value (var, false);

  /* Lattice transitions must always be monotonically increasing in
     value.  We allow two exceptions:

     1- If *OLD_VAL and NEW_VAL are the same, return false to
        inform the caller that this was a non-transition.

     2- If we are doing store-ccp (i.e., DO_STORE_CCP is true),
        allow CONSTANT->UNKNOWN_VAL.  The UNKNOWN_VAL state is a
        special type of UNDEFINED state which prevents the short
        circuit evaluation of PHI arguments (see ccp_visit_phi_node
        and ccp_lattice_meet).  */
  gcc_assert (old_val->lattice_val <= new_val.lattice_val
              || (old_val->lattice_val == new_val.lattice_val
                  && old_val->value == new_val.value
                  && old_val->mem_ref == new_val.mem_ref)
              || (do_store_ccp
                  && old_val->lattice_val == CONSTANT
                  && new_val.lattice_val == UNKNOWN_VAL));

  if (old_val->lattice_val != new_val.lattice_val)
    {
      if (dump_file && (dump_flags & TDF_DETAILS))
        {
          dump_lattice_value (dump_file, "Lattice value changed to ", new_val);
          fprintf (dump_file, ".  %sdding SSA edges to worklist.\n",
                   new_val.lattice_val != UNDEFINED ? "A" : "Not a");
        }

      *old_val = new_val;

      /* Transitions UNINITIALIZED -> UNDEFINED are never interesting
         for propagation purposes.  In these cases return false to
         avoid doing useless work.  */
      return (new_val.lattice_val != UNDEFINED);
    }

  return false;
}


/* Return the likely CCP lattice value for STMT.

   If STMT has no operands, then return CONSTANT.

   Else if any operands of STMT are undefined, then return UNDEFINED.

   Else if any operands of STMT are constants, then return CONSTANT.

   Else return VARYING.  */

static ccp_lattice_t
likely_value (tree stmt)
{
  bool found_constant;
  stmt_ann_t ann;
  tree use;
  ssa_op_iter iter;

  ann = stmt_ann (stmt);

  /* If the statement has volatile operands, it won't fold to a
     constant value.  */
  if (ann->has_volatile_ops)
    return VARYING;

  /* If we are not doing store-ccp, statements with loads
     and/or stores will never fold into a constant.  */
  if (!do_store_ccp
      && !ZERO_SSA_OPERANDS (stmt, SSA_OP_ALL_VIRTUALS))
    return VARYING;


  /* A CALL_EXPR is assumed to be varying.  NOTE: This may be overly
     conservative, in the presence of const and pure calls.  */
  if (get_call_expr_in (stmt) != NULL_TREE)
    return VARYING;

  /* Anything other than assignments and conditional jumps is not
     interesting for CCP.  */
  if (TREE_CODE (stmt) != MODIFY_EXPR
      && TREE_CODE (stmt) != COND_EXPR
      && TREE_CODE (stmt) != SWITCH_EXPR)
    return VARYING;

  if (is_gimple_min_invariant (get_rhs (stmt)))
    return CONSTANT;

  found_constant = false;
  FOR_EACH_SSA_TREE_OPERAND (use, stmt, iter, SSA_OP_USE|SSA_OP_VUSE)
    {
      prop_value_t *val = get_value (use, true);

      if (val->lattice_val == VARYING)
        return VARYING;

      if (val->lattice_val == UNKNOWN_VAL)
        {
          /* UNKNOWN_VAL is invalid when not doing STORE-CCP.  */
          gcc_assert (do_store_ccp);
          return UNKNOWN_VAL;
        }

      if (val->lattice_val == CONSTANT)
        found_constant = true;
    }

  if (found_constant
      || ZERO_SSA_OPERANDS (stmt, SSA_OP_USE)
      || ZERO_SSA_OPERANDS (stmt, SSA_OP_VUSE))
    return CONSTANT;

  return UNDEFINED;
}
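
/* For example, for 'x_2 = y_3 + 1' the loop above returns CONSTANT as
   soon as y_3 is known to be CONSTANT, letting evaluate_stmt attempt a
   fold; if y_3 is VARYING, the whole statement is immediately VARYING
   and no folding is attempted.  */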


/* Initialize local data structures for CCP.  */

static void
ccp_initialize (void)
{
  basic_block bb;

  const_val = XNEWVEC (prop_value_t, num_ssa_names);
  memset (const_val, 0, num_ssa_names * sizeof (*const_val));

  /* Initialize simulation flags for PHI nodes and statements.  */
  FOR_EACH_BB (bb)
    {
      block_stmt_iterator i;

      for (i = bsi_start (bb); !bsi_end_p (i); bsi_next (&i))
        {
          bool is_varying = false;
          tree stmt = bsi_stmt (i);

          if (likely_value (stmt) == VARYING)
            {
              tree def;
              ssa_op_iter iter;

              /* If the statement will not produce a constant, mark
                 all its outputs VARYING.  */
              FOR_EACH_SSA_TREE_OPERAND (def, stmt, iter, SSA_OP_ALL_DEFS)
                get_value (def, false)->lattice_val = VARYING;

              /* Never mark conditional jumps with DONT_SIMULATE_AGAIN,
                 otherwise the propagator will never add the outgoing
                 control edges.  */
              if (TREE_CODE (stmt) != COND_EXPR
                  && TREE_CODE (stmt) != SWITCH_EXPR)
                is_varying = true;
            }

          DONT_SIMULATE_AGAIN (stmt) = is_varying;
        }
    }

  /* Now process PHI nodes.  */
  FOR_EACH_BB (bb)
    {
      tree phi;

      for (phi = phi_nodes (bb); phi; phi = PHI_CHAIN (phi))
        {
          int i;
          tree arg;
          prop_value_t *val = get_value (PHI_RESULT (phi), false);

          for (i = 0; i < PHI_NUM_ARGS (phi); i++)
            {
              arg = PHI_ARG_DEF (phi, i);

              if (TREE_CODE (arg) == SSA_NAME
                  && get_value (arg, false)->lattice_val == VARYING)
                {
                  val->lattice_val = VARYING;
                  break;
                }
            }

          DONT_SIMULATE_AGAIN (phi) = (val->lattice_val == VARYING);
        }
    }
}


/* Do final substitution of propagated values, cleanup the flowgraph and
   free allocated storage.  */

static void
ccp_finalize (void)
{
  /* Perform substitutions based on the known constant values.  */
  substitute_and_fold (const_val, false);

  free (const_val);
}


/* Compute the meet operator between *VAL1 and *VAL2.  Store the result
   in VAL1.

                any  M UNDEFINED   = any
                any  M UNKNOWN_VAL = UNKNOWN_VAL
                any  M VARYING     = VARYING
                Ci   M Cj          = Ci         if (i == j)
                Ci   M Cj          = VARYING    if (i != j)

   Lattice values UNKNOWN_VAL and UNDEFINED are similar but have
   different semantics at PHI nodes.  Both values imply that we don't
   know whether the variable is constant or not.  However, UNKNOWN_VAL
   values override all others.  For instance, suppose that A is a
   global variable:

                +------+
                |      |
                |     / \
                |    /   \
                |   |  A_1 = 4
                |    \   /
                |     \ /
                | A_3 = PHI (A_2, A_1)
                | ... = A_3
                |    |
                +----+

   If the edge into A_2 is not executable, the first visit to A_3 will
   yield the constant 4.  But the second visit to A_3 will be with A_2
   in state UNKNOWN_VAL.  We can no longer conclude that A_3 is 4
   because A_2 may have been set in another function.  If we had used
   the lattice value UNDEFINED, we would have wrongly concluded
   that A_3 is 4.  */


static void
ccp_lattice_meet (prop_value_t *val1, prop_value_t *val2)
{
  if (val1->lattice_val == UNDEFINED)
    {
      /* UNDEFINED M any = any   */
      *val1 = *val2;
    }
  else if (val2->lattice_val == UNDEFINED)
    {
      /* any M UNDEFINED = any
         Nothing to do.  VAL1 already contains the value we want.  */
      ;
    }
  else if (val1->lattice_val == UNKNOWN_VAL
           || val2->lattice_val == UNKNOWN_VAL)
    {
      /* UNKNOWN_VAL values are invalid if we are not doing STORE-CCP.  */
      gcc_assert (do_store_ccp);

      /* any M UNKNOWN_VAL = UNKNOWN_VAL.  */
      val1->lattice_val = UNKNOWN_VAL;
      val1->value = NULL_TREE;
      val1->mem_ref = NULL_TREE;
    }
  else if (val1->lattice_val == VARYING
           || val2->lattice_val == VARYING)
    {
      /* any M VARYING = VARYING.  */
      val1->lattice_val = VARYING;
      val1->value = NULL_TREE;
      val1->mem_ref = NULL_TREE;
    }
  else if (val1->lattice_val == CONSTANT
           && val2->lattice_val == CONSTANT
           && simple_cst_equal (val1->value, val2->value) == 1
           && (!do_store_ccp
               || (val1->mem_ref && val2->mem_ref
                   && operand_equal_p (val1->mem_ref, val2->mem_ref, 0))))
    {
      /* Ci M Cj = Ci           if (i == j)
         Ci M Cj = VARYING      if (i != j)

         If these two values come from memory stores, make sure that
         they come from the same memory reference.  */
      val1->lattice_val = CONSTANT;
      val1->value = val1->value;
      val1->mem_ref = val1->mem_ref;
    }
  else
    {
      /* Any other combination is VARYING.  */
      val1->lattice_val = VARYING;
      val1->value = NULL_TREE;
      val1->mem_ref = NULL_TREE;
    }
}
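
/* Concretely: CONSTANT 4 M CONSTANT 4 yields CONSTANT 4, whereas
   CONSTANT 4 M CONSTANT 3 falls through to the VARYING case.  Because
   UNDEFINED is the identity of the meet, a PHI argument coming from an
   uninitialized local simply drops out of the result, which implements
   the optimistic assumption described at the top of the file.  */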


/* Loop through the PHI_NODE's parameters for BLOCK and compare their
   lattice values to determine PHI_NODE's lattice value.  The value of a
   PHI node is determined by calling ccp_lattice_meet with all the arguments
   of the PHI node that are incoming via executable edges.  */

static enum ssa_prop_result
ccp_visit_phi_node (tree phi)
{
  int i;
  prop_value_t *old_val, new_val;

  if (dump_file && (dump_flags & TDF_DETAILS))
    {
      fprintf (dump_file, "\nVisiting PHI node: ");
      print_generic_expr (dump_file, phi, dump_flags);
    }

  old_val = get_value (PHI_RESULT (phi), false);
  switch (old_val->lattice_val)
    {
    case VARYING:
      return SSA_PROP_VARYING;

    case CONSTANT:
      new_val = *old_val;
      break;

    case UNKNOWN_VAL:
      /* To avoid the default value of UNKNOWN_VAL overriding
         that of its possible constant arguments, temporarily
         set the PHI node's default lattice value to be
         UNDEFINED.  If the PHI node's old value was UNKNOWN_VAL and
         the new value is UNDEFINED, then we prevent the invalid
         transition by not calling set_lattice_value.  */
      gcc_assert (do_store_ccp);

      /* FALLTHRU  */

    case UNDEFINED:
    case UNINITIALIZED:
      new_val.lattice_val = UNDEFINED;
      new_val.value = NULL_TREE;
      new_val.mem_ref = NULL_TREE;
      break;

    default:
      gcc_unreachable ();
    }

  for (i = 0; i < PHI_NUM_ARGS (phi); i++)
    {
      /* Compute the meet operator over all the PHI arguments flowing
         through executable edges.  */
      edge e = PHI_ARG_EDGE (phi, i);

      if (dump_file && (dump_flags & TDF_DETAILS))
        {
          fprintf (dump_file,
              "\n    Argument #%d (%d -> %d %sexecutable)\n",
              i, e->src->index, e->dest->index,
              (e->flags & EDGE_EXECUTABLE) ? "" : "not ");
        }

      /* If the incoming edge is executable, compute the meet operator for
         the existing value of the PHI node and the current PHI argument.  */
      if (e->flags & EDGE_EXECUTABLE)
        {
          tree arg = PHI_ARG_DEF (phi, i);
          prop_value_t arg_val;

          if (is_gimple_min_invariant (arg))
            {
              arg_val.lattice_val = CONSTANT;
              arg_val.value = arg;
              arg_val.mem_ref = NULL_TREE;
            }
          else
            arg_val = *(get_value (arg, true));

          ccp_lattice_meet (&new_val, &arg_val);

          if (dump_file && (dump_flags & TDF_DETAILS))
            {
              fprintf (dump_file, "\t");
              print_generic_expr (dump_file, arg, dump_flags);
              dump_lattice_value (dump_file, "\tValue: ", arg_val);
              fprintf (dump_file, "\n");
            }

          if (new_val.lattice_val == VARYING)
            break;
        }
    }

  if (dump_file && (dump_flags & TDF_DETAILS))
    {
      dump_lattice_value (dump_file, "\n    PHI node value: ", new_val);
      fprintf (dump_file, "\n\n");
    }

  /* Check for an invalid change from UNKNOWN_VAL to UNDEFINED.  */
  if (do_store_ccp
      && old_val->lattice_val == UNKNOWN_VAL
      && new_val.lattice_val == UNDEFINED)
    return SSA_PROP_NOT_INTERESTING;

  /* Otherwise, make the transition to the new value.  */
  if (set_lattice_value (PHI_RESULT (phi), new_val))
    {
      if (new_val.lattice_val == VARYING)
        return SSA_PROP_VARYING;
      else
        return SSA_PROP_INTERESTING;
    }
  else
    return SSA_PROP_NOT_INTERESTING;
}


/* CCP specific front-end to the non-destructive constant folding
   routines.

   Attempt to simplify the RHS of STMT knowing that one or more
   operands are constants.

   If simplification is possible, return the simplified RHS,
   otherwise return the original RHS.  */

static tree
ccp_fold (tree stmt)
{
  tree rhs = get_rhs (stmt);
  enum tree_code code = TREE_CODE (rhs);
  enum tree_code_class kind = TREE_CODE_CLASS (code);
  tree retval = NULL_TREE;

  if (TREE_CODE (rhs) == SSA_NAME)
    {
      /* If the RHS is an SSA_NAME, return its known constant value,
         if any.  */
      return get_value (rhs, true)->value;
    }
  else if (do_store_ccp && stmt_makes_single_load (stmt))
    {
      /* If the RHS is a memory load, see if the VUSEs associated with
         it are a valid constant for that memory load.  */
      prop_value_t *val = get_value_loaded_by (stmt, const_val);
      if (val && val->mem_ref)
        {
          if (operand_equal_p (val->mem_ref, rhs, 0))
            return val->value;

          /* If RHS is extracting REALPART_EXPR or IMAGPART_EXPR of a
             complex type with a known constant value, return it.  */
          if ((TREE_CODE (rhs) == REALPART_EXPR
               || TREE_CODE (rhs) == IMAGPART_EXPR)
              && operand_equal_p (val->mem_ref, TREE_OPERAND (rhs, 0), 0))
            return fold_build1 (TREE_CODE (rhs), TREE_TYPE (rhs), val->value);
        }
      return NULL_TREE;
    }

  /* Unary operators.  Note that we know the single operand must
     be a constant.  So this should almost always return a
     simplified RHS.  */
  if (kind == tcc_unary)
    {
      /* Handle unary operators which can appear in GIMPLE form.  */
      tree op0 = TREE_OPERAND (rhs, 0);

      /* Simplify the operand down to a constant.  */
      if (TREE_CODE (op0) == SSA_NAME)
        {
          prop_value_t *val = get_value (op0, true);
          if (val->lattice_val == CONSTANT)
            op0 = get_value (op0, true)->value;
        }

      if ((code == NOP_EXPR || code == CONVERT_EXPR)
          && tree_ssa_useless_type_conversion_1 (TREE_TYPE (rhs),
                                                 TREE_TYPE (op0)))
        return op0;
      return fold_unary (code, TREE_TYPE (rhs), op0);
    }

  /* Binary and comparison operators.  We know one or both of the
     operands are constants.  */
  else if (kind == tcc_binary
           || kind == tcc_comparison
           || code == TRUTH_AND_EXPR
           || code == TRUTH_OR_EXPR
           || code == TRUTH_XOR_EXPR)
    {
      /* Handle binary and comparison operators that can appear in
         GIMPLE form.  */
      tree op0 = TREE_OPERAND (rhs, 0);
      tree op1 = TREE_OPERAND (rhs, 1);

      /* Simplify the operands down to constants when appropriate.  */
      if (TREE_CODE (op0) == SSA_NAME)
        {
          prop_value_t *val = get_value (op0, true);
          if (val->lattice_val == CONSTANT)
            op0 = val->value;
        }

      if (TREE_CODE (op1) == SSA_NAME)
        {
          prop_value_t *val = get_value (op1, true);
          if (val->lattice_val == CONSTANT)
            op1 = val->value;
        }

      return fold_binary (code, TREE_TYPE (rhs), op0, op1);
    }

  /* We may be able to fold away calls to builtin functions if their
     arguments are constants.  */
  else if (code == CALL_EXPR
           && TREE_CODE (TREE_OPERAND (rhs, 0)) == ADDR_EXPR
           && (TREE_CODE (TREE_OPERAND (TREE_OPERAND (rhs, 0), 0))
               == FUNCTION_DECL)
           && DECL_BUILT_IN (TREE_OPERAND (TREE_OPERAND (rhs, 0), 0)))
    {
      if (!ZERO_SSA_OPERANDS (stmt, SSA_OP_USE))
        {
          tree *orig, var;
          tree fndecl, arglist;
          size_t i = 0;
          ssa_op_iter iter;
          use_operand_p var_p;

          /* Preserve the original values of every operand.  */
          orig = XNEWVEC (tree, NUM_SSA_OPERANDS (stmt, SSA_OP_USE));
          FOR_EACH_SSA_TREE_OPERAND (var, stmt, iter, SSA_OP_USE)
            orig[i++] = var;

          /* Substitute operands with their values and try to fold.  */
          replace_uses_in (stmt, NULL, const_val);
          fndecl = get_callee_fndecl (rhs);
          arglist = TREE_OPERAND (rhs, 1);
          retval = fold_builtin (fndecl, arglist, false);

          /* Restore operands to their original form.  */
          i = 0;
          FOR_EACH_SSA_USE_OPERAND (var_p, stmt, iter, SSA_OP_USE)
            SET_USE (var_p, orig[i++]);
          free (orig);
        }
    }
  else
    return rhs;

  /* If we got a simplified form, see if we need to convert its type.  */
  if (retval)
    return fold_convert (TREE_TYPE (rhs), retval);

  /* No simplification was possible.  */
  return rhs;
}


/* Return the tree representing the element referenced by T if T is an
   ARRAY_REF or COMPONENT_REF into constant aggregates.  Return
   NULL_TREE otherwise.  */

static tree
fold_const_aggregate_ref (tree t)
{
  prop_value_t *value;
  tree base, ctor, idx, field;
  unsigned HOST_WIDE_INT cnt;
  tree cfield, cval;

  switch (TREE_CODE (t))
    {
    case ARRAY_REF:
      /* Get a CONSTRUCTOR.  If BASE is a VAR_DECL, get its
         DECL_INITIAL.  If BASE is a nested reference into another
         ARRAY_REF or COMPONENT_REF, make a recursive call to resolve
         the inner reference.  */
      base = TREE_OPERAND (t, 0);
      switch (TREE_CODE (base))
        {
        case VAR_DECL:
          if (!TREE_READONLY (base)
              || TREE_CODE (TREE_TYPE (base)) != ARRAY_TYPE
              || !targetm.binds_local_p (base))
            return NULL_TREE;

          ctor = DECL_INITIAL (base);
          break;

        case ARRAY_REF:
        case COMPONENT_REF:
          ctor = fold_const_aggregate_ref (base);
          break;

        default:
          return NULL_TREE;
        }

      if (ctor == NULL_TREE
          || (TREE_CODE (ctor) != CONSTRUCTOR
              && TREE_CODE (ctor) != STRING_CST)
          || !TREE_STATIC (ctor))
        return NULL_TREE;

      /* Get the index.  If we have an SSA_NAME, try to resolve it
         with the current lattice value for the SSA_NAME.  */
      idx = TREE_OPERAND (t, 1);
      switch (TREE_CODE (idx))
        {
        case SSA_NAME:
          if ((value = get_value (idx, true))
              && value->lattice_val == CONSTANT
              && TREE_CODE (value->value) == INTEGER_CST)
            idx = value->value;
          else
            return NULL_TREE;
          break;

        case INTEGER_CST:
          break;

        default:
          return NULL_TREE;
        }

      /* Fold read from constant string.  */
      if (TREE_CODE (ctor) == STRING_CST)
        {
          if ((TYPE_MODE (TREE_TYPE (t))
               == TYPE_MODE (TREE_TYPE (TREE_TYPE (ctor))))
              && (GET_MODE_CLASS (TYPE_MODE (TREE_TYPE (TREE_TYPE (ctor))))
                  == MODE_INT)
              && GET_MODE_SIZE (TYPE_MODE (TREE_TYPE (TREE_TYPE (ctor)))) == 1
              && compare_tree_int (idx, TREE_STRING_LENGTH (ctor)) < 0)
            return build_int_cst (TREE_TYPE (t), (TREE_STRING_POINTER (ctor)
                                                  [TREE_INT_CST_LOW (idx)]));
          return NULL_TREE;
        }

      /* Whoo-hoo!  I'll fold ya baby.  Yeah!  */
      FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (ctor), cnt, cfield, cval)
        if (tree_int_cst_equal (cfield, idx))
          return cval;
      break;

    case COMPONENT_REF:
      /* Get a CONSTRUCTOR.  If BASE is a VAR_DECL, get its
         DECL_INITIAL.  If BASE is a nested reference into another
         ARRAY_REF or COMPONENT_REF, make a recursive call to resolve
         the inner reference.  */
      base = TREE_OPERAND (t, 0);
      switch (TREE_CODE (base))
        {
        case VAR_DECL:
          if (!TREE_READONLY (base)
              || TREE_CODE (TREE_TYPE (base)) != RECORD_TYPE
              || !targetm.binds_local_p (base))
            return NULL_TREE;

          ctor = DECL_INITIAL (base);
          break;

        case ARRAY_REF:
        case COMPONENT_REF:
          ctor = fold_const_aggregate_ref (base);
          break;

        default:
          return NULL_TREE;
        }

      if (ctor == NULL_TREE
          || TREE_CODE (ctor) != CONSTRUCTOR
          || !TREE_STATIC (ctor))
        return NULL_TREE;

      field = TREE_OPERAND (t, 1);

      FOR_EACH_CONSTRUCTOR_ELT (CONSTRUCTOR_ELTS (ctor), cnt, cfield, cval)
        if (cfield == field
            /* FIXME: Handle bit-fields.  */
            && ! DECL_BIT_FIELD (cfield))
          return cval;
      break;

    case REALPART_EXPR:
    case IMAGPART_EXPR:
      {
        tree c = fold_const_aggregate_ref (TREE_OPERAND (t, 0));
        if (c && TREE_CODE (c) == COMPLEX_CST)
          return fold_build1 (TREE_CODE (t), TREE_TYPE (t), c);
        break;
      }

    default:
      break;
    }

  return NULL_TREE;
}
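
/* For instance, given a locally-bound 'static const int tbl[3] =
   { 10, 20, 30 };', a load 'tbl[i_5]' where i_5 has lattice value
   CONSTANT 1 resolves the index through the lattice and returns the
   constructor element 20; likewise, with 'static const char s[] =
   "abc";' a load of 's[1]' is folded to the character constant 'b'
   by the STRING_CST path above.  */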

/* Evaluate statement STMT.  */

static prop_value_t
evaluate_stmt (tree stmt)
{
  prop_value_t val;
  tree simplified = NULL_TREE;
  ccp_lattice_t likelyvalue = likely_value (stmt);
  bool is_constant;

  val.mem_ref = NULL_TREE;

  fold_defer_overflow_warnings ();

  /* If the statement is likely to have a CONSTANT result, then try
     to fold the statement to determine the constant value.  */
  if (likelyvalue == CONSTANT)
    simplified = ccp_fold (stmt);
  /* If the statement is likely to have a VARYING result, then do not
     bother folding the statement.  */
  if (likelyvalue == VARYING)
    simplified = get_rhs (stmt);
  /* If the statement is an ARRAY_REF or COMPONENT_REF into constant
     aggregates, extract the referenced constant.  Otherwise the
     statement is likely to have an UNDEFINED value, and there will be
     nothing to do.  Note that fold_const_aggregate_ref returns
     NULL_TREE if the first case does not match.  */
  else if (!simplified)
    simplified = fold_const_aggregate_ref (get_rhs (stmt));

  is_constant = simplified && is_gimple_min_invariant (simplified);

  fold_undefer_overflow_warnings (is_constant, stmt, 0);

  if (is_constant)
    {
      /* The statement produced a constant value.  */
      val.lattice_val = CONSTANT;
      val.value = simplified;
    }
  else
    {
      /* The statement produced a nonconstant value.  If the statement
         had UNDEFINED operands, then the result of the statement
         should be UNDEFINED.  Otherwise, the statement is VARYING.  */
      if (likelyvalue == UNDEFINED || likelyvalue == UNKNOWN_VAL)
        val.lattice_val = likelyvalue;
      else
        val.lattice_val = VARYING;

      val.value = NULL_TREE;
    }

  return val;
}


/* Visit the assignment statement STMT.  Set the value of its LHS to the
   value computed by the RHS and store LHS in *OUTPUT_P.  If STMT
   creates virtual definitions, set the value of each new name to that
   of the RHS (if we can derive a constant out of the RHS).  */

static enum ssa_prop_result
visit_assignment (tree stmt, tree *output_p)
{
  prop_value_t val;
  tree lhs, rhs;
  enum ssa_prop_result retval;

  lhs = TREE_OPERAND (stmt, 0);
  rhs = TREE_OPERAND (stmt, 1);

  if (TREE_CODE (rhs) == SSA_NAME)
    {
      /* For a simple copy operation, we copy the lattice values.  */
      prop_value_t *nval = get_value (rhs, true);
      val = *nval;
    }
  else if (do_store_ccp && stmt_makes_single_load (stmt))
    {
      /* Same as above, but the RHS is not a gimple register and yet
         has a known VUSE.  If STMT is loading from the same memory
         location that created the SSA_NAMEs for the virtual operands,
         we can propagate the value on the RHS.  */
      prop_value_t *nval = get_value_loaded_by (stmt, const_val);

      if (nval && nval->mem_ref
          && operand_equal_p (nval->mem_ref, rhs, 0))
        val = *nval;
      else
        val = evaluate_stmt (stmt);
    }
  else
    /* Evaluate the statement.  */
    val = evaluate_stmt (stmt);

  /* If the original LHS was a VIEW_CONVERT_EXPR, modify the constant
     value to be a VIEW_CONVERT_EXPR of the old constant value.

     ??? Also, if this was a definition of a bitfield, we need to widen
     the constant value into the type of the destination variable.  This
     should not be necessary if GCC represented bitfields properly.  */
  {
    tree orig_lhs = TREE_OPERAND (stmt, 0);

    if (TREE_CODE (orig_lhs) == VIEW_CONVERT_EXPR
        && val.lattice_val == CONSTANT)
      {
        tree w = fold_unary (VIEW_CONVERT_EXPR,
                             TREE_TYPE (TREE_OPERAND (orig_lhs, 0)),
                             val.value);

        orig_lhs = TREE_OPERAND (orig_lhs, 0);
        if (w && is_gimple_min_invariant (w))
          val.value = w;
        else
          {
            val.lattice_val = VARYING;
            val.value = NULL;
          }
      }

    if (val.lattice_val == CONSTANT
        && TREE_CODE (orig_lhs) == COMPONENT_REF
        && DECL_BIT_FIELD (TREE_OPERAND (orig_lhs, 1)))
      {
        tree w = widen_bitfield (val.value, TREE_OPERAND (orig_lhs, 1),
                                 orig_lhs);

        if (w && is_gimple_min_invariant (w))
          val.value = w;
        else
          {
            val.lattice_val = VARYING;
            val.value = NULL_TREE;
            val.mem_ref = NULL_TREE;
          }
      }
  }

  retval = SSA_PROP_NOT_INTERESTING;

  /* Set the lattice value of the statement's output.  */
  if (TREE_CODE (lhs) == SSA_NAME)
    {
      /* If STMT is an assignment to an SSA_NAME, we only have one
         value to set.  */
      if (set_lattice_value (lhs, val))
        {
          *output_p = lhs;
          if (val.lattice_val == VARYING)
            retval = SSA_PROP_VARYING;
          else
            retval = SSA_PROP_INTERESTING;
        }
    }
  else if (do_store_ccp && stmt_makes_single_store (stmt))
    {
      /* Otherwise, set the names in V_MAY_DEF/V_MUST_DEF operands
         to the new constant value and mark the LHS as the memory
         reference associated with VAL.  */
      ssa_op_iter i;
      tree vdef;
      bool changed;

      /* Stores cannot take on an UNDEFINED value.  */
      if (val.lattice_val == UNDEFINED)
        val.lattice_val = UNKNOWN_VAL;

      /* Mark VAL as stored in the LHS of this assignment.  */
      val.mem_ref = lhs;

      /* Set the value of every VDEF to VAL.  */
      changed = false;
      FOR_EACH_SSA_TREE_OPERAND (vdef, stmt, i, SSA_OP_VIRTUAL_DEFS)
        changed |= set_lattice_value (vdef, val);

      /* Note that for propagation purposes, we are only interested in
         visiting statements that load the exact same memory reference
         stored here.  Those statements will have the exact same list
         of virtual uses, so it is enough to set the output of this
         statement to be its first virtual definition.  */
      *output_p = first_vdef (stmt);
      if (changed)
        {
          if (val.lattice_val == VARYING)
            retval = SSA_PROP_VARYING;
          else
            retval = SSA_PROP_INTERESTING;
        }
    }

  return retval;
}


/* Visit the conditional statement STMT.  Return SSA_PROP_INTERESTING
   if it can determine which edge will be taken.  Otherwise, return
   SSA_PROP_VARYING.  */

static enum ssa_prop_result
visit_cond_stmt (tree stmt, edge *taken_edge_p)
{
  prop_value_t val;
  basic_block block;

  block = bb_for_stmt (stmt);
  val = evaluate_stmt (stmt);

  /* Find which edge out of the conditional block will be taken and add it
     to the worklist.  If no single edge can be determined statically,
     return SSA_PROP_VARYING to feed all the outgoing edges to the
     propagation engine.  */
  *taken_edge_p = val.value ? find_taken_edge (block, val.value) : 0;
  if (*taken_edge_p)
    return SSA_PROP_INTERESTING;
  else
    return SSA_PROP_VARYING;
}


/* Evaluate statement STMT.  If the statement produces an output value and
   its evaluation changes the lattice value of its output, return
   SSA_PROP_INTERESTING and set *OUTPUT_P to the SSA_NAME holding the
   output value.

   If STMT is a conditional branch and we can determine its truth
   value, set *TAKEN_EDGE_P accordingly.  If STMT produces a varying
   value, return SSA_PROP_VARYING.  */

static enum ssa_prop_result
ccp_visit_stmt (tree stmt, edge *taken_edge_p, tree *output_p)
{
  tree def;
  ssa_op_iter iter;

  if (dump_file && (dump_flags & TDF_DETAILS))
    {
      fprintf (dump_file, "\nVisiting statement:\n");
      print_generic_stmt (dump_file, stmt, dump_flags);
      fprintf (dump_file, "\n");
    }

  if (TREE_CODE (stmt) == MODIFY_EXPR)
    {
      /* If the statement is an assignment that produces a single
         output value, evaluate its RHS to see if the lattice value of
         its output has changed.  */
      return visit_assignment (stmt, output_p);
    }
  else if (TREE_CODE (stmt) == COND_EXPR || TREE_CODE (stmt) == SWITCH_EXPR)
    {
      /* If STMT is a conditional branch, see if we can determine
         which branch will be taken.  */
      return visit_cond_stmt (stmt, taken_edge_p);
    }

  /* Any other kind of statement is not interesting for constant
     propagation and, therefore, not worth simulating.  */
  if (dump_file && (dump_flags & TDF_DETAILS))
    fprintf (dump_file, "No interesting values produced.  Marked VARYING.\n");

  /* Definitions made by statements other than assignments to
     SSA_NAMEs represent unknown modifications to their outputs.
     Mark them VARYING.  */
  FOR_EACH_SSA_TREE_OPERAND (def, stmt, iter, SSA_OP_ALL_DEFS)
    {
      prop_value_t v = { VARYING, NULL_TREE, NULL_TREE };
      set_lattice_value (def, v);
    }

  return SSA_PROP_VARYING;
}


/* Main entry point for SSA Conditional Constant Propagation.  */

static void
execute_ssa_ccp (bool store_ccp)
{
  do_store_ccp = store_ccp;
  ccp_initialize ();
  ssa_propagate (ccp_visit_stmt, ccp_visit_phi_node);
  ccp_finalize ();
}


static unsigned int
do_ssa_ccp (void)
{
  execute_ssa_ccp (false);
  return 0;
}


static bool
gate_ccp (void)
{
  return flag_tree_ccp != 0;
}


struct tree_opt_pass pass_ccp =
{
  "ccp",                                /* name */
  gate_ccp,                             /* gate */
  do_ssa_ccp,                           /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  TV_TREE_CCP,                          /* tv_id */
  PROP_cfg | PROP_ssa | PROP_alias,     /* properties_required */
  0,                                    /* properties_provided */
  PROP_smt_usage,                       /* properties_destroyed */
  0,                                    /* todo_flags_start */
  TODO_cleanup_cfg | TODO_dump_func | TODO_update_ssa
    | TODO_ggc_collect | TODO_verify_ssa
    | TODO_verify_stmts | TODO_update_smt_usage, /* todo_flags_finish */

};


static unsigned int
do_ssa_store_ccp (void)
{
  /* If STORE-CCP is not enabled, we just run regular CCP.  */
  execute_ssa_ccp (flag_tree_store_ccp != 0);
  return 0;
}

static bool
gate_store_ccp (void)
{
  /* STORE-CCP is enabled only with -ftree-store-ccp, but when
     -fno-tree-store-ccp is specified, we should run regular CCP.
     That's why the pass is enabled with either flag.  */
  return flag_tree_store_ccp != 0 || flag_tree_ccp != 0;
}


struct tree_opt_pass pass_store_ccp =
{
  "store_ccp",                          /* name */
  gate_store_ccp,                       /* gate */
  do_ssa_store_ccp,                     /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  TV_TREE_STORE_CCP,                    /* tv_id */
  PROP_cfg | PROP_ssa | PROP_alias,     /* properties_required */
  0,                                    /* properties_provided */
  PROP_smt_usage,                       /* properties_destroyed */
  0,                                    /* todo_flags_start */
  TODO_dump_func | TODO_update_ssa
    | TODO_ggc_collect | TODO_verify_ssa
    | TODO_cleanup_cfg
    | TODO_verify_stmts | TODO_update_smt_usage, /* todo_flags_finish */

};

/* Given a constant value VAL for bitfield FIELD, and a destination
   variable VAR, return VAL appropriately widened to fit into VAR.  If
   FIELD is wider than HOST_WIDE_INT, NULL is returned.  */

tree
widen_bitfield (tree val, tree field, tree var)
{
  unsigned HOST_WIDE_INT var_size, field_size;
  tree wide_val;
  unsigned HOST_WIDE_INT mask;
  unsigned int i;

  /* We can only do this if the size of the type and field and VAL are
     all constants representable in HOST_WIDE_INT.  */
  if (!host_integerp (TYPE_SIZE (TREE_TYPE (var)), 1)
      || !host_integerp (DECL_SIZE (field), 1)
      || !host_integerp (val, 0))
    return NULL_TREE;

  var_size = tree_low_cst (TYPE_SIZE (TREE_TYPE (var)), 1);
  field_size = tree_low_cst (DECL_SIZE (field), 1);

  /* Give up if either the bitfield or the variable are too wide.  */
  if (field_size > HOST_BITS_PER_WIDE_INT || var_size > HOST_BITS_PER_WIDE_INT)
    return NULL_TREE;

  gcc_assert (var_size >= field_size);

  /* If the sign bit of the value is not set or the field's type is unsigned,
     just mask off the high order bits of the value.  */
  if (DECL_UNSIGNED (field)
      || !(tree_low_cst (val, 0) & (((HOST_WIDE_INT)1) << (field_size - 1))))
    {
      /* Zero extension.  Build a mask with the lower 'field_size' bits
         set and a BIT_AND_EXPR node to clear the high order bits of
         the value.  */
      for (i = 0, mask = 0; i < field_size; i++)
        mask |= ((HOST_WIDE_INT) 1) << i;

      wide_val = fold_build2 (BIT_AND_EXPR, TREE_TYPE (var), val,
                              build_int_cst (TREE_TYPE (var), mask));
    }
  else
    {
      /* Sign extension.  Create a mask with the upper
         'var_size - field_size' bits set and a BIT_IOR_EXPR to set
         the high order bits of the value.  */
      for (i = 0, mask = 0; i < (var_size - field_size); i++)
        mask |= ((HOST_WIDE_INT) 1) << (var_size - i - 1);

      wide_val = fold_build2 (BIT_IOR_EXPR, TREE_TYPE (var), val,
                              build_int_cst (TREE_TYPE (var), mask));
    }

  return wide_val;
}
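
/* A worked example: for a signed 3-bit bitfield stored in an 8-bit
   variable, widening the value 6 (binary 110, sign bit set) takes the
   sign-extension branch and ORs in the mask 11111000, yielding
   11111110, i.e. -2 when interpreted as a signed 8-bit value.  For an
   unsigned field, the value is instead ANDed with 00000111, leaving 6.  */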


/* A subroutine of fold_stmt_r.  Attempts to fold *(A+O) to A[X].
   BASE is an array type.  OFFSET is a byte displacement.  ORIG_TYPE
   is the desired result type.  */

static tree
maybe_fold_offset_to_array_ref (tree base, tree offset, tree orig_type)
{
  tree min_idx, idx, elt_offset = integer_zero_node;
  tree array_type, elt_type, elt_size;

  /* If BASE is an ARRAY_REF, we can pick up another offset (this time
     measured in units of the size of the element type) from that ARRAY_REF.
     We can't do anything if either is variable.

     The case we handle here is *(&A[N]+O).  */
  if (TREE_CODE (base) == ARRAY_REF)
    {
      tree low_bound = array_ref_low_bound (base);

      elt_offset = TREE_OPERAND (base, 1);
      if (TREE_CODE (low_bound) != INTEGER_CST
          || TREE_CODE (elt_offset) != INTEGER_CST)
        return NULL_TREE;

      elt_offset = int_const_binop (MINUS_EXPR, elt_offset, low_bound, 0);
      base = TREE_OPERAND (base, 0);
    }

  /* Ignore stupid user tricks of indexing non-array variables.  */
  array_type = TREE_TYPE (base);
  if (TREE_CODE (array_type) != ARRAY_TYPE)
    return NULL_TREE;
  elt_type = TREE_TYPE (array_type);
  if (!lang_hooks.types_compatible_p (orig_type, elt_type))
    return NULL_TREE;

  /* If OFFSET and ELT_OFFSET are zero, we don't care about the size of the
     element type (so we can use the alignment if it's not constant).
     Otherwise, compute the offset as an index by using a division.  If the
     division isn't exact, then don't do anything.  */
  elt_size = TYPE_SIZE_UNIT (elt_type);
  if (integer_zerop (offset))
    {
      if (TREE_CODE (elt_size) != INTEGER_CST)
        elt_size = size_int (TYPE_ALIGN (elt_type));

      idx = integer_zero_node;
    }
  else
    {
      unsigned HOST_WIDE_INT lquo, lrem;
      HOST_WIDE_INT hquo, hrem;

      if (TREE_CODE (elt_size) != INTEGER_CST
          || div_and_round_double (TRUNC_DIV_EXPR, 1,
                                   TREE_INT_CST_LOW (offset),
                                   TREE_INT_CST_HIGH (offset),
                                   TREE_INT_CST_LOW (elt_size),
                                   TREE_INT_CST_HIGH (elt_size),
                                   &lquo, &hquo, &lrem, &hrem)
          || lrem || hrem)
        return NULL_TREE;

      idx = build_int_cst_wide (NULL_TREE, lquo, hquo);
    }

  /* Assume the low bound is zero.  If there is a domain type, get the
     low bound, if any, convert the index into that type, and add the
     low bound.  */
  min_idx = integer_zero_node;
  if (TYPE_DOMAIN (array_type))
    {
      if (TYPE_MIN_VALUE (TYPE_DOMAIN (array_type)))
        min_idx = TYPE_MIN_VALUE (TYPE_DOMAIN (array_type));
      else
        min_idx = fold_convert (TYPE_DOMAIN (array_type), min_idx);

      if (TREE_CODE (min_idx) != INTEGER_CST)
        return NULL_TREE;

      idx = fold_convert (TYPE_DOMAIN (array_type), idx);
      elt_offset = fold_convert (TYPE_DOMAIN (array_type), elt_offset);
    }

  if (!integer_zerop (min_idx))
    idx = int_const_binop (PLUS_EXPR, idx, min_idx, 0);
  if (!integer_zerop (elt_offset))
    idx = int_const_binop (PLUS_EXPR, idx, elt_offset, 0);

  return build4 (ARRAY_REF, orig_type, base, idx, min_idx,
                 size_int (tree_low_cst (elt_size, 1)
                           / (TYPE_ALIGN_UNIT (elt_type))));
}


/* A subroutine of fold_stmt_r.  Attempts to fold *(S+O) to S.X.
   BASE is an expression of record type, or a pointer to one when
   BASE_IS_PTR is true.  OFFSET is a byte displacement.  ORIG_TYPE
   is the desired result type.  */
/* ??? This doesn't handle class inheritance.  */
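
/* Schematically (illustrative only, assuming a 4-byte int and no
   padding): for "struct S { int x; int y; } s", folding *(&s + 4)
   with ORIG_TYPE int yields s.y, since the byte position of 'y' is 4.
   Offsets that land inside an aggregate field recurse into that
   field, and a trailing array field is tried as a flexible-array
   access.  */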

static tree
maybe_fold_offset_to_component_ref (tree record_type, tree base, tree offset,
                                    tree orig_type, bool base_is_ptr)
{
  tree f, t, field_type, tail_array_field, field_offset;

  if (TREE_CODE (record_type) != RECORD_TYPE
      && TREE_CODE (record_type) != UNION_TYPE
      && TREE_CODE (record_type) != QUAL_UNION_TYPE)
    return NULL_TREE;

  /* Short-circuit silly cases.  */
  if (lang_hooks.types_compatible_p (record_type, orig_type))
    return NULL_TREE;

  tail_array_field = NULL_TREE;
  for (f = TYPE_FIELDS (record_type); f ; f = TREE_CHAIN (f))
    {
      int cmp;

      if (TREE_CODE (f) != FIELD_DECL)
        continue;
      if (DECL_BIT_FIELD (f))
        continue;

      field_offset = byte_position (f);
      if (TREE_CODE (field_offset) != INTEGER_CST)
        continue;

      /* ??? Java creates "interesting" fields for representing base classes.
         They have no name, and have no context.  With no context, we get into
         trouble with nonoverlapping_component_refs_p.  Skip them.  */
      if (!DECL_FIELD_CONTEXT (f))
        continue;

      /* The previous array field isn't at the end.  */
      tail_array_field = NULL_TREE;

      /* Check to see if this offset overlaps with the field.  */
      cmp = tree_int_cst_compare (field_offset, offset);
      if (cmp > 0)
        continue;

      field_type = TREE_TYPE (f);

      /* Here we exactly match the offset being checked.  If the types match,
         then we can return that field.  */
      if (cmp == 0
          && lang_hooks.types_compatible_p (orig_type, field_type))
        {
          if (base_is_ptr)
            base = build1 (INDIRECT_REF, record_type, base);
          t = build3 (COMPONENT_REF, field_type, base, f, NULL_TREE);
          return t;
        }

      /* Don't care about offsets into the middle of scalars.  */
      if (!AGGREGATE_TYPE_P (field_type))
        continue;

      /* Check for an array at the end of the struct.  This is often
         used for flexible array members.  We should be able to turn
         this into an array access anyway.  */
      if (TREE_CODE (field_type) == ARRAY_TYPE)
        tail_array_field = f;

      /* Check the end of the field against the offset.  */
      if (!DECL_SIZE_UNIT (f)
          || TREE_CODE (DECL_SIZE_UNIT (f)) != INTEGER_CST)
        continue;
      t = int_const_binop (MINUS_EXPR, offset, field_offset, 1);
      if (!tree_int_cst_lt (t, DECL_SIZE_UNIT (f)))
        continue;

      /* If we matched, then set offset to the displacement into
         this field.  */
      offset = t;
      goto found;
    }

  if (!tail_array_field)
    return NULL_TREE;

  f = tail_array_field;
  field_type = TREE_TYPE (f);
  offset = int_const_binop (MINUS_EXPR, offset, byte_position (f), 1);

 found:
  /* If we get here, we've got an aggregate field, and a possibly
     nonzero offset into it.  Recurse and hope for a valid match.  */
  if (base_is_ptr)
    base = build1 (INDIRECT_REF, record_type, base);
  base = build3 (COMPONENT_REF, field_type, base, f, NULL_TREE);

  t = maybe_fold_offset_to_array_ref (base, offset, orig_type);
  if (t)
    return t;
  return maybe_fold_offset_to_component_ref (field_type, base, offset,
                                             orig_type, false);
}


/* A subroutine of fold_stmt_r.  Attempt to simplify *(BASE+OFFSET).
   Return the simplified expression, or NULL if nothing could be done.  */
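
/* Typical simplifications (sketches, not exhaustive): *&x becomes x
   when the types agree; *(&a + 8) becomes a[2] or s.f via the helpers
   above; and a read from a constant string such as *"abc" folds to
   the character constant.  Out-of-range accesses like "_"[3] would
   ideally become a __builtin_trap, but per the FIXME below they are
   currently folded to zero.  */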

static tree
maybe_fold_stmt_indirect (tree expr, tree base, tree offset)
{
  tree t;

  /* We may well have constructed a double-nested PLUS_EXPR via multiple
     substitutions.  Fold that down to one.  Remove NON_LVALUE_EXPRs that
     are sometimes added.  */
  base = fold (base);
  STRIP_TYPE_NOPS (base);
  TREE_OPERAND (expr, 0) = base;

  /* One possibility is that the address reduces to a string constant.  */
  t = fold_read_from_constant_string (expr);
  if (t)
    return t;

  /* Add in any offset from a PLUS_EXPR.  */
  if (TREE_CODE (base) == PLUS_EXPR)
    {
      tree offset2;

      offset2 = TREE_OPERAND (base, 1);
      if (TREE_CODE (offset2) != INTEGER_CST)
        return NULL_TREE;
      base = TREE_OPERAND (base, 0);

      offset = int_const_binop (PLUS_EXPR, offset, offset2, 1);
    }

  if (TREE_CODE (base) == ADDR_EXPR)
    {
      /* Strip the ADDR_EXPR.  */
      base = TREE_OPERAND (base, 0);

      /* Fold away CONST_DECL to its value, if the type is scalar.  */
      if (TREE_CODE (base) == CONST_DECL
          && ccp_decl_initial_min_invariant (DECL_INITIAL (base)))
        return DECL_INITIAL (base);

      /* Try folding *(&B+O) to B[X].  */
      t = maybe_fold_offset_to_array_ref (base, offset, TREE_TYPE (expr));
      if (t)
        return t;

      /* Try folding *(&B+O) to B.X.  */
      t = maybe_fold_offset_to_component_ref (TREE_TYPE (base), base, offset,
                                              TREE_TYPE (expr), false);
      if (t)
        return t;

      /* Fold *&B to B.  We can only do this if EXPR is the same type
         as BASE.  We can't do this if EXPR is the element type of an array
         and BASE is the array.  */
      if (integer_zerop (offset)
          && lang_hooks.types_compatible_p (TREE_TYPE (base),
                                            TREE_TYPE (expr)))
        return base;
    }
  else
    {
      /* We can get here for out-of-range string constant accesses,
         such as "_"[3].  Bail out of the entire substitution search
         and arrange for the entire statement to be replaced by a
         call to __builtin_trap.  In all likelihood this will all be
         constant-folded away, but in the meantime we can't leave with
         something that get_expr_operands can't understand.  */

      t = base;
      STRIP_NOPS (t);
      if (TREE_CODE (t) == ADDR_EXPR
          && TREE_CODE (TREE_OPERAND (t, 0)) == STRING_CST)
        {
          /* FIXME: Except that this causes problems elsewhere with dead
             code not being deleted, and we die in the rtl expanders
             because we failed to remove some ssa_name.  In the meantime,
             just return zero.  */
          /* FIXME2: This condition should be signaled by
             fold_read_from_constant_string directly, rather than
             re-checking for it here.  */
          return integer_zero_node;
        }

      /* Try folding *(B+O) to B->X.  Still an improvement.  */
      if (POINTER_TYPE_P (TREE_TYPE (base)))
        {
          t = maybe_fold_offset_to_component_ref (TREE_TYPE (TREE_TYPE (base)),
                                                  base, offset,
                                                  TREE_TYPE (expr), true);
          if (t)
            return t;
        }
    }

  /* Otherwise we had an offset that we could not simplify.  */
  return NULL_TREE;
}


/* A subroutine of fold_stmt_r.  EXPR is a PLUS_EXPR or MINUS_EXPR.

   A quaint feature extant in our address arithmetic is that there
   can be hidden type changes here.  The type of the result need
   not be the same as the type of the input pointer.

   What we're after here is an expression of the form
        (T *)(&array + const)
   where the cast doesn't actually exist, but is implicit in the
   type of the PLUS_EXPR.  We'd like to turn this into
        &array[x]
   which may be able to propagate further.  */
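
/* Concretely (an illustrative sketch, assuming a 4-byte int): for
   "int a[8]", the expression (int *)&a + 8 becomes &a[2], and
   &a[1] + 4 is first expanded through the ARRAY_REF to a byte offset
   of 1*4 + 4 = 8 and then likewise becomes &a[2].  */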

static tree
maybe_fold_stmt_addition (tree expr)
{
  tree op0 = TREE_OPERAND (expr, 0);
  tree op1 = TREE_OPERAND (expr, 1);
  tree ptr_type = TREE_TYPE (expr);
  tree ptd_type;
  tree t;
  bool subtract = (TREE_CODE (expr) == MINUS_EXPR);

  /* We're only interested in pointer arithmetic.  */
  if (!POINTER_TYPE_P (ptr_type))
    return NULL_TREE;
  /* Canonicalize the integral operand to op1.  */
  if (INTEGRAL_TYPE_P (TREE_TYPE (op0)))
    {
      if (subtract)
        return NULL_TREE;
      t = op0, op0 = op1, op1 = t;
    }
  /* It had better be a constant.  */
  if (TREE_CODE (op1) != INTEGER_CST)
    return NULL_TREE;
  /* The first operand should be an ADDR_EXPR.  */
  if (TREE_CODE (op0) != ADDR_EXPR)
    return NULL_TREE;
  op0 = TREE_OPERAND (op0, 0);

  /* If the first operand is an ARRAY_REF, expand it so that we can fold
     the offset into it.  */
  while (TREE_CODE (op0) == ARRAY_REF)
    {
      tree array_obj = TREE_OPERAND (op0, 0);
      tree array_idx = TREE_OPERAND (op0, 1);
      tree elt_type = TREE_TYPE (op0);
      tree elt_size = TYPE_SIZE_UNIT (elt_type);
      tree min_idx;

      if (TREE_CODE (array_idx) != INTEGER_CST)
        break;
      if (TREE_CODE (elt_size) != INTEGER_CST)
        break;

      /* Un-bias the index by the min index of the array type.  */
      min_idx = TYPE_DOMAIN (TREE_TYPE (array_obj));
      if (min_idx)
        {
          min_idx = TYPE_MIN_VALUE (min_idx);
          if (min_idx)
            {
              if (TREE_CODE (min_idx) != INTEGER_CST)
                break;

              array_idx = fold_convert (TREE_TYPE (min_idx), array_idx);
              if (!integer_zerop (min_idx))
                array_idx = int_const_binop (MINUS_EXPR, array_idx,
                                             min_idx, 0);
            }
        }

      /* Convert the index to a byte offset.  */
      array_idx = fold_convert (sizetype, array_idx);
      array_idx = int_const_binop (MULT_EXPR, array_idx, elt_size, 0);

      /* Update the operands for the next round, or for folding.  */
      /* If we're manipulating unsigned types, then folding into negative
         values can produce incorrect results, particularly if the type
         is smaller than the width of the pointer.  */
      if (subtract
          && TYPE_UNSIGNED (TREE_TYPE (op1))
          && tree_int_cst_lt (array_idx, op1))
        return NULL;
      op1 = int_const_binop (subtract ? MINUS_EXPR : PLUS_EXPR,
                             array_idx, op1, 0);
      subtract = false;
      op0 = array_obj;
    }

  /* If we weren't able to fold the subtraction into another array reference,
     canonicalize the integer for passing to the array and component ref
     simplification functions.  */
  if (subtract)
    {
      if (TYPE_UNSIGNED (TREE_TYPE (op1)))
        return NULL;
      op1 = fold_unary (NEGATE_EXPR, TREE_TYPE (op1), op1);
      /* ??? In theory fold should always produce another integer.  */
      if (op1 == NULL || TREE_CODE (op1) != INTEGER_CST)
        return NULL;
    }

  ptd_type = TREE_TYPE (ptr_type);

  /* At which point we can try some of the same things as for indirects.  */
  t = maybe_fold_offset_to_array_ref (op0, op1, ptd_type);
  if (!t)
    t = maybe_fold_offset_to_component_ref (TREE_TYPE (op0), op0, op1,
                                            ptd_type, false);
  if (t)
    t = build1 (ADDR_EXPR, ptr_type, t);

  return t;
}

/* For passing state through walk_tree into fold_stmt_r and its
   children.  */

struct fold_stmt_r_data
{
  tree stmt;
  bool *changed_p;
  bool *inside_addr_expr_p;
};

/* Subroutine of fold_stmt called via walk_tree.  We perform several
   simplifications of EXPR_P, mostly having to do with pointer arithmetic.  */

static tree
fold_stmt_r (tree *expr_p, int *walk_subtrees, void *data)
{
  struct fold_stmt_r_data *fold_stmt_r_data = (struct fold_stmt_r_data *) data;
  bool *inside_addr_expr_p = fold_stmt_r_data->inside_addr_expr_p;
  bool *changed_p = fold_stmt_r_data->changed_p;
  tree expr = *expr_p, t;

  /* ??? It'd be nice if walk_tree had a pre-order option.  */
  switch (TREE_CODE (expr))
    {
    case INDIRECT_REF:
      t = walk_tree (&TREE_OPERAND (expr, 0), fold_stmt_r, data, NULL);
      if (t)
        return t;
      *walk_subtrees = 0;

      t = maybe_fold_stmt_indirect (expr, TREE_OPERAND (expr, 0),
                                    integer_zero_node);
      break;

      /* ??? Could handle more ARRAY_REFs here, as a variant of INDIRECT_REF.
         We'd only want to bother decomposing an existing ARRAY_REF if
         the base array is found to have another offset contained within.
         Otherwise we'd be wasting time.  */
    case ARRAY_REF:
      /* If we are not processing expressions found within an
         ADDR_EXPR, then we can fold constant array references.  */
      if (!*inside_addr_expr_p)
        t = fold_read_from_constant_string (expr);
      else
        t = NULL;
      break;

    case ADDR_EXPR:
      *inside_addr_expr_p = true;
      t = walk_tree (&TREE_OPERAND (expr, 0), fold_stmt_r, data, NULL);
      *inside_addr_expr_p = false;
      if (t)
        return t;
      *walk_subtrees = 0;

      /* Set TREE_INVARIANT properly so that the value is properly
         considered constant, and so gets propagated as expected.  */
      if (*changed_p)
        recompute_tree_invariant_for_addr_expr (expr);
      return NULL_TREE;

    case PLUS_EXPR:
    case MINUS_EXPR:
      t = walk_tree (&TREE_OPERAND (expr, 0), fold_stmt_r, data, NULL);
      if (t)
        return t;
      t = walk_tree (&TREE_OPERAND (expr, 1), fold_stmt_r, data, NULL);
      if (t)
        return t;
      *walk_subtrees = 0;

      t = maybe_fold_stmt_addition (expr);
      break;

    case COMPONENT_REF:
      t = walk_tree (&TREE_OPERAND (expr, 0), fold_stmt_r, data, NULL);
      if (t)
        return t;
      *walk_subtrees = 0;

      /* Make sure the FIELD_DECL is actually a field in the type on the lhs.
         We've already checked that the records are compatible, so we should
         come up with a set of compatible fields.  */
      {
        tree expr_record = TREE_TYPE (TREE_OPERAND (expr, 0));
        tree expr_field = TREE_OPERAND (expr, 1);

        if (DECL_FIELD_CONTEXT (expr_field) != TYPE_MAIN_VARIANT (expr_record))
          {
            expr_field = find_compatible_field (expr_record, expr_field);
            TREE_OPERAND (expr, 1) = expr_field;
          }
      }
      break;

    case TARGET_MEM_REF:
      t = maybe_fold_tmr (expr);
      break;

    case COND_EXPR:
      if (COMPARISON_CLASS_P (TREE_OPERAND (expr, 0)))
        {
          tree op0 = TREE_OPERAND (expr, 0);
          tree tem;
          bool set;

          fold_defer_overflow_warnings ();
          tem = fold_binary (TREE_CODE (op0), TREE_TYPE (op0),
                             TREE_OPERAND (op0, 0),
                             TREE_OPERAND (op0, 1));
          set = tem && set_rhs (expr_p, tem);
          fold_undefer_overflow_warnings (set, fold_stmt_r_data->stmt, 0);
          if (set)
            {
              t = *expr_p;
              break;
            }
        }

    default:
      return NULL_TREE;
    }

  if (t)
    {
      *expr_p = t;
      *changed_p = true;
    }

  return NULL_TREE;
}


/* Return the string length, maximum string length or maximum value of
   ARG in LENGTH.
   If ARG is an SSA name variable, follow its use-def chains.  If LENGTH
   is not NULL and, for TYPE == 0, its value is not equal to the length
   we determine, or if we are unable to determine the length or value,
   return false.  VISITED is a bitmap of visited variables.
   TYPE is 0 if string length should be returned, 1 for maximum string
   length and 2 for maximum value ARG can have.  */
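
/* For example (schematic): if P_1 = PHI <"foo", "bar">, both arguments
   have string length 3, so for TYPE == 0 we store 3 in *LENGTH and
   return true.  Were one arm "quux" instead, the lengths 3 and 4 would
   disagree and we would return false for TYPE == 0, while for
   TYPE == 1 we would keep the maximum, 4.  */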

static bool
get_maxval_strlen (tree arg, tree *length, bitmap visited, int type)
{
  tree var, def_stmt, val;

  if (TREE_CODE (arg) != SSA_NAME)
    {
      if (type == 2)
        {
          val = arg;
          if (TREE_CODE (val) != INTEGER_CST
              || tree_int_cst_sgn (val) < 0)
            return false;
        }
      else
        val = c_strlen (arg, 1);
      if (!val)
        return false;

      if (*length)
        {
          if (type > 0)
            {
              if (TREE_CODE (*length) != INTEGER_CST
                  || TREE_CODE (val) != INTEGER_CST)
                return false;

              if (tree_int_cst_lt (*length, val))
                *length = val;
              return true;
            }
          else if (simple_cst_equal (val, *length) != 1)
            return false;
        }

      *length = val;
      return true;
    }

  /* If we were already here, break the infinite cycle.  */
  if (bitmap_bit_p (visited, SSA_NAME_VERSION (arg)))
    return true;
  bitmap_set_bit (visited, SSA_NAME_VERSION (arg));

  var = arg;
  def_stmt = SSA_NAME_DEF_STMT (var);

  switch (TREE_CODE (def_stmt))
    {
      case MODIFY_EXPR:
        {
          tree rhs;

          /* The RHS of the statement defining VAR must either have a
             constant length or come from another SSA_NAME with a constant
             length.  */
          rhs = TREE_OPERAND (def_stmt, 1);
          STRIP_NOPS (rhs);
          return get_maxval_strlen (rhs, length, visited, type);
        }

      case PHI_NODE:
        {
          /* All the arguments of the PHI node must have the same constant
             length.  */
          int i;

          for (i = 0; i < PHI_NUM_ARGS (def_stmt); i++)
            {
              tree arg = PHI_ARG_DEF (def_stmt, i);

              /* If this PHI has itself as an argument, we cannot
                 determine the string length of this argument.  However,
                 if we can find a constant string length for the other
                 PHI args then we can still be sure that this is a
                 constant string length.  So be optimistic and just
                 continue with the next argument.  */
              if (arg == PHI_RESULT (def_stmt))
                continue;

              if (!get_maxval_strlen (arg, length, visited, type))
                return false;
            }

          return true;
        }

      default:
        break;
    }

  return false;
}


/* Fold builtin call FN in statement STMT.  If it cannot be folded into a
   constant, return NULL_TREE.  Otherwise, return its constant value.  */
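
/* Illustrative cases (sketches only): if CCP has determined that the
   argument of a strlen call points to a string of length 5, the call
   folds to the constant 5; a strcpy whose source length is known can
   be folded by fold_builtin_strcpy; and the *_chk variants use the
   maximum object size gathered here to decide whether the checked
   call can be rewritten as its unchecked form.  */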

static tree
ccp_fold_builtin (tree stmt, tree fn)
{
  tree result, val[3];
  tree callee, arglist, a;
  int arg_mask, i, type;
  bitmap visited;
  bool ignore;

  ignore = TREE_CODE (stmt) != MODIFY_EXPR;

  /* First try the generic builtin folder.  If that succeeds, return the
     result directly.  */
  callee = get_callee_fndecl (fn);
  arglist = TREE_OPERAND (fn, 1);
  result = fold_builtin (callee, arglist, ignore);
  if (result)
    {
      if (ignore)
        STRIP_NOPS (result);
      return result;
    }

  /* Ignore MD builtins.  */
  if (DECL_BUILT_IN_CLASS (callee) == BUILT_IN_MD)
    return NULL_TREE;

  /* If the builtin could not be folded, and it has no argument list,
     we're done.  */
  if (!arglist)
    return NULL_TREE;

  /* Limit the work only for builtins we know how to simplify.  */
  switch (DECL_FUNCTION_CODE (callee))
    {
    case BUILT_IN_STRLEN:
    case BUILT_IN_FPUTS:
    case BUILT_IN_FPUTS_UNLOCKED:
      arg_mask = 1;
      type = 0;
      break;
    case BUILT_IN_STRCPY:
    case BUILT_IN_STRNCPY:
      arg_mask = 2;
      type = 0;
      break;
    case BUILT_IN_MEMCPY_CHK:
    case BUILT_IN_MEMPCPY_CHK:
    case BUILT_IN_MEMMOVE_CHK:
    case BUILT_IN_MEMSET_CHK:
    case BUILT_IN_STRNCPY_CHK:
      arg_mask = 4;
      type = 2;
      break;
    case BUILT_IN_STRCPY_CHK:
    case BUILT_IN_STPCPY_CHK:
      arg_mask = 2;
      type = 1;
      break;
    case BUILT_IN_SNPRINTF_CHK:
    case BUILT_IN_VSNPRINTF_CHK:
      arg_mask = 2;
      type = 2;
      break;
    default:
      return NULL_TREE;
    }

  /* Try to use the dataflow information gathered by the CCP process.  */
  visited = BITMAP_ALLOC (NULL);

  memset (val, 0, sizeof (val));
  for (i = 0, a = arglist;
       arg_mask;
       i++, arg_mask >>= 1, a = TREE_CHAIN (a))
    if (arg_mask & 1)
      {
        bitmap_clear (visited);
        if (!get_maxval_strlen (TREE_VALUE (a), &val[i], visited, type))
          val[i] = NULL_TREE;
      }

  BITMAP_FREE (visited);

  result = NULL_TREE;
  switch (DECL_FUNCTION_CODE (callee))
    {
    case BUILT_IN_STRLEN:
      if (val[0])
        {
          tree new = fold_convert (TREE_TYPE (fn), val[0]);

          /* If the result is not a valid gimple value, or not a cast
             of a valid gimple value, then we cannot use the result.  */
          if (is_gimple_val (new)
              || (is_gimple_cast (new)
                  && is_gimple_val (TREE_OPERAND (new, 0))))
            return new;
        }
      break;

    case BUILT_IN_STRCPY:
      if (val[1] && is_gimple_val (val[1]))
        result = fold_builtin_strcpy (callee, arglist, val[1]);
      break;

    case BUILT_IN_STRNCPY:
      if (val[1] && is_gimple_val (val[1]))
        result = fold_builtin_strncpy (callee, arglist, val[1]);
      break;

    case BUILT_IN_FPUTS:
      result = fold_builtin_fputs (arglist,
                                   TREE_CODE (stmt) != MODIFY_EXPR, 0,
                                   val[0]);
      break;

    case BUILT_IN_FPUTS_UNLOCKED:
      result = fold_builtin_fputs (arglist,
                                   TREE_CODE (stmt) != MODIFY_EXPR, 1,
                                   val[0]);
      break;

    case BUILT_IN_MEMCPY_CHK:
    case BUILT_IN_MEMPCPY_CHK:
    case BUILT_IN_MEMMOVE_CHK:
    case BUILT_IN_MEMSET_CHK:
      if (val[2] && is_gimple_val (val[2]))
        result = fold_builtin_memory_chk (callee, arglist, val[2], ignore,
                                          DECL_FUNCTION_CODE (callee));
      break;

    case BUILT_IN_STRCPY_CHK:
    case BUILT_IN_STPCPY_CHK:
      if (val[1] && is_gimple_val (val[1]))
        result = fold_builtin_stxcpy_chk (callee, arglist, val[1], ignore,
                                          DECL_FUNCTION_CODE (callee));
      break;

    case BUILT_IN_STRNCPY_CHK:
      if (val[2] && is_gimple_val (val[2]))
        result = fold_builtin_strncpy_chk (arglist, val[2]);
      break;

    case BUILT_IN_SNPRINTF_CHK:
    case BUILT_IN_VSNPRINTF_CHK:
      if (val[1] && is_gimple_val (val[1]))
        result = fold_builtin_snprintf_chk (arglist, val[1],
                                            DECL_FUNCTION_CODE (callee));
      break;

    default:
      gcc_unreachable ();
    }

  if (result && ignore)
    result = fold_ignored_result (result);
  return result;
}


/* Fold the statement pointed to by STMT_P.  In some cases, this function may
   replace the whole statement with a new one.  Returns true iff folding
   makes any changes.  */
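
/* A schematic caller (illustrative; the real callers live in the SSA
   propagation engine and elsewhere):

       block_stmt_iterator i = bsi_start (bb);
       if (fold_stmt (bsi_stmt_ptr (i)))
         update_stmt (bsi_stmt (i));

   i.e. fold the statement in place through the iterator's statement
   pointer, then refresh its operand caches if anything changed.  */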

bool
fold_stmt (tree *stmt_p)
{
  tree rhs, result, stmt;
  struct fold_stmt_r_data fold_stmt_r_data;
  bool changed = false;
  bool inside_addr_expr = false;

  stmt = *stmt_p;

  fold_stmt_r_data.stmt = stmt;
  fold_stmt_r_data.changed_p = &changed;
  fold_stmt_r_data.inside_addr_expr_p = &inside_addr_expr;

  /* If we replaced constants and the statement makes pointer dereferences,
     then we may need to fold instances of *&VAR into VAR, etc.  */
  if (walk_tree (stmt_p, fold_stmt_r, &fold_stmt_r_data, NULL))
    {
      *stmt_p
        = build_function_call_expr (implicit_built_in_decls[BUILT_IN_TRAP],
                                    NULL);
      return true;
    }

  rhs = get_rhs (stmt);
  if (!rhs)
    return changed;
  result = NULL_TREE;

  if (TREE_CODE (rhs) == CALL_EXPR)
    {
      tree callee;

      /* Check for builtins that CCP can handle using information not
         available in the generic fold routines.  */
      callee = get_callee_fndecl (rhs);
      if (callee && DECL_BUILT_IN (callee))
        result = ccp_fold_builtin (stmt, rhs);
      else
        {
          /* Check for resolvable OBJ_TYPE_REF.  The only sorts we can resolve
             here are when we've propagated the address of a decl into the
             object slot.  */
          /* ??? Should perhaps do this in fold proper.  However, doing it
             there requires that we create a new CALL_EXPR, and that requires
             copying EH region info to the new node.  Easier to just do it
             here where we can simply smash the call operand.  Also
             CALL_EXPR_RETURN_SLOT_OPT needs to be handled correctly and
             copied; fold_ternary does not have that information.  */
          callee = TREE_OPERAND (rhs, 0);
          if (TREE_CODE (callee) == OBJ_TYPE_REF
              && lang_hooks.fold_obj_type_ref
              && TREE_CODE (OBJ_TYPE_REF_OBJECT (callee)) == ADDR_EXPR
              && DECL_P (TREE_OPERAND
                         (OBJ_TYPE_REF_OBJECT (callee), 0)))
            {
              tree t;

              /* ??? Caution: Broken ADDR_EXPR semantics means that
                 looking at the type of the operand of the addr_expr
                 can yield an array type.  See silly exception in
                 check_pointer_types_r.  */

              t = TREE_TYPE (TREE_TYPE (OBJ_TYPE_REF_OBJECT (callee)));
              t = lang_hooks.fold_obj_type_ref (callee, t);
              if (t)
                {
                  TREE_OPERAND (rhs, 0) = t;
                  changed = true;
                }
            }
        }
    }

  /* If we couldn't fold the RHS, hand over to the generic fold routines.  */
  if (result == NULL_TREE)
    result = fold (rhs);

  /* Strip away useless type conversions.  Both the NON_LVALUE_EXPR that
     may have been added by fold, and "useless" type conversions that might
     now be apparent due to propagation.  */
  STRIP_USELESS_TYPE_CONVERSION (result);

  if (result != rhs)
    changed |= set_rhs (stmt_p, result);

  return changed;
}

/* Perform the minimal folding on statement STMT.  Only operations like
   *&x created by constant propagation are handled.  The statement cannot
   be replaced with a new one.  */

bool
fold_stmt_inplace (tree stmt)
{
  tree old_stmt = stmt, rhs, new_rhs;
  struct fold_stmt_r_data fold_stmt_r_data;
  bool changed = false;
  bool inside_addr_expr = false;

  fold_stmt_r_data.stmt = stmt;
  fold_stmt_r_data.changed_p = &changed;
  fold_stmt_r_data.inside_addr_expr_p = &inside_addr_expr;

  walk_tree (&stmt, fold_stmt_r, &fold_stmt_r_data, NULL);
  gcc_assert (stmt == old_stmt);

  rhs = get_rhs (stmt);
  if (!rhs || rhs == stmt)
    return changed;

  new_rhs = fold (rhs);
  STRIP_USELESS_TYPE_CONVERSION (new_rhs);
  if (new_rhs == rhs)
    return changed;

  changed |= set_rhs (&stmt, new_rhs);
  gcc_assert (stmt == old_stmt);

  return changed;
}

/* Convert EXPR into a GIMPLE value suitable for substitution on the
   RHS of an assignment.  Insert the necessary statements before
   iterator *SI_P.  */
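
/* Schematically (illustrative): when a builtin fold produces an
   expression too complex to be a GIMPLE rhs, say the fwrite form of a
   folded fputs call, we gimplify it into

       tmp_1 = ...;              <-- inserted before *SI_P
       tmp_2 = ... tmp_1 ...;

   and return the final temporary, which the caller can then install
   with set_rhs.  */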

static tree
convert_to_gimple_builtin (block_stmt_iterator *si_p, tree expr)
{
  tree_stmt_iterator ti;
  tree stmt = bsi_stmt (*si_p);
  tree tmp, stmts = NULL;

  push_gimplify_context ();
  tmp = get_initialized_tmp_var (expr, &stmts, NULL);
  pop_gimplify_context (NULL);

  if (EXPR_HAS_LOCATION (stmt))
    annotate_all_with_locus (&stmts, EXPR_LOCATION (stmt));

  /* The replacement can expose previously unreferenced variables.  */
  for (ti = tsi_start (stmts); !tsi_end_p (ti); tsi_next (&ti))
    {
      tree new_stmt = tsi_stmt (ti);
      find_new_referenced_vars (tsi_stmt_ptr (ti));
      bsi_insert_before (si_p, new_stmt, BSI_NEW_STMT);
      mark_new_vars_to_rename (bsi_stmt (*si_p));
      bsi_next (si_p);
    }

  return tmp;
}


/* A simple pass that attempts to fold all builtin functions.  This pass
   is run after we've propagated as many constants as we can.  */
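
/* For example (a sketch): once CCP has determined that the argument of
   a strlen call has string length 5, this pass rewrites

       n_1 = strlen (p_2);                  -->  n_1 = 5;
       flag_3 = __builtin_constant_p (x_4); -->  flag_3 = 0;

   where the second rewrite reflects that any __builtin_constant_p
   still unresolved at this point almost certainly isn't constant.  */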

static unsigned int
execute_fold_all_builtins (void)
{
  bool cfg_changed = false;
  basic_block bb;
  FOR_EACH_BB (bb)
    {
      block_stmt_iterator i;
      for (i = bsi_start (bb); !bsi_end_p (i); )
        {
          tree *stmtp = bsi_stmt_ptr (i);
          tree old_stmt = *stmtp;
          tree call = get_rhs (*stmtp);
          tree callee, result;
          enum built_in_function fcode;

          if (!call || TREE_CODE (call) != CALL_EXPR)
            {
              bsi_next (&i);
              continue;
            }
          callee = get_callee_fndecl (call);
          if (!callee || DECL_BUILT_IN_CLASS (callee) != BUILT_IN_NORMAL)
            {
              bsi_next (&i);
              continue;
            }
          fcode = DECL_FUNCTION_CODE (callee);

          result = ccp_fold_builtin (*stmtp, call);
          if (!result)
            switch (DECL_FUNCTION_CODE (callee))
              {
              case BUILT_IN_CONSTANT_P:
                /* Resolve __builtin_constant_p.  If it hasn't been
                   folded to integer_one_node by now, it's fairly
                   certain that the value simply isn't constant.  */
                result = integer_zero_node;
                break;

              default:
                bsi_next (&i);
                continue;
              }

          if (dump_file && (dump_flags & TDF_DETAILS))
            {
              fprintf (dump_file, "Simplified\n  ");
              print_generic_stmt (dump_file, *stmtp, dump_flags);
            }

          if (!set_rhs (stmtp, result))
            {
              result = convert_to_gimple_builtin (&i, result);
              if (result)
                {
                  bool ok = set_rhs (stmtp, result);

                  gcc_assert (ok);
                }
            }
          mark_new_vars_to_rename (*stmtp);
          if (maybe_clean_or_replace_eh_stmt (old_stmt, *stmtp)
              && tree_purge_dead_eh_edges (bb))
            cfg_changed = true;

          if (dump_file && (dump_flags & TDF_DETAILS))
            {
              fprintf (dump_file, "to\n  ");
              print_generic_stmt (dump_file, *stmtp, dump_flags);
              fprintf (dump_file, "\n");
            }

          /* Retry the same statement if it changed into another
             builtin; there might be new opportunities now.  */
          call = get_rhs (*stmtp);
          if (!call || TREE_CODE (call) != CALL_EXPR)
            {
              bsi_next (&i);
              continue;
            }
          callee = get_callee_fndecl (call);
          if (!callee
              || DECL_BUILT_IN_CLASS (callee) != BUILT_IN_NORMAL
              || DECL_FUNCTION_CODE (callee) == fcode)
            bsi_next (&i);
        }
    }

  /* Delete unreachable blocks.  */
  if (cfg_changed)
    cleanup_tree_cfg ();
  return 0;
}


struct tree_opt_pass pass_fold_builtins =
{
  "fab",                                /* name */
  NULL,                                 /* gate */
  execute_fold_all_builtins,            /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  0,                                    /* tv_id */
  PROP_cfg | PROP_ssa | PROP_alias,     /* properties_required */
  0,                                    /* properties_provided */
  0,                                    /* properties_destroyed */
  0,                                    /* todo_flags_start */
  TODO_dump_func
    | TODO_verify_ssa
    | TODO_update_ssa,                  /* todo_flags_finish */

};
