OpenCores Subversion repository: scarts
URL: https://opencores.org/ocsvn/scarts/scarts/trunk
File: scarts/trunk/toolchain/scarts-gcc/gcc-4.1.1/gcc/flow.c (rev 12, author jlechner)
/* Data flow analysis for GNU compiler.
   Copyright (C) 1987, 1988, 1992, 1993, 1994, 1995, 1996, 1997, 1998,
   1999, 2000, 2001, 2002, 2003, 2004, 2005 Free Software Foundation, Inc.

This file is part of GCC.

GCC is free software; you can redistribute it and/or modify it under
the terms of the GNU General Public License as published by the Free
Software Foundation; either version 2, or (at your option) any later
version.

GCC is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or
FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
for more details.

You should have received a copy of the GNU General Public License
along with GCC; see the file COPYING.  If not, write to the Free
Software Foundation, 51 Franklin Street, Fifth Floor, Boston, MA
02110-1301, USA.  */

/* This file contains the data flow analysis pass of the compiler.  It
   computes data flow information which tells combine_instructions
   which insns to consider combining and controls register allocation.

   Additional data flow information that is too bulky to record is
   generated during the analysis, and is used at that time to create
   autoincrement and autodecrement addressing.

   The first step is dividing the function into basic blocks.
   find_basic_blocks does this.  Then life_analysis determines
   where each register is live and where it is dead.

   ** find_basic_blocks **

   find_basic_blocks divides the current function's rtl into basic
   blocks and constructs the CFG.  The blocks are recorded in the
   basic_block_info array; the CFG exists in the edge structures
   referenced by the blocks.

   find_basic_blocks also finds any unreachable loops and deletes them.

   ** life_analysis **

   life_analysis is called immediately after find_basic_blocks.
   It uses the basic block information to determine where each
   hard or pseudo register is live.

   ** live-register info **

   The information about where each register is live is in two parts:
   the REG_NOTES of insns, and the vector basic_block->global_live_at_start.

   basic_block->global_live_at_start has an element for each basic
   block, and the element is a bit-vector with a bit for each hard or
   pseudo register.  The bit is 1 if the register is live at the
   beginning of the basic block.

   Two types of elements can be added to an insn's REG_NOTES.
   A REG_DEAD note is added to an insn's REG_NOTES for any register
   that meets both of two conditions:  The value in the register is not
   needed in subsequent insns and the insn does not replace the value in
   the register (in the case of multi-word hard registers, the value in
   each register must be replaced by the insn to avoid a REG_DEAD note).

   In the vast majority of cases, an object in a REG_DEAD note will be
   used somewhere in the insn.  The (rare) exception to this is if an
   insn uses a multi-word hard register and only some of the registers are
   needed in subsequent insns.  In that case, REG_DEAD notes will be
   provided for those hard registers that are not subsequently needed.
   Partial REG_DEAD notes of this type do not occur when an insn sets
   only some of the hard registers used in such a multi-word operand;
   omitting REG_DEAD notes for objects stored in an insn is optional and
   the desire to do so does not justify the complexity of the partial
   REG_DEAD notes.

   REG_UNUSED notes are added for each register that is set by the insn
   but is unused subsequently (if every register set by the insn is unused
   and the insn does not reference memory or have some other side-effect,
   the insn is deleted instead).  If only part of a multi-word hard
   register is used in a subsequent insn, REG_UNUSED notes are made for
   the parts that will not be used.

   To determine which registers are live after any insn, one can
   start from the beginning of the basic block and scan insns, noting
   which registers are set by each insn and which die there.

   ** Other actions of life_analysis **

   life_analysis sets up the LOG_LINKS fields of insns because the
   information needed to do so is readily available.

   life_analysis deletes insns whose only effect is to store a value
   that is never used.

   life_analysis notices cases where a reference to a register as
   a memory address can be combined with a preceding or following
   incrementation or decrementation of the register.  The separate
   instruction to increment or decrement is deleted and the address
   is changed to a POST_INC or similar rtx.

   Each time an incrementing or decrementing address is created,
   a REG_INC element is added to the insn's REG_NOTES list.

   life_analysis fills in certain vectors containing information about
   register usage: REG_N_REFS, REG_N_DEATHS, REG_N_SETS, REG_LIVE_LENGTH,
   REG_N_CALLS_CROSSED, REG_N_THROWING_CALLS_CROSSED and REG_BASIC_BLOCK.

   life_analysis sets current_function_sp_is_unchanging if the function
   doesn't modify the stack pointer.  */

/* TODO:

   Split out from life_analysis:
        - local property discovery
        - global property computation
        - log links creation
        - pre/post modify transformation
*/
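The global liveness relaxation that life_analysis performs can be sketched in miniature with the classic backward dataflow equations.  The following is a minimal standalone illustration, not GCC code: the three-block CFG, the `use`/`def` masks, and the function name `compute_liveness` are all hypothetical, and plain unsigned masks stand in for GCC's regsets.

```c
#include <assert.h>

/* Toy liveness computation on a hypothetical three-block CFG.
   Registers are bits in an unsigned mask.  use[b] holds registers read
   before being written in block b; def[b] holds registers written there.  */

#define NBLOCKS 3

/* Block 0 falls through to 1; block 1 loops to itself or exits to 2.  */
static const int succ[NBLOCKS][2] = { {1, -1}, {1, 2}, {-1, -1} };
static const unsigned use[NBLOCKS] = { 0x1, 0x2, 0x1 };  /* r0; r1; r0 */
static const unsigned def[NBLOCKS] = { 0x2, 0x0, 0x0 };  /* r1; -;  -  */

static unsigned live_in[NBLOCKS], live_out[NBLOCKS];

/* Iterate the standard backward equations to a fixed point:
   live_out[b] = union of live_in over b's successors;
   live_in[b]  = use[b] | (live_out[b] & ~def[b]).  */
static void
compute_liveness (void)
{
  int changed = 1;
  while (changed)
    {
      changed = 0;
      for (int b = NBLOCKS - 1; b >= 0; b--)
        {
          unsigned out = 0;
          for (int s = 0; s < 2; s++)
            if (succ[b][s] >= 0)
              out |= live_in[succ[b][s]];
          unsigned in = use[b] | (out & ~def[b]);
          if (in != live_in[b] || out != live_out[b])
            {
              changed = 1;
              live_in[b] = in;
              live_out[b] = out;
            }
        }
    }
}
```

After `compute_liveness ()`, r1 (bit 1), set in block 0 and read around the loop in block 1, is live across the loop; r0 (bit 0) is live into the function.  The real pass does the same relaxation over regsets, with the extra wrinkles (conditional execution, dead-code removal between rounds) described above.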
#include "config.h"
#include "system.h"
#include "coretypes.h"
#include "tm.h"
#include "tree.h"
#include "rtl.h"
#include "tm_p.h"
#include "hard-reg-set.h"
#include "basic-block.h"
#include "insn-config.h"
#include "regs.h"
#include "flags.h"
#include "output.h"
#include "function.h"
#include "except.h"
#include "toplev.h"
#include "recog.h"
#include "expr.h"
#include "timevar.h"

#include "obstack.h"
#include "splay-tree.h"
#include "tree-pass.h"
#include "params.h"

#ifndef HAVE_epilogue
#define HAVE_epilogue 0
#endif
#ifndef HAVE_prologue
#define HAVE_prologue 0
#endif
#ifndef HAVE_sibcall_epilogue
#define HAVE_sibcall_epilogue 0
#endif

#ifndef EPILOGUE_USES
#define EPILOGUE_USES(REGNO)  0
#endif
#ifndef EH_USES
#define EH_USES(REGNO)  0
#endif

#ifdef HAVE_conditional_execution
#ifndef REVERSE_CONDEXEC_PREDICATES_P
#define REVERSE_CONDEXEC_PREDICATES_P(x, y) \
  (GET_CODE ((x)) == reversed_comparison_code ((y), NULL))
#endif
#endif

/* This is the maximum number of times we process any given block if the
   latest loop depth count is smaller than this number.  Only used for the
   failure strategy to avoid infinite loops in calculate_global_regs_live.  */
#define MAX_LIVENESS_ROUNDS 20

/* Nonzero if the second flow pass has completed.  */
int flow2_completed;

/* Maximum register number used in this function, plus one.  */

int max_regno;

/* Indexed by n, giving various register information.  */

varray_type reg_n_info;

/* Regset of regs live when calls to `setjmp'-like functions happen.  */
/* ??? Does this exist only for the setjmp-clobbered warning message?  */

static regset regs_live_at_setjmp;

/* List made of EXPR_LIST rtx's which gives pairs of pseudo registers
   that have to go in the same hard reg.
   The first two regs in the list are a pair, and the next two
   are another pair, etc.  */
rtx regs_may_share;

/* Set of registers that may be eliminable.  These are handled specially
   in updating regs_ever_live.  */

static HARD_REG_SET elim_reg_set;

/* Holds information for tracking conditional register life information.  */
struct reg_cond_life_info
{
  /* A boolean expression of conditions under which a register is dead.  */
  rtx condition;
  /* Conditions under which a register is dead at the basic block end.  */
  rtx orig_condition;

  /* A boolean expression of conditions under which a register has been
     stored into.  */
  rtx stores;

  /* ??? Could store mask of bytes that are dead, so that we could finally
     track lifetimes of multi-word registers accessed via subregs.  */
};

/* For use in communicating between propagate_block and its subroutines.
   Holds all information needed to compute life and def-use information.  */

struct propagate_block_info
{
  /* The basic block we're considering.  */
  basic_block bb;

  /* Bit N is set if register N is conditionally or unconditionally live.  */
  regset reg_live;

  /* Bit N is set if register N is set this insn.  */
  regset new_set;

  /* Element N is the next insn that uses (hard or pseudo) register N
     within the current basic block; or zero, if there is no such insn.  */
  rtx *reg_next_use;

  /* Contains a list of all the MEMs we are tracking for dead store
     elimination.  */
  rtx mem_set_list;

  /* If non-null, record the set of registers set unconditionally in the
     basic block.  */
  regset local_set;

  /* If non-null, record the set of registers set conditionally in the
     basic block.  */
  regset cond_local_set;

#ifdef HAVE_conditional_execution
  /* Indexed by register number, holds a reg_cond_life_info for each
     register that is not unconditionally live or dead.  */
  splay_tree reg_cond_dead;

  /* Bit N is set if register N is in an expression in reg_cond_dead.  */
  regset reg_cond_reg;
#endif

  /* The length of mem_set_list.  */
  int mem_set_list_len;

  /* Nonzero if the value of CC0 is live.  */
  int cc0_live;

  /* Flags controlling the set of information propagate_block collects.  */
  int flags;
  /* Index of instruction being processed.  */
  int insn_num;
};

/* Number of dead insns removed.  */
static int ndead;

/* When PROP_REG_INFO is set, this array records pbi->insn_num for the
   instruction where each register died.  When the register is next marked
   alive (remember, we are walking backward), we use this to compute the
   number of instructions its life range crosses: the current
   pbi->insn_num minus reg_deaths[regno].
   At the end of processing each basic block, the remaining live registers
   are inspected and their live ranges are increased the same way, so the
   live ranges of global registers are computed correctly.

   The array is kept clear for dead registers, so it can be safely reused
   for the next basic block without an expensive memset of the whole array
   after resetting pbi->insn_num to 0.  */

static int *reg_deaths;
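The bookkeeping described in the comment above can be modeled in a few lines.  This is a hypothetical standalone sketch, not GCC's code: the names `note_death`, `note_birth`, `reg_deaths_toy`, and `live_length` are invented for illustration, and it assumes insn numbers start at 1 so that 0 can serve as the "not recorded" sentinel.

```c
#include <assert.h>

#define NREGS 4

/* Walking a block backward, a register's death (last use) is seen before
   its birth (the set).  Record the insn number at death; on birth, credit
   insn_num - reg_deaths_toy[regno] instructions of live range and clear
   the entry so the array stays clean for the next block.  */

static int reg_deaths_toy[NREGS];  /* 0 = no death currently recorded */
static int live_length[NREGS];

static void
note_death (int regno, int insn_num)
{
  reg_deaths_toy[regno] = insn_num;
}

static void
note_birth (int regno, int insn_num)
{
  if (reg_deaths_toy[regno])
    {
      live_length[regno] += insn_num - reg_deaths_toy[regno];
      reg_deaths_toy[regno] = 0;  /* keep the array clear for reuse */
    }
}
```

For example, a register last used at backward position 1 and set at position 4 accumulates a live range of 3 instructions, and its slot is cleared afterwards, mirroring why no per-block memset is needed.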

/* Forward declarations.  */
static int verify_wide_reg_1 (rtx *, void *);
static void verify_wide_reg (int, basic_block);
static void verify_local_live_at_start (regset, basic_block);
static void notice_stack_pointer_modification_1 (rtx, rtx, void *);
static void notice_stack_pointer_modification (void);
static void mark_reg (rtx, void *);
static void mark_regs_live_at_end (regset);
static void calculate_global_regs_live (sbitmap, sbitmap, int);
static void propagate_block_delete_insn (rtx);
static rtx propagate_block_delete_libcall (rtx, rtx);
static int insn_dead_p (struct propagate_block_info *, rtx, int, rtx);
static int libcall_dead_p (struct propagate_block_info *, rtx, rtx);
static void mark_set_regs (struct propagate_block_info *, rtx, rtx);
static void mark_set_1 (struct propagate_block_info *, enum rtx_code, rtx,
                        rtx, rtx, int);
static int find_regno_partial (rtx *, void *);

#ifdef HAVE_conditional_execution
static int mark_regno_cond_dead (struct propagate_block_info *, int, rtx);
static void free_reg_cond_life_info (splay_tree_value);
static int flush_reg_cond_reg_1 (splay_tree_node, void *);
static void flush_reg_cond_reg (struct propagate_block_info *, int);
static rtx elim_reg_cond (rtx, unsigned int);
static rtx ior_reg_cond (rtx, rtx, int);
static rtx not_reg_cond (rtx);
static rtx and_reg_cond (rtx, rtx, int);
#endif
#ifdef AUTO_INC_DEC
static void attempt_auto_inc (struct propagate_block_info *, rtx, rtx, rtx,
                              rtx, rtx);
static void find_auto_inc (struct propagate_block_info *, rtx, rtx);
static int try_pre_increment_1 (struct propagate_block_info *, rtx);
static int try_pre_increment (rtx, rtx, HOST_WIDE_INT);
#endif
static void mark_used_reg (struct propagate_block_info *, rtx, rtx, rtx);
static void mark_used_regs (struct propagate_block_info *, rtx, rtx, rtx);
void debug_flow_info (void);
static void add_to_mem_set_list (struct propagate_block_info *, rtx);
static int invalidate_mems_from_autoinc (rtx *, void *);
static void invalidate_mems_from_set (struct propagate_block_info *, rtx);
static void clear_log_links (sbitmap);
static int count_or_remove_death_notes_bb (basic_block, int);
static void allocate_bb_life_data (void);

/* Return the INSN immediately following the NOTE_INSN_BASIC_BLOCK
   note associated with the BLOCK.  */

rtx
first_insn_after_basic_block_note (basic_block block)
{
  rtx insn;

  /* Get the first instruction in the block.  */
  insn = BB_HEAD (block);

  if (insn == NULL_RTX)
    return NULL_RTX;
  if (LABEL_P (insn))
    insn = NEXT_INSN (insn);
  gcc_assert (NOTE_INSN_BASIC_BLOCK_P (insn));

  return NEXT_INSN (insn);
}

/* Perform data flow analysis for the whole control flow graph.
   FLAGS is a set of PROP_* flags to be used in accumulating flow info.  */

void
life_analysis (FILE *file, int flags)
{
#ifdef ELIMINABLE_REGS
  int i;
  static const struct {const int from, to; } eliminables[] = ELIMINABLE_REGS;
#endif

  /* Record which registers will be eliminated.  We use this in
     mark_used_regs.  */

  CLEAR_HARD_REG_SET (elim_reg_set);

#ifdef ELIMINABLE_REGS
  for (i = 0; i < (int) ARRAY_SIZE (eliminables); i++)
    SET_HARD_REG_BIT (elim_reg_set, eliminables[i].from);
#else
  SET_HARD_REG_BIT (elim_reg_set, FRAME_POINTER_REGNUM);
#endif


#ifdef CANNOT_CHANGE_MODE_CLASS
  if (flags & PROP_REG_INFO)
    init_subregs_of_mode ();
#endif

  if (! optimize)
    flags &= ~(PROP_LOG_LINKS | PROP_AUTOINC | PROP_ALLOW_CFG_CHANGES);

  /* The post-reload life analysis must have (on a global basis) the same
     registers live as were computed by reload itself; otherwise elimination
     offsets and such may be incorrect.

     Reload will mark some registers as live even though they do not
     appear in the rtl.

     We don't want to create new auto-incs after reload, since they
     are unlikely to be useful and can cause problems with shared
     stack slots.  */
  if (reload_completed)
    flags &= ~(PROP_REG_INFO | PROP_AUTOINC);

  /* We want alias analysis information for local dead store elimination.  */
  if (optimize && (flags & PROP_SCAN_DEAD_STORES))
    init_alias_analysis ();

  /* Always remove no-op moves.  Do this before other processing so
     that we don't have to keep re-scanning them.  */
  delete_noop_moves ();

  /* Some targets can emit simpler epilogues if they know that sp was
     not ever modified during the function.  After reload, of course,
     we've already emitted the epilogue so there's no sense searching.  */
  if (! reload_completed)
    notice_stack_pointer_modification ();

  /* Allocate and zero out data structures that will record the
     data from lifetime analysis.  */
  allocate_reg_life_data ();
  allocate_bb_life_data ();

  /* Find the set of registers live on function exit.  */
  mark_regs_live_at_end (EXIT_BLOCK_PTR->il.rtl->global_live_at_start);

  /* "Update" life info from zero.  It'd be nice to begin the
     relaxation with just the exit and noreturn blocks, but that set
     is not immediately handy.  */

  if (flags & PROP_REG_INFO)
    {
      memset (regs_ever_live, 0, sizeof (regs_ever_live));
      memset (regs_asm_clobbered, 0, sizeof (regs_asm_clobbered));
    }
  update_life_info (NULL, UPDATE_LIFE_GLOBAL, flags);
  if (reg_deaths)
    {
      free (reg_deaths);
      reg_deaths = NULL;
    }

  /* Clean up.  */
  if (optimize && (flags & PROP_SCAN_DEAD_STORES))
    end_alias_analysis ();

  if (file)
    dump_flow_info (file);

  /* Removing dead insns should have made jumptables really dead.  */
  delete_dead_jumptables ();
}

/* A subroutine of verify_wide_reg, called through for_each_rtx.
   Search for REGNO.  If found, return 2 if it is not wider than
   word_mode, else return 1.  */

static int
verify_wide_reg_1 (rtx *px, void *pregno)
{
  rtx x = *px;
  unsigned int regno = *(int *) pregno;

  if (REG_P (x) && REGNO (x) == regno)
    {
      if (GET_MODE_BITSIZE (GET_MODE (x)) <= BITS_PER_WORD)
        return 2;
      return 1;
    }
  return 0;
}

/* A subroutine of verify_local_live_at_start.  Search through insns
   of BB looking for register REGNO.  */

static void
verify_wide_reg (int regno, basic_block bb)
{
  rtx head = BB_HEAD (bb), end = BB_END (bb);

  while (1)
    {
      if (INSN_P (head))
        {
          int r = for_each_rtx (&PATTERN (head), verify_wide_reg_1, &regno);
          if (r == 1)
            return;
          if (r == 2)
            break;
        }
      if (head == end)
        break;
      head = NEXT_INSN (head);
    }
  if (dump_file)
    {
      fprintf (dump_file, "Register %d died unexpectedly.\n", regno);
      dump_bb (bb, dump_file, 0);
    }
  fatal_error ("internal consistency failure");
}

/* A subroutine of update_life_info.  Verify that there are no untoward
   changes in live_at_start during a local update.  */

static void
verify_local_live_at_start (regset new_live_at_start, basic_block bb)
{
  if (reload_completed)
    {
      /* After reload, there are no pseudos, nor subregs of multi-word
         registers.  The regsets should exactly match.  */
      if (! REG_SET_EQUAL_P (new_live_at_start,
                             bb->il.rtl->global_live_at_start))
        {
          if (dump_file)
            {
              fprintf (dump_file,
                       "live_at_start mismatch in bb %d, aborting\nNew:\n",
                       bb->index);
              debug_bitmap_file (dump_file, new_live_at_start);
              fputs ("Old:\n", dump_file);
              dump_bb (bb, dump_file, 0);
            }
          fatal_error ("internal consistency failure");
        }
    }
  else
    {
      unsigned i;
      reg_set_iterator rsi;

      /* Find the set of changed registers.  */
      XOR_REG_SET (new_live_at_start, bb->il.rtl->global_live_at_start);

      EXECUTE_IF_SET_IN_REG_SET (new_live_at_start, 0, i, rsi)
        {
          /* No registers should die.  */
          if (REGNO_REG_SET_P (bb->il.rtl->global_live_at_start, i))
            {
              if (dump_file)
                {
                  fprintf (dump_file,
                           "Register %d died unexpectedly.\n", i);
                  dump_bb (bb, dump_file, 0);
                }
              fatal_error ("internal consistency failure");
            }
          /* Verify that the now-live register is wider than word_mode.  */
          verify_wide_reg (i, bb);
        }
    }
}
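The pre-reload branch above hinges on XOR_REG_SET computing a symmetric difference: after the XOR, every set bit is a register whose liveness changed, and a changed bit that was set in the old set means a register died.  A minimal illustration with plain masks (the function `count_died` is hypothetical, invented for this sketch):

```c
#include <assert.h>

/* Return how many registers were live in old_live but not in new_live,
   using the same XOR trick as verify_local_live_at_start: the symmetric
   difference marks every changed register, and intersecting the changes
   with old_live picks out the deaths.  */
static int
count_died (unsigned new_live, unsigned old_live)
{
  unsigned changed = new_live ^ old_live;
  int died = 0;
  for (int r = 0; changed >> r; r++)
    if ((changed >> r & 1) && (old_live >> r & 1))
      died++;
  return died;
}
```

With old = {r0, r1} and new = {r1, r2}, the XOR flags r0 and r2; r0 is a death (present in old), while r2 merely went live, which is the case the real code forwards to verify_wide_reg.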

/* Updates life information starting with the basic blocks set in BLOCKS.
   If BLOCKS is null, consider it to be the universal set.

   If EXTENT is UPDATE_LIFE_LOCAL, such as after splitting or peepholing,
   we are only expecting local modifications to basic blocks.  If we find
   extra registers live at the beginning of a block, then we either killed
   useful data, or we have a broken split that wants data not provided.
   If we find registers removed from live_at_start, that means we have
   a broken peephole that is killing a register it shouldn't.

   ??? This is not true in one situation -- when a pre-reload splitter
   generates subregs of a multi-word pseudo, current life analysis will
   lose the kill.  So we _can_ have a pseudo go live.  How irritating.

   It is also not true when a peephole decides that it doesn't need one
   or more of the inputs.

   Including PROP_REG_INFO does not properly refresh regs_ever_live
   unless the caller resets it to zero.  */

int
update_life_info (sbitmap blocks, enum update_life_extent extent,
                  int prop_flags)
{
  regset tmp;
  unsigned i = 0;
  int stabilized_prop_flags = prop_flags;
  basic_block bb;

  tmp = ALLOC_REG_SET (&reg_obstack);
  ndead = 0;

  if ((prop_flags & PROP_REG_INFO) && !reg_deaths)
    reg_deaths = xcalloc (sizeof (*reg_deaths), max_regno);

  timevar_push ((extent == UPDATE_LIFE_LOCAL || blocks)
                ? TV_LIFE_UPDATE : TV_LIFE);

  /* Changes to the CFG are only allowed when
     doing a global update for the entire CFG.  */
  gcc_assert (!(prop_flags & PROP_ALLOW_CFG_CHANGES)
              || (extent != UPDATE_LIFE_LOCAL && !blocks));

  /* For a global update, we go through the relaxation process again.  */
  if (extent != UPDATE_LIFE_LOCAL)
    {
      for ( ; ; )
        {
          int changed = 0;

          calculate_global_regs_live (blocks, blocks,
                                prop_flags & (PROP_SCAN_DEAD_CODE
                                              | PROP_SCAN_DEAD_STORES
                                              | PROP_ALLOW_CFG_CHANGES));

          if ((prop_flags & (PROP_KILL_DEAD_CODE | PROP_ALLOW_CFG_CHANGES))
              != (PROP_KILL_DEAD_CODE | PROP_ALLOW_CFG_CHANGES))
            break;

          /* Removing dead code may allow the CFG to be simplified which
             in turn may allow for further dead code detection / removal.  */
          FOR_EACH_BB_REVERSE (bb)
            {
              COPY_REG_SET (tmp, bb->il.rtl->global_live_at_end);
              changed |= propagate_block (bb, tmp, NULL, NULL,
                                prop_flags & (PROP_SCAN_DEAD_CODE
                                              | PROP_SCAN_DEAD_STORES
                                              | PROP_KILL_DEAD_CODE));
            }

          /* Don't pass PROP_SCAN_DEAD_CODE or PROP_KILL_DEAD_CODE to
             subsequent propagate_block calls, since removing or acting as
             removing dead code can affect global register liveness, which
             is supposed to be finalized for this call after this loop.  */
          stabilized_prop_flags
            &= ~(PROP_SCAN_DEAD_CODE | PROP_SCAN_DEAD_STORES
                 | PROP_KILL_DEAD_CODE);

          if (! changed)
            break;

          /* We repeat regardless of what cleanup_cfg says.  If there were
             instructions deleted above, that might have been only a
             partial improvement (see PARAM_MAX_FLOW_MEMORY_LOCATIONS usage).
             Further improvement may be possible.  */
          cleanup_cfg (CLEANUP_EXPENSIVE);

          /* Zap the life information from the last round.  If we don't
             do this, we can wind up with registers that no longer appear
             in the code being marked live at entry.  */
          FOR_EACH_BB (bb)
            {
              CLEAR_REG_SET (bb->il.rtl->global_live_at_start);
              CLEAR_REG_SET (bb->il.rtl->global_live_at_end);
            }
        }

      /* If asked, remove notes from the blocks we'll update.  */
      if (extent == UPDATE_LIFE_GLOBAL_RM_NOTES)
        count_or_remove_death_notes (blocks, 1);
    }

  /* Clear log links in case we are asked to (re)compute them.  */
  if (prop_flags & PROP_LOG_LINKS)
    clear_log_links (blocks);

  if (blocks)
    {
      sbitmap_iterator sbi;

      EXECUTE_IF_SET_IN_SBITMAP (blocks, 0, i, sbi)
        {
          bb = BASIC_BLOCK (i);

          COPY_REG_SET (tmp, bb->il.rtl->global_live_at_end);
          propagate_block (bb, tmp, NULL, NULL, stabilized_prop_flags);

          if (extent == UPDATE_LIFE_LOCAL)
            verify_local_live_at_start (tmp, bb);
        };
    }
  else
    {
      FOR_EACH_BB_REVERSE (bb)
        {
          COPY_REG_SET (tmp, bb->il.rtl->global_live_at_end);

          propagate_block (bb, tmp, NULL, NULL, stabilized_prop_flags);

          if (extent == UPDATE_LIFE_LOCAL)
            verify_local_live_at_start (tmp, bb);
        }
    }

  FREE_REG_SET (tmp);

  if (prop_flags & PROP_REG_INFO)
    {
      reg_set_iterator rsi;

      /* The only pseudos that are live at the beginning of the function
         are those that were not set anywhere in the function.  local-alloc
         doesn't know how to handle these correctly, so mark them as not
         local to any one basic block.  */
      EXECUTE_IF_SET_IN_REG_SET (ENTRY_BLOCK_PTR->il.rtl->global_live_at_end,
                                 FIRST_PSEUDO_REGISTER, i, rsi)
        REG_BASIC_BLOCK (i) = REG_BLOCK_GLOBAL;

      /* We have a problem with any pseudoreg that lives across the setjmp.
         ANSI says that if a user variable does not change in value between
         the setjmp and the longjmp, then the longjmp preserves it.  This
         includes longjmp from a place where the pseudo appears dead.
         (In principle, the value still exists if it is in scope.)
         If the pseudo goes in a hard reg, some other value may occupy
         that hard reg where this pseudo is dead, thus clobbering the pseudo.
         Conclusion: such a pseudo must not go in a hard reg.  */
      EXECUTE_IF_SET_IN_REG_SET (regs_live_at_setjmp,
                                 FIRST_PSEUDO_REGISTER, i, rsi)
        {
          if (regno_reg_rtx[i] != 0)
            {
              REG_LIVE_LENGTH (i) = -1;
              REG_BASIC_BLOCK (i) = REG_BLOCK_UNKNOWN;
            }
        }
    }
  if (reg_deaths)
    {
      free (reg_deaths);
      reg_deaths = NULL;
    }
  timevar_pop ((extent == UPDATE_LIFE_LOCAL || blocks)
               ? TV_LIFE_UPDATE : TV_LIFE);
  if (ndead && dump_file)
    fprintf (dump_file, "deleted %i dead insns\n", ndead);
  return ndead;
}
724
 
725
/* Update life information in all blocks where BB_DIRTY is set.  */
726
 
727
int
728
update_life_info_in_dirty_blocks (enum update_life_extent extent, int prop_flags)
729
{
730
  sbitmap update_life_blocks = sbitmap_alloc (last_basic_block);
731
  int n = 0;
732
  basic_block bb;
733
  int retval = 0;
734
 
735
  sbitmap_zero (update_life_blocks);
736
  FOR_EACH_BB (bb)
737
    {
738
      if (bb->flags & BB_DIRTY)
739
        {
740
          SET_BIT (update_life_blocks, bb->index);
741
          n++;
742
        }
743
    }
744
 
745
  if (n)
746
    retval = update_life_info (update_life_blocks, extent, prop_flags);
747
 
748
  sbitmap_free (update_life_blocks);
749
  return retval;
750
}
751
 
752
/* Free the variables allocated by find_basic_blocks.  */
753
 
754
void
755
free_basic_block_vars (void)
756
{
757
  if (basic_block_info)
758
    {
759
      clear_edges ();
760
      basic_block_info = NULL;
761
    }
762
  n_basic_blocks = 0;
763
  last_basic_block = 0;
764
  n_edges = 0;
765
 
766
  label_to_block_map = NULL;
767
 
768
  ENTRY_BLOCK_PTR->aux = NULL;
769
  ENTRY_BLOCK_PTR->il.rtl->global_live_at_end = NULL;
770
  EXIT_BLOCK_PTR->aux = NULL;
771
  EXIT_BLOCK_PTR->il.rtl->global_live_at_start = NULL;
772
}
773
 
774
/* Delete any insns that copy a register to itself.  */
775
 
776
int
777
delete_noop_moves (void)
778
{
779
  rtx insn, next;
780
  basic_block bb;
781
  int nnoops = 0;
782
 
783
  FOR_EACH_BB (bb)
784
    {
785
      for (insn = BB_HEAD (bb); insn != NEXT_INSN (BB_END (bb)); insn = next)
786
        {
787
          next = NEXT_INSN (insn);
788
          if (INSN_P (insn) && noop_move_p (insn))
789
            {
790
              rtx note;
791
 
792
              /* If we're about to remove the first insn of a libcall
793
                 then move the libcall note to the next real insn and
794
                 update the retval note.  */
795
              if ((note = find_reg_note (insn, REG_LIBCALL, NULL_RTX))
796
                       && XEXP (note, 0) != insn)
797
                {
798
                  rtx new_libcall_insn = next_real_insn (insn);
799
                  rtx retval_note = find_reg_note (XEXP (note, 0),
800
                                                   REG_RETVAL, NULL_RTX);
801
                  REG_NOTES (new_libcall_insn)
802
                    = gen_rtx_INSN_LIST (REG_LIBCALL, XEXP (note, 0),
803
                                         REG_NOTES (new_libcall_insn));
804
                  XEXP (retval_note, 0) = new_libcall_insn;
805
                }
806
 
807
              delete_insn_and_edges (insn);
808
              nnoops++;
809
            }
810
        }
811
    }
812
  if (nnoops && dump_file)
813
    fprintf (dump_file, "deleted %i noop moves", nnoops);
814
  return nnoops;
815
}

/* Delete any jump tables never referenced.  We can't delete them at the
   time of removing the tablejump insn, as they are referenced by the
   preceding insns computing the destination, so we delay deleting and
   garbage-collect them once life information is computed.  */
void
delete_dead_jumptables (void)
{
  basic_block bb;

  /* A dead jump table does not belong to any basic block.  Scan insns
     between two adjacent basic blocks.  */
  FOR_EACH_BB (bb)
    {
      rtx insn, next;

      for (insn = NEXT_INSN (BB_END (bb));
           insn && !NOTE_INSN_BASIC_BLOCK_P (insn);
           insn = next)
        {
          next = NEXT_INSN (insn);
          if (LABEL_P (insn)
              && LABEL_NUSES (insn) == LABEL_PRESERVE_P (insn)
              && JUMP_P (next)
              && (GET_CODE (PATTERN (next)) == ADDR_VEC
                  || GET_CODE (PATTERN (next)) == ADDR_DIFF_VEC))
            {
              rtx label = insn, jump = next;

              if (dump_file)
                fprintf (dump_file, "Dead jumptable %i removed\n",
                         INSN_UID (insn));

              next = NEXT_INSN (next);
              delete_insn (jump);
              delete_insn (label);
            }
        }
    }
}

/* Determine if the stack pointer is constant over the life of the function.
   Only useful before prologues have been emitted.  */

static void
notice_stack_pointer_modification_1 (rtx x, rtx pat ATTRIBUTE_UNUSED,
                                     void *data ATTRIBUTE_UNUSED)
{
  if (x == stack_pointer_rtx
      /* The stack pointer is only modified indirectly as the result
         of a push until later in flow.  See the comments in rtl.texi
         regarding Embedded Side-Effects on Addresses.  */
      || (MEM_P (x)
          && GET_RTX_CLASS (GET_CODE (XEXP (x, 0))) == RTX_AUTOINC
          && XEXP (XEXP (x, 0), 0) == stack_pointer_rtx))
    current_function_sp_is_unchanging = 0;
}

static void
notice_stack_pointer_modification (void)
{
  basic_block bb;
  rtx insn;

  /* Assume that the stack pointer is unchanging if alloca hasn't
     been used.  */
  current_function_sp_is_unchanging = !current_function_calls_alloca;
  if (! current_function_sp_is_unchanging)
    return;

  FOR_EACH_BB (bb)
    FOR_BB_INSNS (bb, insn)
      {
        if (INSN_P (insn))
          {
            /* Check if insn modifies the stack pointer.  */
            note_stores (PATTERN (insn),
                         notice_stack_pointer_modification_1,
                         NULL);
            if (! current_function_sp_is_unchanging)
              return;
          }
      }
}

/* Mark a register in SET.  Hard registers in large modes get all
   of their component registers set as well.  */

static void
mark_reg (rtx reg, void *xset)
{
  regset set = (regset) xset;
  int regno = REGNO (reg);

  gcc_assert (GET_MODE (reg) != BLKmode);

  SET_REGNO_REG_SET (set, regno);
  if (regno < FIRST_PSEUDO_REGISTER)
    {
      int n = hard_regno_nregs[regno][GET_MODE (reg)];
      while (--n > 0)
        SET_REGNO_REG_SET (set, regno + n);
    }
}

/* Mark those regs which are needed at the end of the function as live
   at the end of the last basic block.  */

static void
mark_regs_live_at_end (regset set)
{
  unsigned int i;

  /* If exiting needs the right stack value, consider the stack pointer
     live at the end of the function.  */
  if ((HAVE_epilogue && epilogue_completed)
      || ! EXIT_IGNORE_STACK
      || (! FRAME_POINTER_REQUIRED
          && ! current_function_calls_alloca
          && flag_omit_frame_pointer)
      || current_function_sp_is_unchanging)
    {
      SET_REGNO_REG_SET (set, STACK_POINTER_REGNUM);
    }

  /* Mark the frame pointer if needed at the end of the function.  If
     we end up eliminating it, it will be removed from the live list
     of each basic block by reload.  */

  if (! reload_completed || frame_pointer_needed)
    {
      SET_REGNO_REG_SET (set, FRAME_POINTER_REGNUM);
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
      /* If they are different, also mark the hard frame pointer as live.  */
      if (! LOCAL_REGNO (HARD_FRAME_POINTER_REGNUM))
        SET_REGNO_REG_SET (set, HARD_FRAME_POINTER_REGNUM);
#endif
    }

#ifndef PIC_OFFSET_TABLE_REG_CALL_CLOBBERED
  /* Many architectures have a GP register even without flag_pic.
     Assume the pic register is not in use, or will be handled by
     other means, if it is not fixed.  */
  if ((unsigned) PIC_OFFSET_TABLE_REGNUM != INVALID_REGNUM
      && fixed_regs[PIC_OFFSET_TABLE_REGNUM])
    SET_REGNO_REG_SET (set, PIC_OFFSET_TABLE_REGNUM);
#endif

  /* Mark all global registers, and all registers used by the epilogue
     as being live at the end of the function since they may be
     referenced by our caller.  */
  for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
    if (global_regs[i] || EPILOGUE_USES (i))
      SET_REGNO_REG_SET (set, i);

  if (HAVE_epilogue && epilogue_completed)
    {
      /* Mark all call-saved registers that we actually used.  */
      for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
        if (regs_ever_live[i] && ! LOCAL_REGNO (i)
            && ! TEST_HARD_REG_BIT (regs_invalidated_by_call, i))
          SET_REGNO_REG_SET (set, i);
    }

#ifdef EH_RETURN_DATA_REGNO
  /* Mark the registers that will contain data for the handler.  */
  if (reload_completed && current_function_calls_eh_return)
    for (i = 0; ; ++i)
      {
        unsigned regno = EH_RETURN_DATA_REGNO (i);
        if (regno == INVALID_REGNUM)
          break;
        SET_REGNO_REG_SET (set, regno);
      }
#endif
#ifdef EH_RETURN_STACKADJ_RTX
  if ((! HAVE_epilogue || ! epilogue_completed)
      && current_function_calls_eh_return)
    {
      rtx tmp = EH_RETURN_STACKADJ_RTX;
      if (tmp && REG_P (tmp))
        mark_reg (tmp, set);
    }
#endif
#ifdef EH_RETURN_HANDLER_RTX
  if ((! HAVE_epilogue || ! epilogue_completed)
      && current_function_calls_eh_return)
    {
      rtx tmp = EH_RETURN_HANDLER_RTX;
      if (tmp && REG_P (tmp))
        mark_reg (tmp, set);
    }
#endif

  /* Mark function return value.  */
  diddle_return_value (mark_reg, set);
}

/* Propagate global life info around the graph of basic blocks.  Begin
   considering blocks with their corresponding bit set in BLOCKS_IN.
   If BLOCKS_IN is null, consider it the universal set.

   BLOCKS_OUT is set for every block that was changed.  */

static void
calculate_global_regs_live (sbitmap blocks_in, sbitmap blocks_out, int flags)
{
  basic_block *queue, *qhead, *qtail, *qend, bb;
  regset tmp, new_live_at_end, invalidated_by_call;
  regset registers_made_dead;
  bool failure_strategy_required = false;
  int *block_accesses;

  /* The registers that are modified within this block.  */
  regset *local_sets;

  /* The registers that are conditionally modified within this block.
     In other words, regs that are set only as part of a COND_EXEC.  */
  regset *cond_local_sets;

  unsigned int i;

  /* Some passes used to forget to clear the aux field of basic blocks,
     causing sick behavior here.  */
#ifdef ENABLE_CHECKING
  FOR_BB_BETWEEN (bb, ENTRY_BLOCK_PTR, NULL, next_bb)
    gcc_assert (!bb->aux);
#endif

  tmp = ALLOC_REG_SET (&reg_obstack);
  new_live_at_end = ALLOC_REG_SET (&reg_obstack);
  invalidated_by_call = ALLOC_REG_SET (&reg_obstack);
  registers_made_dead = ALLOC_REG_SET (&reg_obstack);

  /* Inconveniently, this is only readily available in hard reg set form.  */
  for (i = 0; i < FIRST_PSEUDO_REGISTER; ++i)
    if (TEST_HARD_REG_BIT (regs_invalidated_by_call, i))
      SET_REGNO_REG_SET (invalidated_by_call, i);

  /* Allocate space for the sets of local properties.  */
  local_sets = xcalloc (last_basic_block - (INVALID_BLOCK + 1),
                        sizeof (regset));
  cond_local_sets = xcalloc (last_basic_block - (INVALID_BLOCK + 1),
                             sizeof (regset));

  /* Create a worklist.  Allocate an extra slot for ENTRY_BLOCK, and one
     because the `head == tail' style test for an empty queue doesn't
     work with a full queue.  */
  queue = xmalloc ((n_basic_blocks - (INVALID_BLOCK + 1)) * sizeof (*queue));
  qtail = queue;
  qhead = qend = queue + n_basic_blocks - (INVALID_BLOCK + 1);

  /* Queue the blocks set in the initial mask.  Do this in reverse block
     number order so that the first round is more likely to do useful
     work.  We use AUX non-null to flag that the block is queued.  */
  if (blocks_in)
    {
      FOR_EACH_BB (bb)
        if (TEST_BIT (blocks_in, bb->index))
          {
            *--qhead = bb;
            bb->aux = bb;
          }
    }
  else
    {
      FOR_EACH_BB (bb)
        {
          *--qhead = bb;
          bb->aux = bb;
        }
    }

  block_accesses = xcalloc (last_basic_block, sizeof (int));

  /* We clean aux when we remove the initially-enqueued bbs, but we
     don't enqueue ENTRY and EXIT initially, so clean them upfront and
     unconditionally.  */
  ENTRY_BLOCK_PTR->aux = EXIT_BLOCK_PTR->aux = NULL;

  if (blocks_out)
    sbitmap_zero (blocks_out);

  /* We work through the queue until there are no more blocks.  What
     is live at the end of this block is precisely the union of what
     is live at the beginning of all its successors.  So, we set its
     GLOBAL_LIVE_AT_END field based on the GLOBAL_LIVE_AT_START field
     for its successors.  Then, we compute GLOBAL_LIVE_AT_START for
     this block by walking through the instructions in this block in
     reverse order and updating as we go.  If that changed
     GLOBAL_LIVE_AT_START, we add the predecessors of the block to the
     queue; they will now need to recalculate GLOBAL_LIVE_AT_END.

     We are guaranteed to terminate, because GLOBAL_LIVE_AT_START
     never shrinks.  If a register appears in GLOBAL_LIVE_AT_START, it
     must either be live at the end of the block, or used within the
     block.  In the latter case, it will certainly never disappear
     from GLOBAL_LIVE_AT_START.  In the former case, the register
     could go away only if it disappeared from GLOBAL_LIVE_AT_START
     for one of the successor blocks.  By induction, that cannot
     occur.

     ??? This reasoning doesn't work if we start from non-empty initial
     GLOBAL_LIVE_AT_START sets.  And there are actually two problems:
       1) Updating may not terminate (endless oscillation).
       2) Even if it does (and it usually does), the resulting information
          may be inaccurate.  Consider for example the following case:

          a = ...;
          while (...) {...}  -- 'a' not mentioned at all
          ... = a;

          If the use of 'a' is deleted between two calculations of liveness
          information and the initial sets are not cleared, the information
          about a's liveness will get stuck inside the loop and the set will
          appear not to be dead.

     We do not attempt to solve 2) -- the information is conservatively
     correct (i.e. we never claim that something live is dead) and the
     amount of optimization opportunities missed due to this problem is
     not significant.

     1) is more serious.  In order to fix it, we monitor the number of times
     each block is processed.  Once one of the blocks has been processed more
     times than the maximum number of rounds, we use the following strategy:
     When a register disappears from one of the sets, we add it to a MADE_DEAD
     set, remove all registers in this set from all GLOBAL_LIVE_AT_* sets and
     add the blocks with changed sets into the queue.  Thus we are guaranteed
     to terminate (the worst case corresponds to all registers in MADE_DEAD,
     in which case the original reasoning above is valid), but in general we
     only fix up a few offending registers.

     The maximum number of rounds for computing liveness is the largest of
     MAX_LIVENESS_ROUNDS and the latest loop depth count for this function.  */

  while (qhead != qtail)
    {
      int rescan, changed;
      basic_block bb;
      edge e;
      edge_iterator ei;

      bb = *qhead++;
      if (qhead == qend)
        qhead = queue;
      bb->aux = NULL;

      /* Should we start using the failure strategy?  */
      if (bb != ENTRY_BLOCK_PTR)
        {
          int max_liveness_rounds =
            MAX (MAX_LIVENESS_ROUNDS, cfun->max_loop_depth);

          block_accesses[bb->index]++;
          if (block_accesses[bb->index] > max_liveness_rounds)
            failure_strategy_required = true;
        }

      /* Begin by propagating live_at_start from the successor blocks.  */
      CLEAR_REG_SET (new_live_at_end);

      if (EDGE_COUNT (bb->succs) > 0)
        FOR_EACH_EDGE (e, ei, bb->succs)
          {
            basic_block sb = e->dest;

            /* Call-clobbered registers die across exception and
               call edges.  */
            /* ??? Abnormal call edges ignored for the moment, as this gets
               confused by sibling call edges, which crashes reg-stack.  */
            if (e->flags & EDGE_EH)
              bitmap_ior_and_compl_into (new_live_at_end,
                                         sb->il.rtl->global_live_at_start,
                                         invalidated_by_call);
            else
              IOR_REG_SET (new_live_at_end, sb->il.rtl->global_live_at_start);

            /* If a target saves one register in another (instead of on
               the stack) the save register will need to be live for EH.  */
            if (e->flags & EDGE_EH)
              for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
                if (EH_USES (i))
                  SET_REGNO_REG_SET (new_live_at_end, i);
          }
      else
        {
          /* This might be a noreturn function that throws.  And
             even if it isn't, getting the unwind info right helps
             debugging.  */
          for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
            if (EH_USES (i))
              SET_REGNO_REG_SET (new_live_at_end, i);
        }

      /* The all-important stack pointer must always be live.  */
      SET_REGNO_REG_SET (new_live_at_end, STACK_POINTER_REGNUM);

      /* Before reload, there are a few registers that must be forced
         live everywhere -- which might not already be the case for
         blocks within infinite loops.  */
      if (! reload_completed)
        {
          /* Any reference to any pseudo before reload is a potential
             reference of the frame pointer.  */
          SET_REGNO_REG_SET (new_live_at_end, FRAME_POINTER_REGNUM);

#if FRAME_POINTER_REGNUM != ARG_POINTER_REGNUM
          /* Pseudos with argument area equivalences may require
             reloading via the argument pointer.  */
          if (fixed_regs[ARG_POINTER_REGNUM])
            SET_REGNO_REG_SET (new_live_at_end, ARG_POINTER_REGNUM);
#endif

          /* Any constant, or pseudo with constant equivalences, may
             require reloading from memory using the pic register.  */
          if ((unsigned) PIC_OFFSET_TABLE_REGNUM != INVALID_REGNUM
              && fixed_regs[PIC_OFFSET_TABLE_REGNUM])
            SET_REGNO_REG_SET (new_live_at_end, PIC_OFFSET_TABLE_REGNUM);
        }

      if (bb == ENTRY_BLOCK_PTR)
        {
          COPY_REG_SET (bb->il.rtl->global_live_at_end, new_live_at_end);
          continue;
        }

      /* On our first pass through this block, we'll go ahead and continue.
         Recognize first pass by checking if local_set is NULL for this
         basic block.  On subsequent passes, we get to skip out early if
         live_at_end wouldn't have changed.  */

      if (local_sets[bb->index - (INVALID_BLOCK + 1)] == NULL)
        {
          local_sets[bb->index - (INVALID_BLOCK + 1)]
            = ALLOC_REG_SET (&reg_obstack);
          cond_local_sets[bb->index - (INVALID_BLOCK + 1)]
            = ALLOC_REG_SET (&reg_obstack);
          rescan = 1;
        }
      else
        {
          /* If any bits were removed from live_at_end, we'll have to
             rescan the block.  This wouldn't be necessary if we had
             precalculated local_live, however with PROP_SCAN_DEAD_CODE
             local_live is really dependent on live_at_end.  */
          rescan = bitmap_intersect_compl_p (bb->il.rtl->global_live_at_end,
                                             new_live_at_end);

          if (!rescan)
            {
              regset cond_local_set;

              /* If any of the registers in the new live_at_end set are
                 conditionally set in this basic block, we must rescan.
                 This is because conditional lifetimes at the end of the
                 block do not just take the live_at_end set into
                 account, but also the liveness at the start of each
                 successor block.  We can miss changes in those sets if
                 we only compare the new live_at_end against the
                 previous one.  */
              cond_local_set = cond_local_sets[bb->index - (INVALID_BLOCK + 1)];
              rescan = bitmap_intersect_p (new_live_at_end, cond_local_set);
            }

          if (!rescan)
            {
              regset local_set;

              /* Find the set of changed bits.  Take this opportunity
                 to notice that this set is empty and early out.  */
              bitmap_xor (tmp, bb->il.rtl->global_live_at_end, new_live_at_end);
              if (bitmap_empty_p (tmp))
                continue;

              /* If any of the changed bits overlap with local_sets[bb],
                 we'll have to rescan the block.  */
              local_set = local_sets[bb->index - (INVALID_BLOCK + 1)];
              rescan = bitmap_intersect_p (tmp, local_set);
            }
        }

      /* Let our caller know that BB changed enough to require its
         death notes updated.  */
      if (blocks_out)
        SET_BIT (blocks_out, bb->index);

      if (! rescan)
        {
          /* Add to live_at_start the set of all registers in
             new_live_at_end that aren't in the old live_at_end.  */

          changed = bitmap_ior_and_compl_into (bb->il.rtl->global_live_at_start,
                                               new_live_at_end,
                                               bb->il.rtl->global_live_at_end);
          COPY_REG_SET (bb->il.rtl->global_live_at_end, new_live_at_end);
          if (! changed)
            continue;
        }
      else
        {
          COPY_REG_SET (bb->il.rtl->global_live_at_end, new_live_at_end);

          /* Rescan the block insn by insn to turn (a copy of) live_at_end
             into live_at_start.  */
          propagate_block (bb, new_live_at_end,
                           local_sets[bb->index - (INVALID_BLOCK + 1)],
                           cond_local_sets[bb->index - (INVALID_BLOCK + 1)],
                           flags);

          /* If live_at_start didn't change, no need to go farther.  */
          if (REG_SET_EQUAL_P (bb->il.rtl->global_live_at_start,
                               new_live_at_end))
            continue;

          if (failure_strategy_required)
            {
              /* Get the list of registers that were removed from the
                 bb->global_live_at_start set.  */
              bitmap_and_compl (tmp, bb->il.rtl->global_live_at_start,
                                new_live_at_end);
              if (!bitmap_empty_p (tmp))
                {
                  bool pbb_changed;
                  basic_block pbb;

                  /* It should not happen that one of the registers we
                     removed last time disappears again before any other
                     register does.  */
                  pbb_changed = bitmap_ior_into (registers_made_dead, tmp);
                  gcc_assert (pbb_changed);

                  /* Now remove the registers from all sets.  */
                  FOR_EACH_BB (pbb)
                    {
                      pbb_changed = false;

                      pbb_changed
                        |= bitmap_and_compl_into
                            (pbb->il.rtl->global_live_at_start,
                             registers_made_dead);
                      pbb_changed
                        |= bitmap_and_compl_into
                            (pbb->il.rtl->global_live_at_end,
                             registers_made_dead);
                      if (!pbb_changed)
                        continue;

                      /* Note the (possible) change.  */
                      if (blocks_out)
                        SET_BIT (blocks_out, pbb->index);

                      /* Make sure to really rescan the block.  */
                      if (local_sets[pbb->index - (INVALID_BLOCK + 1)])
                        {
                          FREE_REG_SET (local_sets[pbb->index - (INVALID_BLOCK + 1)]);
                          FREE_REG_SET (cond_local_sets[pbb->index - (INVALID_BLOCK + 1)]);
                          local_sets[pbb->index - (INVALID_BLOCK + 1)] = 0;
                        }

                      /* Add it to the queue.  */
                      if (pbb->aux == NULL)
                        {
                          *qtail++ = pbb;
                          if (qtail == qend)
                            qtail = queue;
                          pbb->aux = pbb;
                        }
                    }
                  continue;
                }
            } /* end of failure_strategy_required */

          COPY_REG_SET (bb->il.rtl->global_live_at_start, new_live_at_end);
        }

      /* Queue all predecessors of BB so that we may re-examine
         their live_at_end.  */
      FOR_EACH_EDGE (e, ei, bb->preds)
        {
          basic_block pb = e->src;
          if (pb->aux == NULL)
            {
              *qtail++ = pb;
              if (qtail == qend)
                qtail = queue;
              pb->aux = pb;
            }
        }
    }

  FREE_REG_SET (tmp);
  FREE_REG_SET (new_live_at_end);
  FREE_REG_SET (invalidated_by_call);
  FREE_REG_SET (registers_made_dead);

  if (blocks_out)
    {
      sbitmap_iterator sbi;

      EXECUTE_IF_SET_IN_SBITMAP (blocks_out, 0, i, sbi)
        {
          basic_block bb = BASIC_BLOCK (i);
          FREE_REG_SET (local_sets[bb->index - (INVALID_BLOCK + 1)]);
          FREE_REG_SET (cond_local_sets[bb->index - (INVALID_BLOCK + 1)]);
        };
    }
  else
    {
      FOR_EACH_BB (bb)
        {
          FREE_REG_SET (local_sets[bb->index - (INVALID_BLOCK + 1)]);
          FREE_REG_SET (cond_local_sets[bb->index - (INVALID_BLOCK + 1)]);
        }
    }

  free (block_accesses);
  free (queue);
  free (cond_local_sets);
  free (local_sets);
}


/* This structure is used to pass parameters to and from the
   function find_regno_partial ().  It is used to pass in the
   register number we are looking for, as well as to return any
   rtx we find.  */

typedef struct {
  unsigned regno_to_find;
  rtx retval;
} find_regno_partial_param;


/* Find the rtx for the reg number specified in 'data' if it is
   part of an expression which only uses part of the register.  Return
   it in the structure passed in.  */
static int
find_regno_partial (rtx *ptr, void *data)
{
  find_regno_partial_param *param = (find_regno_partial_param *)data;
  unsigned reg = param->regno_to_find;
  param->retval = NULL_RTX;

  if (*ptr == NULL_RTX)
    return 0;

  switch (GET_CODE (*ptr))
    {
    case ZERO_EXTRACT:
    case SIGN_EXTRACT:
    case STRICT_LOW_PART:
      if (REG_P (XEXP (*ptr, 0)) && REGNO (XEXP (*ptr, 0)) == reg)
        {
          param->retval = XEXP (*ptr, 0);
          return 1;
        }
      break;

    case SUBREG:
      if (REG_P (SUBREG_REG (*ptr))
          && REGNO (SUBREG_REG (*ptr)) == reg)
        {
          param->retval = SUBREG_REG (*ptr);
          return 1;
        }
      break;

    default:
      break;
    }

  return 0;
}

/* Process all immediate successors of the entry block looking for pseudo
   registers which are live on entry.  Find all of those whose first
   instance is a partial register reference of some kind, and initialize
   them to 0 after the entry block.  This will prevent bit sets within
   registers whose value is unknown, and may contain some kind of sticky
   bits we don't want.  */

int
initialize_uninitialized_subregs (void)
{
  rtx insn;
  edge e;
  unsigned reg, did_something = 0;
  find_regno_partial_param param;
  edge_iterator ei;

  FOR_EACH_EDGE (e, ei, ENTRY_BLOCK_PTR->succs)
    {
      basic_block bb = e->dest;
      regset map = bb->il.rtl->global_live_at_start;
      reg_set_iterator rsi;

      EXECUTE_IF_SET_IN_REG_SET (map, FIRST_PSEUDO_REGISTER, reg, rsi)
        {
          int uid = REGNO_FIRST_UID (reg);
          rtx i;

          /* Find an insn which mentions the register we are looking for.
             It's preferable to have an instance of the register's rtl since
             there may be various flags set which we need to duplicate.
             If we can't find it, it's probably an automatic whose initial
             value doesn't matter, or hopefully something we don't care about.  */
          for (i = get_insns (); i && INSN_UID (i) != uid; i = NEXT_INSN (i))
            ;
          if (i != NULL_RTX)
            {
              /* Found the insn, now get the REG rtx, if we can.  */
              param.regno_to_find = reg;
              for_each_rtx (&i, find_regno_partial, &param);
              if (param.retval != NULL_RTX)
                {
                  start_sequence ();
                  emit_move_insn (param.retval,
                                  CONST0_RTX (GET_MODE (param.retval)));
                  insn = get_insns ();
                  end_sequence ();
                  insert_insn_on_edge (insn, e);
                  did_something = 1;
                }
            }
        }
    }

  if (did_something)
    commit_edge_insertions ();
  return did_something;
}


/* Subroutines of life analysis.  */

/* Allocate the permanent data structures that represent the results
   of life analysis.  */

static void
allocate_bb_life_data (void)
{
  basic_block bb;

  FOR_BB_BETWEEN (bb, ENTRY_BLOCK_PTR, NULL, next_bb)
    {
      bb->il.rtl->global_live_at_start = ALLOC_REG_SET (&reg_obstack);
      bb->il.rtl->global_live_at_end = ALLOC_REG_SET (&reg_obstack);
    }

  regs_live_at_setjmp = ALLOC_REG_SET (&reg_obstack);
}

void
allocate_reg_life_data (void)
{
  int i;

  max_regno = max_reg_num ();
  gcc_assert (!reg_deaths);
  reg_deaths = xcalloc (sizeof (*reg_deaths), max_regno);

  /* Recalculate the register space, in case it has grown.  Old style
     vector oriented regsets would set regset_{size,bytes} here also.  */
  allocate_reg_info (max_regno, FALSE, FALSE);

  /* Reset all the data we'll collect in propagate_block and its
     subroutines.  */
  for (i = 0; i < max_regno; i++)
    {
      REG_N_SETS (i) = 0;
      REG_N_REFS (i) = 0;
      REG_N_DEATHS (i) = 0;
      REG_N_CALLS_CROSSED (i) = 0;
      REG_N_THROWING_CALLS_CROSSED (i) = 0;
      REG_LIVE_LENGTH (i) = 0;
      REG_FREQ (i) = 0;
      REG_BASIC_BLOCK (i) = REG_BLOCK_UNKNOWN;
    }
}

/* Delete dead instructions for propagate_block.  */

static void
propagate_block_delete_insn (rtx insn)
{
  rtx inote = find_reg_note (insn, REG_LABEL, NULL_RTX);

  /* If the insn referred to a label, and that label was attached to
     an ADDR_VEC, it's safe to delete the ADDR_VEC.  In fact, it's
     pretty much mandatory to delete it, because the ADDR_VEC may be
     referencing labels that no longer exist.

     INSN may reference a deleted label, particularly when a jump
     table has been optimized into a direct jump.  There's no
     real good way to fix up the reference to the deleted label
     when the label is deleted, so we just allow it here.  */

  if (inote && LABEL_P (inote))
    {
      rtx label = XEXP (inote, 0);
      rtx next;

      /* The label may be forced if it has been put in the constant
         pool.  If that is the only use we must discard the table
         jump following it, but not the label itself.  */
      if (LABEL_NUSES (label) == 1 + LABEL_PRESERVE_P (label)
          && (next = next_nonnote_insn (label)) != NULL
          && JUMP_P (next)
          && (GET_CODE (PATTERN (next)) == ADDR_VEC
              || GET_CODE (PATTERN (next)) == ADDR_DIFF_VEC))
        {
          rtx pat = PATTERN (next);
          int diff_vec_p = GET_CODE (pat) == ADDR_DIFF_VEC;
          int len = XVECLEN (pat, diff_vec_p);
          int i;

          for (i = 0; i < len; i++)
            LABEL_NUSES (XEXP (XVECEXP (pat, diff_vec_p, i), 0))--;

          delete_insn_and_edges (next);
          ndead++;
        }
    }

  delete_insn_and_edges (insn);
  ndead++;
}

/* Delete dead libcalls for propagate_block.  Return the insn
   before the libcall.  */

static rtx
propagate_block_delete_libcall (rtx insn, rtx note)
{
  rtx first = XEXP (note, 0);
  rtx before = PREV_INSN (first);

  delete_insn_chain_and_edges (first, insn);
  ndead++;
  return before;
}

/* Update the life-status of regs for one insn.  Return the previous insn.  */

rtx
propagate_one_insn (struct propagate_block_info *pbi, rtx insn)
{
  rtx prev = PREV_INSN (insn);
  int flags = pbi->flags;
  int insn_is_dead = 0;
  int libcall_is_dead = 0;
  rtx note;
  unsigned i;

  if (! INSN_P (insn))
    return prev;

  note = find_reg_note (insn, REG_RETVAL, NULL_RTX);
  if (flags & PROP_SCAN_DEAD_CODE)
    {
      insn_is_dead = insn_dead_p (pbi, PATTERN (insn), 0, REG_NOTES (insn));
      libcall_is_dead = (insn_is_dead && note != 0
                         && libcall_dead_p (pbi, note, insn));
    }

  /* If an instruction consists of just dead store(s) on final pass,
     delete it.  */
  if ((flags & PROP_KILL_DEAD_CODE) && insn_is_dead)
    {
      /* If we're trying to delete a prologue or epilogue instruction
         that isn't flagged as possibly being dead, something is wrong.
         But if we are keeping the stack pointer depressed, we might well
         be deleting insns that are used to compute the amount to update
         it by, so they are fine.  */
      if (reload_completed
          && !(TREE_CODE (TREE_TYPE (current_function_decl)) == FUNCTION_TYPE
                && (TYPE_RETURNS_STACK_DEPRESSED
                    (TREE_TYPE (current_function_decl))))
          && (((HAVE_epilogue || HAVE_prologue)
               && prologue_epilogue_contains (insn))
              || (HAVE_sibcall_epilogue
                  && sibcall_epilogue_contains (insn)))
          && find_reg_note (insn, REG_MAYBE_DEAD, NULL_RTX) == 0)
        fatal_insn ("Attempt to delete prologue/epilogue insn:", insn);

      /* Record sets.  Do this even for dead instructions, since they
         would have killed the values if they hadn't been deleted.  To
         be consistent, we also have to emit a clobber when we delete
         an insn that clobbers a live register.  */
      pbi->flags |= PROP_DEAD_INSN;
      mark_set_regs (pbi, PATTERN (insn), insn);
      pbi->flags &= ~PROP_DEAD_INSN;

      /* CC0 is now known to be dead.  Either this insn used it,
         in which case it doesn't anymore, or clobbered it,
         so the next insn can't use it.  */
      pbi->cc0_live = 0;

      if (libcall_is_dead)
        prev = propagate_block_delete_libcall (insn, note);
      else
        {

        /* If INSN contains a RETVAL note and is dead, but the libcall
           as a whole is not dead, then we want to remove INSN, but
           not the whole libcall sequence.

           However, we need to also remove the dangling REG_LIBCALL
           note so that we do not have mis-matched LIBCALL/RETVAL
           notes.  In theory we could find a new location for the
           REG_RETVAL note, but it hardly seems worth the effort.

           NOTE at this point will be the RETVAL note if it exists.  */
          if (note)
            {
              rtx libcall_note;

              libcall_note
                = find_reg_note (XEXP (note, 0), REG_LIBCALL, NULL_RTX);
              remove_note (XEXP (note, 0), libcall_note);
            }

          /* Similarly if INSN contains a LIBCALL note, remove the
             dangling REG_RETVAL note.  */
          note = find_reg_note (insn, REG_LIBCALL, NULL_RTX);
          if (note)
            {
              rtx retval_note;

              retval_note
                = find_reg_note (XEXP (note, 0), REG_RETVAL, NULL_RTX);
              remove_note (XEXP (note, 0), retval_note);
            }

          /* Now delete INSN.  */
          propagate_block_delete_insn (insn);
        }

      return prev;
    }

  /* See if this is an increment or decrement that can be merged into
     a following memory address.  */
#ifdef AUTO_INC_DEC
  {
    rtx x = single_set (insn);

    /* Does this instruction increment or decrement a register?  */
    if ((flags & PROP_AUTOINC)
        && x != 0
        && REG_P (SET_DEST (x))
        && (GET_CODE (SET_SRC (x)) == PLUS
            || GET_CODE (SET_SRC (x)) == MINUS)
        && XEXP (SET_SRC (x), 0) == SET_DEST (x)
        && GET_CODE (XEXP (SET_SRC (x), 1)) == CONST_INT
        /* Ok, look for a following memory ref we can combine with.
           If one is found, change the memory ref to a PRE_INC
           or PRE_DEC, cancel this insn, and return 1.
           Return 0 if nothing has been done.  */
        && try_pre_increment_1 (pbi, insn))
      return prev;
  }
#endif /* AUTO_INC_DEC */

  CLEAR_REG_SET (pbi->new_set);

  /* If this is not the final pass, and this insn is copying the value of
     a library call and it's dead, don't scan the insns that perform the
     library call, so that the call's arguments are not marked live.  */
  if (libcall_is_dead)
    {
      /* Record the death of the dest reg.  */
      mark_set_regs (pbi, PATTERN (insn), insn);

      insn = XEXP (note, 0);
      return PREV_INSN (insn);
    }
  else if (GET_CODE (PATTERN (insn)) == SET
           && SET_DEST (PATTERN (insn)) == stack_pointer_rtx
           && GET_CODE (SET_SRC (PATTERN (insn))) == PLUS
           && XEXP (SET_SRC (PATTERN (insn)), 0) == stack_pointer_rtx
           && GET_CODE (XEXP (SET_SRC (PATTERN (insn)), 1)) == CONST_INT)
    {
      /* We have an insn to pop a constant amount off the stack.
         (Such insns use PLUS regardless of the direction of the stack,
         and any insn to adjust the stack by a constant is always a pop
         or part of a push.)
         These insns, if not dead stores, have no effect on life, though
         they do have an effect on the memory stores we are tracking.  */
      invalidate_mems_from_set (pbi, stack_pointer_rtx);
      /* Still, we need to update local_set, lest ifcvt.c:dead_or_predicable
         concludes that the stack pointer is not modified.  */
      mark_set_regs (pbi, PATTERN (insn), insn);
    }
  else
    {
      /* Any regs live at the time of a call instruction must not go
         in a register clobbered by calls.  Find all regs now live and
         record this for them.  */

      if (CALL_P (insn) && (flags & PROP_REG_INFO))
        {
          reg_set_iterator rsi;
          EXECUTE_IF_SET_IN_REG_SET (pbi->reg_live, 0, i, rsi)
            REG_N_CALLS_CROSSED (i)++;
          if (can_throw_internal (insn))
            EXECUTE_IF_SET_IN_REG_SET (pbi->reg_live, 0, i, rsi)
              REG_N_THROWING_CALLS_CROSSED (i)++;
        }

      /* Record sets.  Do this even for dead instructions, since they
         would have killed the values if they hadn't been deleted.  */
      mark_set_regs (pbi, PATTERN (insn), insn);

      if (CALL_P (insn))
        {
          regset live_at_end;
          bool sibcall_p;
          rtx note, cond;
          int i;

          cond = NULL_RTX;
          if (GET_CODE (PATTERN (insn)) == COND_EXEC)
            cond = COND_EXEC_TEST (PATTERN (insn));

          /* Non-constant calls clobber memory, constant calls do not
             clobber memory, though they may clobber outgoing arguments
             on the stack.  */
          if (! CONST_OR_PURE_CALL_P (insn))
            {
              free_EXPR_LIST_list (&pbi->mem_set_list);
              pbi->mem_set_list_len = 0;
            }
          else
            invalidate_mems_from_set (pbi, stack_pointer_rtx);

          /* There may be extra registers to be clobbered.  */
          for (note = CALL_INSN_FUNCTION_USAGE (insn);
               note;
               note = XEXP (note, 1))
            if (GET_CODE (XEXP (note, 0)) == CLOBBER)
              mark_set_1 (pbi, CLOBBER, XEXP (XEXP (note, 0), 0),
                          cond, insn, pbi->flags);

          /* Calls change all call-used and global registers; sibcalls do not
             clobber anything that must be preserved at end-of-function,
             except for return values.  */

          sibcall_p = SIBLING_CALL_P (insn);
          live_at_end = EXIT_BLOCK_PTR->il.rtl->global_live_at_start;
          for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
            if (TEST_HARD_REG_BIT (regs_invalidated_by_call, i)
                && ! (sibcall_p
                      && REGNO_REG_SET_P (live_at_end, i)
                      && ! refers_to_regno_p (i, i+1,
                                              current_function_return_rtx,
                                              (rtx *) 0)))
              {
                enum rtx_code code = global_regs[i] ? SET : CLOBBER;
                /* We do not want REG_UNUSED notes for these registers.  */
                mark_set_1 (pbi, code, regno_reg_rtx[i], cond, insn,
                            pbi->flags & ~(PROP_DEATH_NOTES | PROP_REG_INFO));
              }
        }

      /* If an insn doesn't use CC0, it becomes dead since we assume
         that every insn clobbers it.  So show it dead here;
         mark_used_regs will set it live if it is referenced.  */
      pbi->cc0_live = 0;

      /* Record uses.  */
      if (! insn_is_dead)
        mark_used_regs (pbi, PATTERN (insn), NULL_RTX, insn);

      /* Sometimes we may have inserted something before INSN (such as a move)
         when we make an auto-inc.  So ensure we will scan those insns.  */
#ifdef AUTO_INC_DEC
      prev = PREV_INSN (insn);
#endif

      if (! insn_is_dead && CALL_P (insn))
        {
          int i;
          rtx note, cond;

          cond = NULL_RTX;
          if (GET_CODE (PATTERN (insn)) == COND_EXEC)
            cond = COND_EXEC_TEST (PATTERN (insn));

          /* Calls use their arguments, and may clobber memory whose
             address involves some register.  */
          for (note = CALL_INSN_FUNCTION_USAGE (insn);
               note;
               note = XEXP (note, 1))
            /* We find USE or CLOBBER entities in a FUNCTION_USAGE list: both
               of which mark_used_regs knows how to handle.  */
            mark_used_regs (pbi, XEXP (XEXP (note, 0), 0), cond, insn);

          /* The stack ptr is used (honorarily) by a CALL insn.  */
          if ((flags & PROP_REG_INFO)
              && !REGNO_REG_SET_P (pbi->reg_live, STACK_POINTER_REGNUM))
            reg_deaths[STACK_POINTER_REGNUM] = pbi->insn_num;
          SET_REGNO_REG_SET (pbi->reg_live, STACK_POINTER_REGNUM);

          /* Calls may also reference any of the global registers,
             so they are made live.  */
          for (i = 0; i < FIRST_PSEUDO_REGISTER; i++)
            if (global_regs[i])
              mark_used_reg (pbi, regno_reg_rtx[i], cond, insn);
        }
    }

  pbi->insn_num++;

  return prev;
}

/* Initialize a propagate_block_info struct for public consumption.
   Note that the structure itself is opaque to this file, but that
   the user can use the regsets provided here.  */

struct propagate_block_info *
init_propagate_block_info (basic_block bb, regset live, regset local_set,
                           regset cond_local_set, int flags)
{
  struct propagate_block_info *pbi = xmalloc (sizeof (*pbi));

  pbi->bb = bb;
  pbi->reg_live = live;
  pbi->mem_set_list = NULL_RTX;
  pbi->mem_set_list_len = 0;
  pbi->local_set = local_set;
  pbi->cond_local_set = cond_local_set;
  pbi->cc0_live = 0;
  pbi->flags = flags;
  pbi->insn_num = 0;

  if (flags & (PROP_LOG_LINKS | PROP_AUTOINC))
    pbi->reg_next_use = xcalloc (max_reg_num (), sizeof (rtx));
  else
    pbi->reg_next_use = NULL;

  pbi->new_set = BITMAP_ALLOC (NULL);

#ifdef HAVE_conditional_execution
  pbi->reg_cond_dead = splay_tree_new (splay_tree_compare_ints, NULL,
                                       free_reg_cond_life_info);
  pbi->reg_cond_reg = BITMAP_ALLOC (NULL);

  /* If this block ends in a conditional branch, for each register
     live from one side of the branch and not the other, record the
     register as conditionally dead.  */
  if (JUMP_P (BB_END (bb))
      && any_condjump_p (BB_END (bb)))
    {
      regset diff = ALLOC_REG_SET (&reg_obstack);
      basic_block bb_true, bb_false;
      unsigned i;

      /* Identify the successor blocks.  */
      bb_true = EDGE_SUCC (bb, 0)->dest;
      if (!single_succ_p (bb))
        {
          bb_false = EDGE_SUCC (bb, 1)->dest;

          if (EDGE_SUCC (bb, 0)->flags & EDGE_FALLTHRU)
            {
              basic_block t = bb_false;
              bb_false = bb_true;
              bb_true = t;
            }
          else
            gcc_assert (EDGE_SUCC (bb, 1)->flags & EDGE_FALLTHRU);
        }
      else
        {
          /* This can happen with a conditional jump to the next insn.  */
          gcc_assert (JUMP_LABEL (BB_END (bb)) == BB_HEAD (bb_true));

          /* Simplest way to do nothing.  */
          bb_false = bb_true;
        }

      /* Compute which registers lead different lives in the successors.  */
      bitmap_xor (diff, bb_true->il.rtl->global_live_at_start,
                  bb_false->il.rtl->global_live_at_start);

      if (!bitmap_empty_p (diff))
        {
          /* Extract the condition from the branch.  */
          rtx set_src = SET_SRC (pc_set (BB_END (bb)));
          rtx cond_true = XEXP (set_src, 0);
          rtx reg = XEXP (cond_true, 0);
          enum rtx_code inv_cond;

          if (GET_CODE (reg) == SUBREG)
            reg = SUBREG_REG (reg);

          /* We can only track conditional lifetimes if the condition is
             in the form of a reversible comparison of a register against
             zero.  If the condition is more complex than that, then it is
             safe not to record any information.  */
          inv_cond = reversed_comparison_code (cond_true, BB_END (bb));
          if (inv_cond != UNKNOWN
              && REG_P (reg)
              && XEXP (cond_true, 1) == const0_rtx)
            {
              rtx cond_false
                = gen_rtx_fmt_ee (inv_cond,
                                  GET_MODE (cond_true), XEXP (cond_true, 0),
                                  XEXP (cond_true, 1));
              reg_set_iterator rsi;

              if (GET_CODE (XEXP (set_src, 1)) == PC)
                {
                  rtx t = cond_false;
                  cond_false = cond_true;
                  cond_true = t;
                }

              SET_REGNO_REG_SET (pbi->reg_cond_reg, REGNO (reg));

              /* For each such register, mark it conditionally dead.  */
              EXECUTE_IF_SET_IN_REG_SET (diff, 0, i, rsi)
                {
                  struct reg_cond_life_info *rcli;
                  rtx cond;

                  rcli = xmalloc (sizeof (*rcli));

                  if (REGNO_REG_SET_P (bb_true->il.rtl->global_live_at_start,
                                       i))
                    cond = cond_false;
                  else
                    cond = cond_true;
                  rcli->condition = cond;
                  rcli->stores = const0_rtx;
                  rcli->orig_condition = cond;

                  splay_tree_insert (pbi->reg_cond_dead, i,
                                     (splay_tree_value) rcli);
                }
            }
        }

      FREE_REG_SET (diff);
    }
#endif

  /* If this block has no successors, any stores to the frame that aren't
     used later in the block are dead.  So make a pass over the block
     recording any such that are made and show them dead at the end.  We do
     a very conservative and simple job here.  */
  if (optimize
      && ! (TREE_CODE (TREE_TYPE (current_function_decl)) == FUNCTION_TYPE
            && (TYPE_RETURNS_STACK_DEPRESSED
                (TREE_TYPE (current_function_decl))))
      && (flags & PROP_SCAN_DEAD_STORES)
      && (EDGE_COUNT (bb->succs) == 0
          || (single_succ_p (bb)
              && single_succ (bb) == EXIT_BLOCK_PTR
              && ! current_function_calls_eh_return)))
    {
      rtx insn, set;
      for (insn = BB_END (bb); insn != BB_HEAD (bb); insn = PREV_INSN (insn))
        if (NONJUMP_INSN_P (insn)
            && (set = single_set (insn))
            && MEM_P (SET_DEST (set)))
          {
            rtx mem = SET_DEST (set);
            rtx canon_mem = canon_rtx (mem);

            if (XEXP (canon_mem, 0) == frame_pointer_rtx
                || (GET_CODE (XEXP (canon_mem, 0)) == PLUS
                    && XEXP (XEXP (canon_mem, 0), 0) == frame_pointer_rtx
                    && GET_CODE (XEXP (XEXP (canon_mem, 0), 1)) == CONST_INT))
              add_to_mem_set_list (pbi, canon_mem);
          }
    }

  return pbi;
}

/* Release a propagate_block_info struct.  */

void
free_propagate_block_info (struct propagate_block_info *pbi)
{
  free_EXPR_LIST_list (&pbi->mem_set_list);

  BITMAP_FREE (pbi->new_set);

#ifdef HAVE_conditional_execution
  splay_tree_delete (pbi->reg_cond_dead);
  BITMAP_FREE (pbi->reg_cond_reg);
#endif

  if (pbi->flags & PROP_REG_INFO)
    {
      int num = pbi->insn_num;
      unsigned i;
      reg_set_iterator rsi;

      EXECUTE_IF_SET_IN_REG_SET (pbi->reg_live, 0, i, rsi)
        {
          REG_LIVE_LENGTH (i) += num - reg_deaths[i];
          reg_deaths[i] = 0;
        }
    }
  if (pbi->reg_next_use)
    free (pbi->reg_next_use);

  free (pbi);
}

/* Compute the registers live at the beginning of a basic block BB from
   those live at the end.

   When called, REG_LIVE contains those live at the end.  On return, it
   contains those live at the beginning.

   LOCAL_SET, if non-null, will be set with all registers killed
   unconditionally by this basic block.
   Likewise, COND_LOCAL_SET, if non-null, will be set with all registers
   killed conditionally by this basic block.  If there is any unconditional
   set of a register, then the corresponding bit will be set in LOCAL_SET
   and cleared in COND_LOCAL_SET.
   It is valid for LOCAL_SET and COND_LOCAL_SET to be the same set.  In this
   case, the resulting set will be equal to the union of the two sets that
   would otherwise be computed.

   Return nonzero if an INSN is deleted (i.e. by dead code removal).  */

int
propagate_block (basic_block bb, regset live, regset local_set,
                 regset cond_local_set, int flags)
{
  struct propagate_block_info *pbi;
  rtx insn, prev;
  int changed;

  pbi = init_propagate_block_info (bb, live, local_set, cond_local_set, flags);

  if (flags & PROP_REG_INFO)
    {
      unsigned i;
      reg_set_iterator rsi;

      /* Process the regs live at the end of the block.
         Mark them as not local to any one basic block.  */
      EXECUTE_IF_SET_IN_REG_SET (live, 0, i, rsi)
        REG_BASIC_BLOCK (i) = REG_BLOCK_GLOBAL;
    }

  /* Scan the block an insn at a time from end to beginning.  */

  changed = 0;
  for (insn = BB_END (bb); ; insn = prev)
    {
      /* If this is a call to `setjmp' et al, warn if any
         non-volatile datum is live.  */
      if ((flags & PROP_REG_INFO)
          && CALL_P (insn)
          && find_reg_note (insn, REG_SETJMP, NULL))
        IOR_REG_SET (regs_live_at_setjmp, pbi->reg_live);

      prev = propagate_one_insn (pbi, insn);
      if (!prev)
        changed |= insn != get_insns ();
      else
        changed |= NEXT_INSN (prev) != insn;

      if (insn == BB_HEAD (bb))
        break;
    }

  free_propagate_block_info (pbi);

  return changed;
}

/* Return 1 if X (the body of an insn, or part of it) is just dead stores
   (SET expressions whose destinations are registers dead after the insn).
   NEEDED is the regset that says which regs are alive after the insn.

   Unless CALL_OK is nonzero, an insn is needed if it contains a CALL.

   If X is the entire body of an insn, NOTES contains the reg notes
   pertaining to the insn.  */

static int
insn_dead_p (struct propagate_block_info *pbi, rtx x, int call_ok,
             rtx notes ATTRIBUTE_UNUSED)
{
  enum rtx_code code = GET_CODE (x);

  /* Don't eliminate insns that may trap.  */
  if (flag_non_call_exceptions && may_trap_p (x))
    return 0;

#ifdef AUTO_INC_DEC
  /* As flow is invoked after combine, we must take existing AUTO_INC
     expressions into account.  */
  for (; notes; notes = XEXP (notes, 1))
    {
      if (REG_NOTE_KIND (notes) == REG_INC)
        {
          int regno = REGNO (XEXP (notes, 0));

          /* Don't delete insns to set global regs.  */
          if ((regno < FIRST_PSEUDO_REGISTER && global_regs[regno])
              || REGNO_REG_SET_P (pbi->reg_live, regno))
            return 0;
        }
    }
#endif

  /* If setting something that's a reg or part of one,
     see if that register's altered value will be live.  */

  if (code == SET)
    {
      rtx r = SET_DEST (x);

#ifdef HAVE_cc0
      if (GET_CODE (r) == CC0)
        return ! pbi->cc0_live;
#endif

      /* A SET that is a subroutine call cannot be dead.  */
      if (GET_CODE (SET_SRC (x)) == CALL)
        {
          if (! call_ok)
            return 0;
        }

      /* Don't eliminate loads from volatile memory or volatile asms.  */
      else if (volatile_refs_p (SET_SRC (x)))
        return 0;

      if (MEM_P (r))
        {
          rtx temp, canon_r;

          if (MEM_VOLATILE_P (r) || GET_MODE (r) == BLKmode)
            return 0;

          canon_r = canon_rtx (r);

          /* Walk the set of memory locations we are currently tracking
             and see if one is an identical match to this memory location.
             If so, this memory write is dead (remember, we're walking
             backwards from the end of the block to the start).  Since
             rtx_equal_p does not check the alias set or flags, we also
             must have the potential for them to conflict (anti_dependence).  */
          for (temp = pbi->mem_set_list; temp != 0; temp = XEXP (temp, 1))
            if (anti_dependence (r, XEXP (temp, 0)))
              {
                rtx mem = XEXP (temp, 0);

                if (rtx_equal_p (XEXP (canon_r, 0), XEXP (mem, 0))
                    && (GET_MODE_SIZE (GET_MODE (canon_r))
                        <= GET_MODE_SIZE (GET_MODE (mem))))
                  return 1;

#ifdef AUTO_INC_DEC
                /* Check if memory reference matches an auto increment.  Only
                   post increment/decrement or modify are valid.  */
                if (GET_MODE (mem) == GET_MODE (r)
                    && (GET_CODE (XEXP (mem, 0)) == POST_DEC
                        || GET_CODE (XEXP (mem, 0)) == POST_INC
                        || GET_CODE (XEXP (mem, 0)) == POST_MODIFY)
                    && GET_MODE (XEXP (mem, 0)) == GET_MODE (r)
                    && rtx_equal_p (XEXP (XEXP (mem, 0), 0), XEXP (r, 0)))
                  return 1;
#endif
              }
        }
      else
        {
          while (GET_CODE (r) == SUBREG
                 || GET_CODE (r) == STRICT_LOW_PART
                 || GET_CODE (r) == ZERO_EXTRACT)
            r = XEXP (r, 0);

          if (REG_P (r))
            {
              int regno = REGNO (r);

              /* Obvious.  */
              if (REGNO_REG_SET_P (pbi->reg_live, regno))
                return 0;

              /* If this is a hard register, verify that subsequent
                 words are not needed.  */
              if (regno < FIRST_PSEUDO_REGISTER)
                {
                  int n = hard_regno_nregs[regno][GET_MODE (r)];

                  while (--n > 0)
                    if (REGNO_REG_SET_P (pbi->reg_live, regno+n))
                      return 0;
                }

              /* Don't delete insns to set global regs.  */
              if (regno < FIRST_PSEUDO_REGISTER && global_regs[regno])
                return 0;

              /* Make sure insns to set the stack pointer aren't deleted.  */
              if (regno == STACK_POINTER_REGNUM)
                return 0;

              /* ??? These bits might be redundant with the force live bits
                 in calculate_global_regs_live.  We would delete from
                 sequential sets; whether this actually affects real code
                 for anything but the stack pointer I don't know.  */
              /* Make sure insns to set the frame pointer aren't deleted.  */
              if (regno == FRAME_POINTER_REGNUM
                  && (! reload_completed || frame_pointer_needed))
                return 0;
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
              if (regno == HARD_FRAME_POINTER_REGNUM
                  && (! reload_completed || frame_pointer_needed))
                return 0;
#endif

#if FRAME_POINTER_REGNUM != ARG_POINTER_REGNUM
              /* Make sure insns to set arg pointer are never deleted
                 (if the arg pointer isn't fixed, there will be a USE
                 for it, so we can treat it normally).  */
              if (regno == ARG_POINTER_REGNUM && fixed_regs[regno])
                return 0;
#endif

              /* Otherwise, the set is dead.  */
              return 1;
            }
        }
    }

  /* If performing several activities, insn is dead if each activity
     is individually dead.  Also, CLOBBERs and USEs can be ignored; a
     CLOBBER or USE that's inside a PARALLEL doesn't make the insn
     worth keeping.  */
  else if (code == PARALLEL)
    {
      int i = XVECLEN (x, 0);

      for (i--; i >= 0; i--)
        if (GET_CODE (XVECEXP (x, 0, i)) != CLOBBER
            && GET_CODE (XVECEXP (x, 0, i)) != USE
            && ! insn_dead_p (pbi, XVECEXP (x, 0, i), call_ok, NULL_RTX))
          return 0;

      return 1;
    }

  /* A CLOBBER of a pseudo-register that is dead serves no purpose.  That
     is not necessarily true for hard registers until after reload.  */
  else if (code == CLOBBER)
    {
      if (REG_P (XEXP (x, 0))
          && (REGNO (XEXP (x, 0)) >= FIRST_PSEUDO_REGISTER
              || reload_completed)
          && ! REGNO_REG_SET_P (pbi->reg_live, REGNO (XEXP (x, 0))))
        return 1;
    }

  /* ??? A base USE is a historical relic.  It ought not be needed anymore.
     Instances where it is still used are either (1) temporary and the USE
     escaped the pass, (2) cruft and the USE need not be emitted anymore,
     or (3) hiding bugs elsewhere that are not properly representing data
     flow.  */

  return 0;
}

/* If INSN is the last insn in a libcall, and assuming INSN is dead,
   return 1 if the entire library call is dead.
   This is true if INSN copies a register (hard or pseudo)
   and if the hard return reg of the call insn is dead.
   (The caller should have tested the destination of the SET inside
   INSN already for death.)

   If this insn doesn't just copy a register, then we don't
   have an ordinary libcall.  In that case, cse could not have
   managed to substitute the source for the dest later on,
   so we can assume the libcall is dead.

   PBI is the block info giving pseudoregs live before this insn.
   NOTE is the REG_RETVAL note of the insn.  */

static int
libcall_dead_p (struct propagate_block_info *pbi, rtx note, rtx insn)
{
  rtx x = single_set (insn);

  if (x)
    {
      rtx r = SET_SRC (x);

      if (REG_P (r) || GET_CODE (r) == SUBREG)
        {
          rtx call = XEXP (note, 0);
          rtx call_pat;
          int i;

          /* Find the call insn.  */
          while (call != insn && !CALL_P (call))
            call = NEXT_INSN (call);

          /* If there is none, do nothing special,
             since ordinary death handling can understand these insns.  */
          if (call == insn)
            return 0;

          /* See if the hard reg holding the value is dead.
             If this is a PARALLEL, find the call within it.  */
          call_pat = PATTERN (call);
          if (GET_CODE (call_pat) == PARALLEL)
            {
              for (i = XVECLEN (call_pat, 0) - 1; i >= 0; i--)
                if (GET_CODE (XVECEXP (call_pat, 0, i)) == SET
                    && GET_CODE (SET_SRC (XVECEXP (call_pat, 0, i))) == CALL)
                  break;

              /* This may be a library call that is returning a value
                 via invisible pointer.  Do nothing special, since
                 ordinary death handling can understand these insns.  */
              if (i < 0)
                return 0;

              call_pat = XVECEXP (call_pat, 0, i);
            }

          if (! insn_dead_p (pbi, call_pat, 1, REG_NOTES (call)))
            return 0;

          while ((insn = PREV_INSN (insn)) != call)
            {
              if (! INSN_P (insn))
                continue;
              if (! insn_dead_p (pbi, PATTERN (insn), 0, REG_NOTES (insn)))
                return 0;
            }
          return 1;
        }
    }
  return 0;
}
2466
 
2467
/* 1 if register REGNO was alive at a place where `setjmp' was called
2468
   and was set more than once or is an argument.
2469
   Such regs may be clobbered by `longjmp'.  */
2470
 
2471
int
2472
regno_clobbered_at_setjmp (int regno)
2473
{
2474
  if (n_basic_blocks == 0)
2475
    return 0;
2476
 
2477
  return ((REG_N_SETS (regno) > 1
2478
           || REGNO_REG_SET_P (ENTRY_BLOCK_PTR->il.rtl->global_live_at_end,
2479
                               regno))
2480
          && REGNO_REG_SET_P (regs_live_at_setjmp, regno));
2481
}
2482
 
2483
/* Add MEM to PBI->MEM_SET_LIST.  MEM should be canonical.  Respect the
   maximal list size; look for overlaps in mode and select the largest.  */
static void
add_to_mem_set_list (struct propagate_block_info *pbi, rtx mem)
{
  rtx i;

  /* We don't know how large a BLKmode store is, so we must not
     take them into consideration.  */
  if (GET_MODE (mem) == BLKmode)
    return;

  for (i = pbi->mem_set_list; i ; i = XEXP (i, 1))
    {
      rtx e = XEXP (i, 0);
      if (rtx_equal_p (XEXP (mem, 0), XEXP (e, 0)))
        {
          if (GET_MODE_SIZE (GET_MODE (mem)) > GET_MODE_SIZE (GET_MODE (e)))
            {
#ifdef AUTO_INC_DEC
              /* If we must store a copy of the mem, we can just modify
                 the mode of the stored copy.  */
              if (pbi->flags & PROP_AUTOINC)
                PUT_MODE (e, GET_MODE (mem));
              else
#endif
                XEXP (i, 0) = mem;
            }
          return;
        }
    }

  if (pbi->mem_set_list_len < PARAM_VALUE (PARAM_MAX_FLOW_MEMORY_LOCATIONS))
    {
#ifdef AUTO_INC_DEC
      /* Store a copy of mem, otherwise the address may be
         scrogged by find_auto_inc.  */
      if (pbi->flags & PROP_AUTOINC)
        mem = shallow_copy_rtx (mem);
#endif
      pbi->mem_set_list = alloc_EXPR_LIST (0, mem, pbi->mem_set_list);
      pbi->mem_set_list_len++;
    }
}

/* INSN references memory, possibly using autoincrement addressing modes.
   Find any entries on the mem_set_list that need to be invalidated due
   to an address change.  */

static int
invalidate_mems_from_autoinc (rtx *px, void *data)
{
  rtx x = *px;
  struct propagate_block_info *pbi = data;

  if (GET_RTX_CLASS (GET_CODE (x)) == RTX_AUTOINC)
    {
      invalidate_mems_from_set (pbi, XEXP (x, 0));
      return -1;
    }

  return 0;
}

/* EXP is a REG or MEM.  Remove any dependent entries from
   pbi->mem_set_list.  */

static void
invalidate_mems_from_set (struct propagate_block_info *pbi, rtx exp)
{
  rtx temp = pbi->mem_set_list;
  rtx prev = NULL_RTX;
  rtx next;

  while (temp)
    {
      next = XEXP (temp, 1);
      if ((REG_P (exp) && reg_overlap_mentioned_p (exp, XEXP (temp, 0)))
          /* When we get an EXP that is a mem here, we want to check if EXP
             overlaps the *address* of any of the mems in the list (i.e. not
             whether the mems actually overlap; that's done elsewhere).  */
          || (MEM_P (exp)
              && reg_overlap_mentioned_p (exp, XEXP (XEXP (temp, 0), 0))))
        {
          /* Splice this entry out of the list.  */
          if (prev)
            XEXP (prev, 1) = next;
          else
            pbi->mem_set_list = next;
          free_EXPR_LIST_node (temp);
          pbi->mem_set_list_len--;
        }
      else
        prev = temp;
      temp = next;
    }
}

/* Process the registers that are set within X.  Their bits are set to
   1 in the regset DEAD, because they are dead prior to this insn.

   If INSN is nonzero, it is the insn being processed.

   FLAGS is the set of operations to perform.  */

static void
mark_set_regs (struct propagate_block_info *pbi, rtx x, rtx insn)
{
  rtx cond = NULL_RTX;
  rtx link;
  enum rtx_code code;
  int flags = pbi->flags;

  if (insn)
    for (link = REG_NOTES (insn); link; link = XEXP (link, 1))
      {
        if (REG_NOTE_KIND (link) == REG_INC)
          mark_set_1 (pbi, SET, XEXP (link, 0),
                      (GET_CODE (x) == COND_EXEC
                       ? COND_EXEC_TEST (x) : NULL_RTX),
                      insn, flags);
      }
 retry:
  switch (code = GET_CODE (x))
    {
    case SET:
      if (GET_CODE (XEXP (x, 1)) == ASM_OPERANDS)
        flags |= PROP_ASM_SCAN;
      /* Fall through */
    case CLOBBER:
      mark_set_1 (pbi, code, SET_DEST (x), cond, insn, flags);
      return;

    case COND_EXEC:
      cond = COND_EXEC_TEST (x);
      x = COND_EXEC_CODE (x);
      goto retry;

    case PARALLEL:
      {
        int i;

        /* We must scan forwards.  If we have an asm, we need to set
           the PROP_ASM_SCAN flag before scanning the clobbers.  */
        for (i = 0; i < XVECLEN (x, 0); i++)
          {
            rtx sub = XVECEXP (x, 0, i);
            switch (code = GET_CODE (sub))
              {
              case COND_EXEC:
                gcc_assert (!cond);

                cond = COND_EXEC_TEST (sub);
                sub = COND_EXEC_CODE (sub);
                if (GET_CODE (sub) == SET)
                  goto mark_set;
                if (GET_CODE (sub) == CLOBBER)
                  goto mark_clob;
                break;

              case SET:
              mark_set:
                if (GET_CODE (XEXP (sub, 1)) == ASM_OPERANDS)
                  flags |= PROP_ASM_SCAN;
                /* Fall through */
              case CLOBBER:
              mark_clob:
                mark_set_1 (pbi, code, SET_DEST (sub), cond, insn, flags);
                break;

              case ASM_OPERANDS:
                flags |= PROP_ASM_SCAN;
                break;

              default:
                break;
              }
          }
        break;
      }

    default:
      break;
    }
}

/* Process a single set, which appears in INSN.  REG (which may not
   actually be a REG, it may also be a SUBREG, PARALLEL, etc.) is
   being set using the CODE (which may be SET, CLOBBER, or COND_EXEC).
   If the set is conditional (because it appears in a COND_EXEC), COND
   will be the condition.  */

static void
mark_set_1 (struct propagate_block_info *pbi, enum rtx_code code, rtx reg, rtx cond, rtx insn, int flags)
{
  int regno_first = -1, regno_last = -1;
  unsigned long not_dead = 0;
  int i;

  /* Modifying just one hardware register of a multi-reg value or just a
     byte field of a register does not mean the value from before this insn
     is now dead.  Of course, if it was dead after this insn, it's unused
     now.  */

  switch (GET_CODE (reg))
    {
    case PARALLEL:
      /* Some targets place small structures in registers for return values of
         functions.  We have to detect this case specially here to get correct
         flow information.  */
      for (i = XVECLEN (reg, 0) - 1; i >= 0; i--)
        if (XEXP (XVECEXP (reg, 0, i), 0) != 0)
          mark_set_1 (pbi, code, XEXP (XVECEXP (reg, 0, i), 0), cond, insn,
                      flags);
      return;

    case SIGN_EXTRACT:
      /* SIGN_EXTRACT cannot be an lvalue.  */
      gcc_unreachable ();

    case ZERO_EXTRACT:
    case STRICT_LOW_PART:
      /* ??? Assumes STRICT_LOW_PART not used on multi-word registers.  */
      do
        reg = XEXP (reg, 0);
      while (GET_CODE (reg) == SUBREG
             || GET_CODE (reg) == ZERO_EXTRACT
             || GET_CODE (reg) == STRICT_LOW_PART);
      if (MEM_P (reg))
        break;
      not_dead = (unsigned long) REGNO_REG_SET_P (pbi->reg_live, REGNO (reg));
      /* Fall through.  */

    case REG:
      regno_last = regno_first = REGNO (reg);
      if (regno_first < FIRST_PSEUDO_REGISTER)
        regno_last += hard_regno_nregs[regno_first][GET_MODE (reg)] - 1;
      break;

    case SUBREG:
      if (REG_P (SUBREG_REG (reg)))
        {
          enum machine_mode outer_mode = GET_MODE (reg);
          enum machine_mode inner_mode = GET_MODE (SUBREG_REG (reg));

          /* Identify the range of registers affected.  This is moderately
             tricky for hard registers.  See alter_subreg.  */

          regno_last = regno_first = REGNO (SUBREG_REG (reg));
          if (regno_first < FIRST_PSEUDO_REGISTER)
            {
              regno_first += subreg_regno_offset (regno_first, inner_mode,
                                                  SUBREG_BYTE (reg),
                                                  outer_mode);
              regno_last = (regno_first
                            + hard_regno_nregs[regno_first][outer_mode] - 1);

              /* Since we've just adjusted the register number ranges, make
                 sure REG matches.  Otherwise some_was_live will be clear
                 when it shouldn't have been, and we'll create incorrect
                 REG_UNUSED notes.  */
              reg = gen_rtx_REG (outer_mode, regno_first);
            }
          else
            {
              /* If the number of words in the subreg is less than the number
                 of words in the full register, we have a well-defined partial
                 set.  Otherwise the high bits are undefined.

                 This is only really applicable to pseudos, since we just took
                 care of multi-word hard registers.  */
              if (((GET_MODE_SIZE (outer_mode)
                    + UNITS_PER_WORD - 1) / UNITS_PER_WORD)
                  < ((GET_MODE_SIZE (inner_mode)
                      + UNITS_PER_WORD - 1) / UNITS_PER_WORD))
                not_dead = (unsigned long) REGNO_REG_SET_P (pbi->reg_live,
                                                            regno_first);

              reg = SUBREG_REG (reg);
            }
        }
      else
        reg = SUBREG_REG (reg);
      break;

    default:
      break;
    }

  /* If this set is a MEM, then it kills any aliased writes and any
     other MEMs which use it.
     If this set is a REG, then it kills any MEMs which use the reg.  */
  if (optimize && (flags & PROP_SCAN_DEAD_STORES))
    {
      if (REG_P (reg) || MEM_P (reg))
        invalidate_mems_from_set (pbi, reg);

      /* If the memory reference had embedded side effects (autoincrement
         address modes) then we may need to kill some entries on the
         memory set list.  */
      if (insn && MEM_P (reg))
        for_each_rtx (&PATTERN (insn), invalidate_mems_from_autoinc, pbi);

      if (MEM_P (reg) && ! side_effects_p (reg)
          /* ??? With more effort we could track conditional memory life.  */
          && ! cond)
        add_to_mem_set_list (pbi, canon_rtx (reg));
    }

  if (REG_P (reg)
      && ! (regno_first == FRAME_POINTER_REGNUM
            && (! reload_completed || frame_pointer_needed))
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
      && ! (regno_first == HARD_FRAME_POINTER_REGNUM
            && (! reload_completed || frame_pointer_needed))
#endif
#if FRAME_POINTER_REGNUM != ARG_POINTER_REGNUM
      && ! (regno_first == ARG_POINTER_REGNUM && fixed_regs[regno_first])
#endif
      )
    {
      int some_was_live = 0, some_was_dead = 0;

      for (i = regno_first; i <= regno_last; ++i)
        {
          int needed_regno = REGNO_REG_SET_P (pbi->reg_live, i);
          if (pbi->local_set)
            {
              /* Order of the set operation matters here since both
                 sets may be the same.  */
              CLEAR_REGNO_REG_SET (pbi->cond_local_set, i);
              if (cond != NULL_RTX
                  && ! REGNO_REG_SET_P (pbi->local_set, i))
                SET_REGNO_REG_SET (pbi->cond_local_set, i);
              else
                SET_REGNO_REG_SET (pbi->local_set, i);
            }
          if (code != CLOBBER || needed_regno)
            SET_REGNO_REG_SET (pbi->new_set, i);

          some_was_live |= needed_regno;
          some_was_dead |= ! needed_regno;
        }

#ifdef HAVE_conditional_execution
      /* Consider conditional death in deciding that the register needs
         a death note.  */
      if (some_was_live && ! not_dead
          /* The stack pointer is never dead.  Well, not strictly true,
             but it's very difficult to tell from here.  Hopefully
             combine_stack_adjustments will fix up the most egregious
             errors.  */
          && regno_first != STACK_POINTER_REGNUM)
        {
          for (i = regno_first; i <= regno_last; ++i)
            if (! mark_regno_cond_dead (pbi, i, cond))
              not_dead |= ((unsigned long) 1) << (i - regno_first);
        }
#endif

      /* Additional data to record if this is the final pass.  */
      if (flags & (PROP_LOG_LINKS | PROP_REG_INFO
                   | PROP_DEATH_NOTES | PROP_AUTOINC))
        {
          rtx y;
          int blocknum = pbi->bb->index;

          y = NULL_RTX;
          if (flags & (PROP_LOG_LINKS | PROP_AUTOINC))
            {
              y = pbi->reg_next_use[regno_first];

              /* The next use is no longer next, since a store intervenes.  */
              for (i = regno_first; i <= regno_last; ++i)
                pbi->reg_next_use[i] = 0;
            }

          if (flags & PROP_REG_INFO)
            {
              for (i = regno_first; i <= regno_last; ++i)
                {
                  /* Count (weighted) references, stores, etc.  This counts a
                     register twice if it is modified, but that is correct.  */
                  REG_N_SETS (i) += 1;
                  REG_N_REFS (i) += 1;
                  REG_FREQ (i) += REG_FREQ_FROM_BB (pbi->bb);

                  /* The insns where a reg is live are normally counted
                     elsewhere, but we want the count to include the insn
                     where the reg is set, and the normal counting mechanism
                     would not count it.  */
                  REG_LIVE_LENGTH (i) += 1;
                }

              /* If this is a hard reg, record that this function uses it.  */
              if (regno_first < FIRST_PSEUDO_REGISTER)
                {
                  for (i = regno_first; i <= regno_last; i++)
                    regs_ever_live[i] = 1;
                  if (flags & PROP_ASM_SCAN)
                    for (i = regno_first; i <= regno_last; i++)
                      regs_asm_clobbered[i] = 1;
                }
              else
                {
                  /* Keep track of which basic blocks each reg appears in.  */
                  if (REG_BASIC_BLOCK (regno_first) == REG_BLOCK_UNKNOWN)
                    REG_BASIC_BLOCK (regno_first) = blocknum;
                  else if (REG_BASIC_BLOCK (regno_first) != blocknum)
                    REG_BASIC_BLOCK (regno_first) = REG_BLOCK_GLOBAL;
                }
            }

          if (! some_was_dead)
            {
              if (flags & PROP_LOG_LINKS)
                {
                  /* Make a logical link from the next following insn
                     that uses this register, back to this insn.
                     The following insns have already been processed.

                     We don't build a LOG_LINK for hard registers contained
                     in ASM_OPERANDs.  If these registers get replaced,
                     we might wind up changing the semantics of the insn,
                     even if reload can make what appear to be valid
                     assignments later.

                     We don't build a LOG_LINK for global registers to
                     or from a function call.  We don't want to let
                     combine think that it knows what is going on with
                     global registers.  */
                  if (y && (BLOCK_NUM (y) == blocknum)
                      && (regno_first >= FIRST_PSEUDO_REGISTER
                          || (asm_noperands (PATTERN (y)) < 0
                              && ! ((CALL_P (insn)
                                     || CALL_P (y))
                                    && global_regs[regno_first]))))
                    LOG_LINKS (y) = alloc_INSN_LIST (insn, LOG_LINKS (y));
                }
            }
          else if (not_dead)
            ;
          else if (! some_was_live)
            {
              if (flags & PROP_REG_INFO)
                REG_N_DEATHS (regno_first) += 1;

              if (flags & PROP_DEATH_NOTES)
                {
                  /* Note that dead stores have already been deleted
                     when possible.  If we get here, we have found a
                     dead store that cannot be eliminated (because the
                     same insn does something useful).  Indicate this
                     by marking the reg being set as dying here.  */
                  REG_NOTES (insn)
                    = alloc_EXPR_LIST (REG_UNUSED, reg, REG_NOTES (insn));
                }
            }
          else
            {
              if (flags & PROP_DEATH_NOTES)
                {
                  /* This is a case where we have a multi-word hard register
                     and some, but not all, of the words of the register are
                     needed in subsequent insns.  Write REG_UNUSED notes
                     for those parts that were not needed.  This case should
                     be rare.  */

                  for (i = regno_first; i <= regno_last; ++i)
                    if (! REGNO_REG_SET_P (pbi->reg_live, i))
                      REG_NOTES (insn)
                        = alloc_EXPR_LIST (REG_UNUSED,
                                           regno_reg_rtx[i],
                                           REG_NOTES (insn));
                }
            }
        }

      /* Mark the register as being dead.  */
      if (some_was_live
          /* The stack pointer is never dead.  Well, not strictly true,
             but it's very difficult to tell from here.  Hopefully
             combine_stack_adjustments will fix up the most egregious
             errors.  */
          && regno_first != STACK_POINTER_REGNUM)
        {
          for (i = regno_first; i <= regno_last; ++i)
            if (!(not_dead & (((unsigned long) 1) << (i - regno_first))))
              {
                if ((pbi->flags & PROP_REG_INFO)
                    && REGNO_REG_SET_P (pbi->reg_live, i))
                  {
                    REG_LIVE_LENGTH (i) += pbi->insn_num - reg_deaths[i];
                    reg_deaths[i] = 0;
                  }
                CLEAR_REGNO_REG_SET (pbi->reg_live, i);
              }
          if (flags & PROP_DEAD_INSN)
            emit_insn_after (gen_rtx_CLOBBER (VOIDmode, reg), insn);
        }
    }
  else if (REG_P (reg))
    {
      if (flags & (PROP_LOG_LINKS | PROP_AUTOINC))
        pbi->reg_next_use[regno_first] = 0;

      if ((flags & PROP_REG_INFO) != 0
          && (flags & PROP_ASM_SCAN) != 0
          && regno_first < FIRST_PSEUDO_REGISTER)
        {
          for (i = regno_first; i <= regno_last; i++)
            regs_asm_clobbered[i] = 1;
        }
    }

  /* If this is the last pass and this is a SCRATCH, show it will be dying
     here and count it.  */
  else if (GET_CODE (reg) == SCRATCH)
    {
      if (flags & PROP_DEATH_NOTES)
        REG_NOTES (insn)
          = alloc_EXPR_LIST (REG_UNUSED, reg, REG_NOTES (insn));
    }
}

#ifdef HAVE_conditional_execution
/* Mark REGNO conditionally dead.
   Return true if the register is now unconditionally dead.  */

static int
mark_regno_cond_dead (struct propagate_block_info *pbi, int regno, rtx cond)
{
  /* If this is a store to a predicate register, the value of the
     predicate is changing; we don't know that the predicate as seen
     before is the same as that seen after.  Flush all dependent
     conditions from reg_cond_dead.  This will make all such
     conditionally live registers unconditionally live.  */
  if (REGNO_REG_SET_P (pbi->reg_cond_reg, regno))
    flush_reg_cond_reg (pbi, regno);

  /* If this is an unconditional store, remove any conditional
     life that may have existed.  */
  if (cond == NULL_RTX)
    splay_tree_remove (pbi->reg_cond_dead, regno);
  else
    {
      splay_tree_node node;
      struct reg_cond_life_info *rcli;
      rtx ncond;

      /* Otherwise this is a conditional set.  Record that fact.
         It may have been conditionally used, or there may be a
         subsequent set with a complementary condition.  */

      node = splay_tree_lookup (pbi->reg_cond_dead, regno);
      if (node == NULL)
        {
          /* The register was unconditionally live previously.
             Record the current condition as the condition under
             which it is dead.  */
          rcli = xmalloc (sizeof (*rcli));
          rcli->condition = cond;
          rcli->stores = cond;
          rcli->orig_condition = const0_rtx;
          splay_tree_insert (pbi->reg_cond_dead, regno,
                             (splay_tree_value) rcli);

          SET_REGNO_REG_SET (pbi->reg_cond_reg, REGNO (XEXP (cond, 0)));

          /* Not unconditionally dead.  */
          return 0;
        }
      else
        {
          /* The register was conditionally live previously.
             Add the new condition to the old.  */
          rcli = (struct reg_cond_life_info *) node->value;
          ncond = rcli->condition;
          ncond = ior_reg_cond (ncond, cond, 1);
          if (rcli->stores == const0_rtx)
            rcli->stores = cond;
          else if (rcli->stores != const1_rtx)
            rcli->stores = ior_reg_cond (rcli->stores, cond, 1);

          /* If the register is now unconditionally dead, remove the entry
             in the splay_tree.  A register is unconditionally dead if the
             dead condition ncond is true.  A register is also unconditionally
             dead if the sum of all conditional stores is an unconditional
             store (stores is true), and the dead condition is identically the
             same as the original dead condition initialized at the end of
             the block.  This is a pointer compare, not an rtx_equal_p
             compare.  */
          if (ncond == const1_rtx
              || (ncond == rcli->orig_condition && rcli->stores == const1_rtx))
            splay_tree_remove (pbi->reg_cond_dead, regno);
          else
            {
              rcli->condition = ncond;

              SET_REGNO_REG_SET (pbi->reg_cond_reg, REGNO (XEXP (cond, 0)));

              /* Not unconditionally dead.  */
              return 0;
            }
        }
    }

  return 1;
}

/* Called from splay_tree_delete for pbi->reg_cond_life.  */

static void
free_reg_cond_life_info (splay_tree_value value)
{
  struct reg_cond_life_info *rcli = (struct reg_cond_life_info *) value;
  free (rcli);
}

/* Helper function for flush_reg_cond_reg.  */

static int
flush_reg_cond_reg_1 (splay_tree_node node, void *data)
{
  struct reg_cond_life_info *rcli;
  int *xdata = (int *) data;
  unsigned int regno = xdata[0];

  /* Don't need to search if last flushed value was farther on in
     the in-order traversal.  */
  if (xdata[1] >= (int) node->key)
    return 0;

  /* Splice out portions of the expression that refer to regno.  */
  rcli = (struct reg_cond_life_info *) node->value;
  rcli->condition = elim_reg_cond (rcli->condition, regno);
  if (rcli->stores != const0_rtx && rcli->stores != const1_rtx)
    rcli->stores = elim_reg_cond (rcli->stores, regno);

  /* If the entire condition is now false, signal the node to be removed.  */
  if (rcli->condition == const0_rtx)
    {
      xdata[1] = node->key;
      return -1;
    }
  else
    gcc_assert (rcli->condition != const1_rtx);

  return 0;
}

/* Flush all (sub) expressions referring to REGNO from REG_COND_LIVE.  */

static void
flush_reg_cond_reg (struct propagate_block_info *pbi, int regno)
{
  int pair[2];

  pair[0] = regno;
  pair[1] = -1;
  while (splay_tree_foreach (pbi->reg_cond_dead,
                             flush_reg_cond_reg_1, pair) == -1)
    splay_tree_remove (pbi->reg_cond_dead, pair[1]);

  CLEAR_REGNO_REG_SET (pbi->reg_cond_reg, regno);
}

/* Logical arithmetic on predicate conditions.  IOR, NOT and AND.
   For ior/and, the ADD flag determines whether we want to add the new
   condition X to the old one unconditionally.  If it is zero, we will
   only return a new expression if X allows us to simplify part of
   OLD, otherwise we return NULL to the caller.
   If ADD is nonzero, we will return a new condition in all cases.  The
   toplevel caller of one of these functions should always pass 1 for
   ADD.  */

static rtx
ior_reg_cond (rtx old, rtx x, int add)
{
  rtx op0, op1;

  if (COMPARISON_P (old))
    {
      if (COMPARISON_P (x)
          && REVERSE_CONDEXEC_PREDICATES_P (x, old)
          && REGNO (XEXP (x, 0)) == REGNO (XEXP (old, 0)))
        return const1_rtx;
      if (GET_CODE (x) == GET_CODE (old)
          && REGNO (XEXP (x, 0)) == REGNO (XEXP (old, 0)))
        return old;
      if (! add)
        return NULL;
      return gen_rtx_IOR (0, old, x);
    }

  switch (GET_CODE (old))
    {
    case IOR:
      op0 = ior_reg_cond (XEXP (old, 0), x, 0);
      op1 = ior_reg_cond (XEXP (old, 1), x, 0);
      if (op0 != NULL || op1 != NULL)
        {
          if (op0 == const0_rtx)
            return op1 ? op1 : gen_rtx_IOR (0, XEXP (old, 1), x);
          if (op1 == const0_rtx)
            return op0 ? op0 : gen_rtx_IOR (0, XEXP (old, 0), x);
          if (op0 == const1_rtx || op1 == const1_rtx)
            return const1_rtx;
          if (op0 == NULL)
            op0 = gen_rtx_IOR (0, XEXP (old, 0), x);
          else if (rtx_equal_p (x, op0))
            /* (x | A) | x ~ (x | A).  */
            return old;
          if (op1 == NULL)
            op1 = gen_rtx_IOR (0, XEXP (old, 1), x);
          else if (rtx_equal_p (x, op1))
            /* (A | x) | x ~ (A | x).  */
            return old;
          return gen_rtx_IOR (0, op0, op1);
        }
      if (! add)
        return NULL;
      return gen_rtx_IOR (0, old, x);

    case AND:
      op0 = ior_reg_cond (XEXP (old, 0), x, 0);
      op1 = ior_reg_cond (XEXP (old, 1), x, 0);
      if (op0 != NULL || op1 != NULL)
        {
          if (op0 == const1_rtx)
            return op1 ? op1 : gen_rtx_IOR (0, XEXP (old, 1), x);
          if (op1 == const1_rtx)
            return op0 ? op0 : gen_rtx_IOR (0, XEXP (old, 0), x);
          if (op0 == const0_rtx || op1 == const0_rtx)
            return const0_rtx;
          if (op0 == NULL)
            op0 = gen_rtx_IOR (0, XEXP (old, 0), x);
          else if (rtx_equal_p (x, op0))
            /* (x & A) | x ~ x.  */
            return op0;
          if (op1 == NULL)
            op1 = gen_rtx_IOR (0, XEXP (old, 1), x);
          else if (rtx_equal_p (x, op1))
            /* (A & x) | x ~ x.  */
            return op1;
          return gen_rtx_AND (0, op0, op1);
        }
      if (! add)
        return NULL;
      return gen_rtx_IOR (0, old, x);

    case NOT:
      op0 = and_reg_cond (XEXP (old, 0), not_reg_cond (x), 0);
      if (op0 != NULL)
        return not_reg_cond (op0);
      if (! add)
        return NULL;
      return gen_rtx_IOR (0, old, x);

    default:
      gcc_unreachable ();
    }
}

static rtx
3247
not_reg_cond (rtx x)
3248
{
3249
  if (x == const0_rtx)
3250
    return const1_rtx;
3251
  else if (x == const1_rtx)
3252
    return const0_rtx;
3253
  if (GET_CODE (x) == NOT)
3254
    return XEXP (x, 0);
3255
  if (COMPARISON_P (x)
3256
      && REG_P (XEXP (x, 0)))
3257
    {
3258
      gcc_assert (XEXP (x, 1) == const0_rtx);
3259
 
3260
      return gen_rtx_fmt_ee (reversed_comparison_code (x, NULL),
3261
                             VOIDmode, XEXP (x, 0), const0_rtx);
3262
    }
3263
  return gen_rtx_NOT (0, x);
3264
}
3265
 
3266
static rtx
3267
and_reg_cond (rtx old, rtx x, int add)
3268
{
3269
  rtx op0, op1;
3270
 
3271
  if (COMPARISON_P (old))
3272
    {
3273
      if (COMPARISON_P (x)
3274
          && GET_CODE (x) == reversed_comparison_code (old, NULL)
3275
          && REGNO (XEXP (x, 0)) == REGNO (XEXP (old, 0)))
3276
        return const0_rtx;
3277
      if (GET_CODE (x) == GET_CODE (old)
3278
          && REGNO (XEXP (x, 0)) == REGNO (XEXP (old, 0)))
3279
        return old;
3280
      if (! add)
3281
        return NULL;
3282
      return gen_rtx_AND (0, old, x);
3283
    }
3284
 
3285
  switch (GET_CODE (old))
3286
    {
3287
    case IOR:
3288
      op0 = and_reg_cond (XEXP (old, 0), x, 0);
3289
      op1 = and_reg_cond (XEXP (old, 1), x, 0);
3290
      if (op0 != NULL || op1 != NULL)
3291
        {
3292
          if (op0 == const0_rtx)
3293
            return op1 ? op1 : gen_rtx_AND (0, XEXP (old, 1), x);
3294
          if (op1 == const0_rtx)
3295
            return op0 ? op0 : gen_rtx_AND (0, XEXP (old, 0), x);
3296
          if (op0 == const1_rtx || op1 == const1_rtx)
3297
            return const1_rtx;
3298
          if (op0 == NULL)
3299
            op0 = gen_rtx_AND (0, XEXP (old, 0), x);
3300
          else if (rtx_equal_p (x, op0))
3301
            /* (x | A) & x ~ x.  */
3302
            return op0;
3303
          if (op1 == NULL)
3304
            op1 = gen_rtx_AND (0, XEXP (old, 1), x);
3305
          else if (rtx_equal_p (x, op1))
3306
            /* (A | x) & x ~ x.  */
3307
            return op1;
3308
          return gen_rtx_IOR (0, op0, op1);
3309
        }
3310
      if (! add)
3311
        return NULL;
3312
      return gen_rtx_AND (0, old, x);
3313
 
3314
    case AND:
3315
      op0 = and_reg_cond (XEXP (old, 0), x, 0);
3316
      op1 = and_reg_cond (XEXP (old, 1), x, 0);
3317
      if (op0 != NULL || op1 != NULL)
3318
        {
3319
          if (op0 == const1_rtx)
3320
            return op1 ? op1 : gen_rtx_AND (0, XEXP (old, 1), x);
3321
          if (op1 == const1_rtx)
3322
            return op0 ? op0 : gen_rtx_AND (0, XEXP (old, 0), x);
3323
          if (op0 == const0_rtx || op1 == const0_rtx)
3324
            return const0_rtx;
3325
          if (op0 == NULL)
3326
            op0 = gen_rtx_AND (0, XEXP (old, 0), x);
3327
          else if (rtx_equal_p (x, op0))
3328
            /* (x & A) & x ~ (x & A).  */
3329
            return old;
3330
          if (op1 == NULL)
3331
            op1 = gen_rtx_AND (0, XEXP (old, 1), x);
3332
          else if (rtx_equal_p (x, op1))
3333
            /* (A & x) & x ~ (A & x).  */
3334
            return old;
3335
          return gen_rtx_AND (0, op0, op1);
3336
        }
3337
      if (! add)
3338
        return NULL;
3339
      return gen_rtx_AND (0, old, x);
3340
 
3341
    case NOT:
3342
      op0 = ior_reg_cond (XEXP (old, 0), not_reg_cond (x), 0);
3343
      if (op0 != NULL)
3344
        return not_reg_cond (op0);
3345
      if (! add)
3346
        return NULL;
3347
      return gen_rtx_AND (0, old, x);
3348
 
3349
    default:
3350
      gcc_unreachable ();
3351
    }
3352
}

/* Given a condition X, remove references to reg REGNO and return the
   new condition.  The removal will be done so that all conditions
   involving REGNO are considered to evaluate to false.  This function
   is used when the value of REGNO changes.  */

static rtx
elim_reg_cond (rtx x, unsigned int regno)
{
  rtx op0, op1;

  if (COMPARISON_P (x))
    {
      if (REGNO (XEXP (x, 0)) == regno)
        return const0_rtx;
      return x;
    }

  switch (GET_CODE (x))
    {
    case AND:
      op0 = elim_reg_cond (XEXP (x, 0), regno);
      op1 = elim_reg_cond (XEXP (x, 1), regno);
      if (op0 == const0_rtx || op1 == const0_rtx)
        return const0_rtx;
      if (op0 == const1_rtx)
        return op1;
      if (op1 == const1_rtx)
        return op0;
      if (op0 == XEXP (x, 0) && op1 == XEXP (x, 1))
        return x;
      return gen_rtx_AND (0, op0, op1);

    case IOR:
      op0 = elim_reg_cond (XEXP (x, 0), regno);
      op1 = elim_reg_cond (XEXP (x, 1), regno);
      if (op0 == const1_rtx || op1 == const1_rtx)
        return const1_rtx;
      if (op0 == const0_rtx)
        return op1;
      if (op1 == const0_rtx)
        return op0;
      if (op0 == XEXP (x, 0) && op1 == XEXP (x, 1))
        return x;
      return gen_rtx_IOR (0, op0, op1);

    case NOT:
      op0 = elim_reg_cond (XEXP (x, 0), regno);
      if (op0 == const0_rtx)
        return const1_rtx;
      if (op0 == const1_rtx)
        return const0_rtx;
      if (op0 != XEXP (x, 0))
        return not_reg_cond (op0);
      return x;

    default:
      gcc_unreachable ();
    }
}
#endif /* HAVE_conditional_execution */

#ifdef AUTO_INC_DEC

/* Try to substitute the auto-inc expression INC as the address inside
   MEM which occurs in INSN.  Currently, the address of MEM is an expression
   involving INCR_REG, and INCR is the next use of INCR_REG; it is an insn
   that has a single set whose source is a PLUS of INCR_REG and something
   else.  */

static void
attempt_auto_inc (struct propagate_block_info *pbi, rtx inc, rtx insn,
                  rtx mem, rtx incr, rtx incr_reg)
{
  int regno = REGNO (incr_reg);
  rtx set = single_set (incr);
  rtx q = SET_DEST (set);
  rtx y = SET_SRC (set);
  int opnum = XEXP (y, 0) == incr_reg ? 0 : 1;
  int changed;

  /* Make sure this reg appears only once in this insn.  */
  if (count_occurrences (PATTERN (insn), incr_reg, 1) != 1)
    return;

  if (dead_or_set_p (incr, incr_reg)
      /* Mustn't autoinc an eliminable register.  */
      && (regno >= FIRST_PSEUDO_REGISTER
          || ! TEST_HARD_REG_BIT (elim_reg_set, regno)))
    {
      /* This is the simple case.  Try to make the auto-inc.  If
         we can't, we are done.  Otherwise, we will do any
         needed updates below.  */
      if (! validate_change (insn, &XEXP (mem, 0), inc, 0))
        return;
    }
  else if (REG_P (q)
           /* PREV_INSN used here to check the semi-open interval
              [insn,incr).  */
           && ! reg_used_between_p (q,  PREV_INSN (insn), incr)
           /* We must also check for sets of q as q may be
              a call clobbered hard register and there may
              be a call between PREV_INSN (insn) and incr.  */
           && ! reg_set_between_p (q,  PREV_INSN (insn), incr))
    {
      /* We have *p followed sometime later by q = p+size.
         Both p and q must be live afterward,
         and q is not used between INSN and its assignment.
         Change it to q = p, ...*q..., q = q+size.
         Then fall into the usual case.  */
      rtx insns, temp;

      start_sequence ();
      emit_move_insn (q, incr_reg);
      insns = get_insns ();
      end_sequence ();

      /* If we can't make the auto-inc, or can't make the
         replacement into Y, exit.  There's no point in making
         the change below if we can't do the auto-inc and doing
         so is not correct in the pre-inc case.  */

      XEXP (inc, 0) = q;
      validate_change (insn, &XEXP (mem, 0), inc, 1);
      validate_change (incr, &XEXP (y, opnum), q, 1);
      if (! apply_change_group ())
        return;

      /* We now know we'll be doing this change, so emit the
         new insn(s) and do the updates.  */
      emit_insn_before (insns, insn);

      if (BB_HEAD (pbi->bb) == insn)
        BB_HEAD (pbi->bb) = insns;

      /* INCR will become a NOTE and INSN won't contain a
         use of INCR_REG.  If a use of INCR_REG was just placed in
         the insn before INSN, make that the next use.
         Otherwise, invalidate it.  */
      if (NONJUMP_INSN_P (PREV_INSN (insn))
          && GET_CODE (PATTERN (PREV_INSN (insn))) == SET
          && SET_SRC (PATTERN (PREV_INSN (insn))) == incr_reg)
        pbi->reg_next_use[regno] = PREV_INSN (insn);
      else
        pbi->reg_next_use[regno] = 0;

      incr_reg = q;
      regno = REGNO (q);

      if ((pbi->flags & PROP_REG_INFO)
          && !REGNO_REG_SET_P (pbi->reg_live, regno))
        reg_deaths[regno] = pbi->insn_num;

      /* REGNO is now used in INCR which is below INSN, but
         it previously wasn't live here.  If we don't mark
         it as live, we'll put a REG_DEAD note for it
         on this insn, which is incorrect.  */
      SET_REGNO_REG_SET (pbi->reg_live, regno);

      /* If there are any calls between INSN and INCR, show
         that REGNO now crosses them.  */
      for (temp = insn; temp != incr; temp = NEXT_INSN (temp))
        if (CALL_P (temp))
          {
            REG_N_CALLS_CROSSED (regno)++;
            if (can_throw_internal (temp))
              REG_N_THROWING_CALLS_CROSSED (regno)++;
          }

      /* Invalidate alias info for Q since we just changed its value.  */
      clear_reg_alias_info (q);
    }
  else
    return;

  /* If we haven't returned, it means we were able to make the
     auto-inc, so update the status.  First, record that this insn
     has an implicit side effect.  */

  REG_NOTES (insn) = alloc_EXPR_LIST (REG_INC, incr_reg, REG_NOTES (insn));

  /* Modify the old increment-insn to simply copy
     the already-incremented value of our register.  */
  changed = validate_change (incr, &SET_SRC (set), incr_reg, 0);
  gcc_assert (changed);

  /* If that makes it a no-op (copying the register into itself) delete
     it so it won't appear to be a "use" and a "set" of this
     register.  */
  if (REGNO (SET_DEST (set)) == REGNO (incr_reg))
    {
      /* If the original source was dead, it's dead now.  */
      rtx note;

      while ((note = find_reg_note (incr, REG_DEAD, NULL_RTX)) != NULL_RTX)
        {
          remove_note (incr, note);
          if (XEXP (note, 0) != incr_reg)
            {
              unsigned int regno = REGNO (XEXP (note, 0));

              if ((pbi->flags & PROP_REG_INFO)
                  && REGNO_REG_SET_P (pbi->reg_live, regno))
                {
                  REG_LIVE_LENGTH (regno) += pbi->insn_num - reg_deaths[regno];
                  reg_deaths[regno] = 0;
                }
              CLEAR_REGNO_REG_SET (pbi->reg_live, REGNO (XEXP (note, 0)));
            }
        }

      SET_INSN_DELETED (incr);
    }

  if (regno >= FIRST_PSEUDO_REGISTER)
    {
      /* Count an extra reference to the reg.  When a reg is
         incremented, spilling it is worse, so we want to make
         that less likely.  */
      REG_FREQ (regno) += REG_FREQ_FROM_BB (pbi->bb);

      /* Count the increment as a setting of the register,
         even though it isn't a SET in rtl.  */
      REG_N_SETS (regno)++;
    }
}

/* X is a MEM found in INSN.  See if we can convert it into an auto-increment
   reference.  */

static void
find_auto_inc (struct propagate_block_info *pbi, rtx x, rtx insn)
{
  rtx addr = XEXP (x, 0);
  HOST_WIDE_INT offset = 0;
  rtx set, y, incr, inc_val;
  int regno;
  int size = GET_MODE_SIZE (GET_MODE (x));

  if (JUMP_P (insn))
    return;

  /* Here we detect use of an index register which might be good for
     postincrement, postdecrement, preincrement, or predecrement.  */

  if (GET_CODE (addr) == PLUS && GET_CODE (XEXP (addr, 1)) == CONST_INT)
    offset = INTVAL (XEXP (addr, 1)), addr = XEXP (addr, 0);

  if (!REG_P (addr))
    return;

  regno = REGNO (addr);

  /* Is the next use an increment that might make auto-increment? */
  incr = pbi->reg_next_use[regno];
  if (incr == 0 || BLOCK_NUM (incr) != BLOCK_NUM (insn))
    return;
  set = single_set (incr);
  if (set == 0 || GET_CODE (set) != SET)
    return;
  y = SET_SRC (set);

  if (GET_CODE (y) != PLUS)
    return;

  if (REG_P (XEXP (y, 0)) && REGNO (XEXP (y, 0)) == REGNO (addr))
    inc_val = XEXP (y, 1);
  else if (REG_P (XEXP (y, 1)) && REGNO (XEXP (y, 1)) == REGNO (addr))
    inc_val = XEXP (y, 0);
  else
    return;

  if (GET_CODE (inc_val) == CONST_INT)
    {
      if (HAVE_POST_INCREMENT
          && (INTVAL (inc_val) == size && offset == 0))
        attempt_auto_inc (pbi, gen_rtx_POST_INC (Pmode, addr), insn, x,
                          incr, addr);
      else if (HAVE_POST_DECREMENT
               && (INTVAL (inc_val) == -size && offset == 0))
        attempt_auto_inc (pbi, gen_rtx_POST_DEC (Pmode, addr), insn, x,
                          incr, addr);
      else if (HAVE_PRE_INCREMENT
               && (INTVAL (inc_val) == size && offset == size))
        attempt_auto_inc (pbi, gen_rtx_PRE_INC (Pmode, addr), insn, x,
                          incr, addr);
      else if (HAVE_PRE_DECREMENT
               && (INTVAL (inc_val) == -size && offset == -size))
        attempt_auto_inc (pbi, gen_rtx_PRE_DEC (Pmode, addr), insn, x,
                          incr, addr);
      else if (HAVE_POST_MODIFY_DISP && offset == 0)
        attempt_auto_inc (pbi, gen_rtx_POST_MODIFY (Pmode, addr,
                                                    gen_rtx_PLUS (Pmode,
                                                                  addr,
                                                                  inc_val)),
                          insn, x, incr, addr);
      else if (HAVE_PRE_MODIFY_DISP && offset == INTVAL (inc_val))
        attempt_auto_inc (pbi, gen_rtx_PRE_MODIFY (Pmode, addr,
                                                    gen_rtx_PLUS (Pmode,
                                                                  addr,
                                                                  inc_val)),
                          insn, x, incr, addr);
    }
  else if (REG_P (inc_val)
           && ! reg_set_between_p (inc_val, PREV_INSN (insn),
                                   NEXT_INSN (incr)))
    {
      if (HAVE_POST_MODIFY_REG && offset == 0)
        attempt_auto_inc (pbi, gen_rtx_POST_MODIFY (Pmode, addr,
                                                    gen_rtx_PLUS (Pmode,
                                                                  addr,
                                                                  inc_val)),
                          insn, x, incr, addr);
    }
}

#endif /* AUTO_INC_DEC */

static void
mark_used_reg (struct propagate_block_info *pbi, rtx reg,
               rtx cond ATTRIBUTE_UNUSED, rtx insn)
{
  unsigned int regno_first, regno_last, i;
  int some_was_live, some_was_dead, some_not_set;

  regno_last = regno_first = REGNO (reg);
  if (regno_first < FIRST_PSEUDO_REGISTER)
    regno_last += hard_regno_nregs[regno_first][GET_MODE (reg)] - 1;

  /* Find out if any of this register is live after this instruction.  */
  some_was_live = some_was_dead = 0;
  for (i = regno_first; i <= regno_last; ++i)
    {
      int needed_regno = REGNO_REG_SET_P (pbi->reg_live, i);
      some_was_live |= needed_regno;
      some_was_dead |= ! needed_regno;
    }

  /* Find out if any of the register was set this insn.  */
  some_not_set = 0;
  for (i = regno_first; i <= regno_last; ++i)
    some_not_set |= ! REGNO_REG_SET_P (pbi->new_set, i);

  if (pbi->flags & (PROP_LOG_LINKS | PROP_AUTOINC))
    {
      /* Record where each reg is used, so when the reg is set we know
         the next insn that uses it.  */
      pbi->reg_next_use[regno_first] = insn;
    }

  if (pbi->flags & PROP_REG_INFO)
    {
      if (regno_first < FIRST_PSEUDO_REGISTER)
        {
          /* If this is a register we are going to try to eliminate,
             don't mark it live here.  If we are successful in
             eliminating it, it need not be live unless it is used for
             pseudos, in which case it will have been set live when it
             was allocated to the pseudos.  If the register will not
             be eliminated, reload will set it live at that point.

             Otherwise, record that this function uses this register.  */
          /* ??? The PPC backend tries to "eliminate" on the pic
             register to itself.  This should be fixed.  In the mean
             time, hack around it.  */

          if (! (TEST_HARD_REG_BIT (elim_reg_set, regno_first)
                 && (regno_first == FRAME_POINTER_REGNUM
                     || regno_first == ARG_POINTER_REGNUM)))
            for (i = regno_first; i <= regno_last; ++i)
              regs_ever_live[i] = 1;
        }
      else
        {
          /* Keep track of which basic block each reg appears in.  */

          int blocknum = pbi->bb->index;
          if (REG_BASIC_BLOCK (regno_first) == REG_BLOCK_UNKNOWN)
            REG_BASIC_BLOCK (regno_first) = blocknum;
          else if (REG_BASIC_BLOCK (regno_first) != blocknum)
            REG_BASIC_BLOCK (regno_first) = REG_BLOCK_GLOBAL;

          /* Count (weighted) number of uses of each reg.  */
          REG_FREQ (regno_first) += REG_FREQ_FROM_BB (pbi->bb);
          REG_N_REFS (regno_first)++;
        }
      for (i = regno_first; i <= regno_last; ++i)
        if (! REGNO_REG_SET_P (pbi->reg_live, i))
          {
            gcc_assert (!reg_deaths[i]);
            reg_deaths[i] = pbi->insn_num;
          }
    }

  /* Record and count the insns in which a reg dies.  If it is used in
     this insn and was dead below the insn then it dies in this insn.
     If it was set in this insn, we do not make a REG_DEAD note;
     likewise if we already made such a note.  */
  if ((pbi->flags & (PROP_DEATH_NOTES | PROP_REG_INFO))
      && some_was_dead
      && some_not_set)
    {
      /* Check for the case where the register dying partially
         overlaps the register set by this insn.  */
      if (regno_first != regno_last)
        for (i = regno_first; i <= regno_last; ++i)
          some_was_live |= REGNO_REG_SET_P (pbi->new_set, i);

      /* If none of the words in X is needed, make a REG_DEAD note.
         Otherwise, we must make partial REG_DEAD notes.  */
      if (! some_was_live)
        {
          if ((pbi->flags & PROP_DEATH_NOTES)
              && ! find_regno_note (insn, REG_DEAD, regno_first))
            REG_NOTES (insn)
              = alloc_EXPR_LIST (REG_DEAD, reg, REG_NOTES (insn));

          if (pbi->flags & PROP_REG_INFO)
            REG_N_DEATHS (regno_first)++;
        }
      else
        {
          /* Don't make a REG_DEAD note for a part of a register
             that is set in the insn.  */
          for (i = regno_first; i <= regno_last; ++i)
            if (! REGNO_REG_SET_P (pbi->reg_live, i)
                && ! dead_or_set_regno_p (insn, i))
              REG_NOTES (insn)
                = alloc_EXPR_LIST (REG_DEAD,
                                   regno_reg_rtx[i],
                                   REG_NOTES (insn));
        }
    }

  /* Mark the register as being live.  */
  for (i = regno_first; i <= regno_last; ++i)
    {
#ifdef HAVE_conditional_execution
      int this_was_live = REGNO_REG_SET_P (pbi->reg_live, i);
#endif

      SET_REGNO_REG_SET (pbi->reg_live, i);

#ifdef HAVE_conditional_execution
      /* If this is a conditional use, record that fact.  If it is later
         conditionally set, we'll know to kill the register.  */
      if (cond != NULL_RTX)
        {
          splay_tree_node node;
          struct reg_cond_life_info *rcli;
          rtx ncond;

          if (this_was_live)
            {
              node = splay_tree_lookup (pbi->reg_cond_dead, i);
              if (node == NULL)
                {
                  /* The register was unconditionally live previously.
                     No need to do anything.  */
                }
              else
                {
                  /* The register was conditionally live previously.
                     Subtract the new life cond from the old death cond.  */
                  rcli = (struct reg_cond_life_info *) node->value;
                  ncond = rcli->condition;
                  ncond = and_reg_cond (ncond, not_reg_cond (cond), 1);

                  /* If the register is now unconditionally live,
                     remove the entry in the splay_tree.  */
                  if (ncond == const0_rtx)
                    splay_tree_remove (pbi->reg_cond_dead, i);
                  else
                    {
                      rcli->condition = ncond;
                      SET_REGNO_REG_SET (pbi->reg_cond_reg,
                                         REGNO (XEXP (cond, 0)));
                    }
                }
            }
          else
            {
              /* The register was not previously live at all.  Record
                 the condition under which it is still dead.  */
              rcli = xmalloc (sizeof (*rcli));
              rcli->condition = not_reg_cond (cond);
              rcli->stores = const0_rtx;
              rcli->orig_condition = const0_rtx;
              splay_tree_insert (pbi->reg_cond_dead, i,
                                 (splay_tree_value) rcli);

              SET_REGNO_REG_SET (pbi->reg_cond_reg, REGNO (XEXP (cond, 0)));
            }
        }
      else if (this_was_live)
        {
          /* The register may have been conditionally live previously, but
             is now unconditionally live.  Remove it from the conditionally
             dead list, so that a conditional set won't cause us to think
             it dead.  */
          splay_tree_remove (pbi->reg_cond_dead, i);
        }
#endif
    }
}

/* Scan expression X for registers which have to be marked used in PBI.
   X is considered to be the SET_DEST rtx of SET.  TRUE is returned if
   X could be handled by this function.  */

static bool
mark_used_dest_regs (struct propagate_block_info *pbi, rtx x, rtx cond, rtx insn)
{
  int regno;
  bool mark_dest = false;
  rtx dest = x;

  /* On some platforms calls return values spread over several
     locations.  These locations are wrapped in a EXPR_LIST rtx
     together with a CONST_INT offset.  */
  if (GET_CODE (x) == EXPR_LIST
      && GET_CODE (XEXP (x, 1)) == CONST_INT)
    x = XEXP (x, 0);

  if (x == NULL_RTX)
    return false;

  /* If storing into MEM, don't show it as being used.  But do
     show the address as being used.  */
  if (MEM_P (x))
    {
#ifdef AUTO_INC_DEC
      if (pbi->flags & PROP_AUTOINC)
        find_auto_inc (pbi, x, insn);
#endif
      mark_used_regs (pbi, XEXP (x, 0), cond, insn);
      return true;
    }

  /* Storing in STRICT_LOW_PART is like storing in a reg
     in that this SET might be dead, so ignore it in TESTREG;
     but in some other ways it is like using the reg.

     Storing in a SUBREG or a bit field is like storing the entire
     register in that if the register's value is not used
     then this SET is not needed.  */
  while (GET_CODE (x) == STRICT_LOW_PART
         || GET_CODE (x) == ZERO_EXTRACT
         || GET_CODE (x) == SUBREG)
    {
#ifdef CANNOT_CHANGE_MODE_CLASS
      if ((pbi->flags & PROP_REG_INFO) && GET_CODE (x) == SUBREG)
        record_subregs_of_mode (x);
#endif

      /* Modifying a single register in an alternate mode
         does not use any of the old value.  But these other
         ways of storing in a register do use the old value.  */
      if (GET_CODE (x) == SUBREG
          && !((REG_BYTES (SUBREG_REG (x))
                + UNITS_PER_WORD - 1) / UNITS_PER_WORD
               > (REG_BYTES (x)
                  + UNITS_PER_WORD - 1) / UNITS_PER_WORD))
        ;
      else
        mark_dest = true;

      x = XEXP (x, 0);
    }

  /* If this is a store into a register or group of registers,
     recursively scan the value being stored.  */
  if (REG_P (x)
      && (regno = REGNO (x),
          !(regno == FRAME_POINTER_REGNUM
            && (!reload_completed || frame_pointer_needed)))
#if FRAME_POINTER_REGNUM != HARD_FRAME_POINTER_REGNUM
      && !(regno == HARD_FRAME_POINTER_REGNUM
           && (!reload_completed || frame_pointer_needed))
#endif
#if FRAME_POINTER_REGNUM != ARG_POINTER_REGNUM
      && !(regno == ARG_POINTER_REGNUM && fixed_regs[regno])
#endif
      )
    {
      if (mark_dest)
        mark_used_regs (pbi, dest, cond, insn);
      return true;
    }
  return false;
}

/* Scan expression X and store a 1-bit in NEW_LIVE for each reg it uses.
   This is done assuming the registers needed from X are those that
   have 1-bits in PBI->REG_LIVE.

   INSN is the containing instruction.  If INSN is dead, this function
   is not called.  */

static void
mark_used_regs (struct propagate_block_info *pbi, rtx x, rtx cond, rtx insn)
{
  RTX_CODE code;
  int flags = pbi->flags;

 retry:
  if (!x)
    return;
  code = GET_CODE (x);
  switch (code)
    {
    case LABEL_REF:
    case SYMBOL_REF:
    case CONST_INT:
    case CONST:
    case CONST_DOUBLE:
    case CONST_VECTOR:
    case PC:
    case ADDR_VEC:
    case ADDR_DIFF_VEC:
      return;

#ifdef HAVE_cc0
    case CC0:
      pbi->cc0_live = 1;
      return;
#endif

    case CLOBBER:
      /* If we are clobbering a MEM, mark any registers inside the address
         as being used.  */
      if (MEM_P (XEXP (x, 0)))
        mark_used_regs (pbi, XEXP (XEXP (x, 0), 0), cond, insn);
      return;

    case MEM:
      /* Don't bother watching stores to mems if this is not the
         final pass.  We'll not be deleting dead stores this round.  */
      if (optimize && (flags & PROP_SCAN_DEAD_STORES))
        {
          /* Invalidate the data for the last MEM stored, but only if MEM is
             something that can be stored into.  */
          if (GET_CODE (XEXP (x, 0)) == SYMBOL_REF
              && CONSTANT_POOL_ADDRESS_P (XEXP (x, 0)))
            /* Needn't clear the memory set list.  */
            ;
          else
            {
              rtx temp = pbi->mem_set_list;
              rtx prev = NULL_RTX;
              rtx next;

              while (temp)
                {
                  next = XEXP (temp, 1);
                  if (anti_dependence (XEXP (temp, 0), x))
                    {
                      /* Splice temp out of the list.  */
                      if (prev)
                        XEXP (prev, 1) = next;
                      else
                        pbi->mem_set_list = next;
                      free_EXPR_LIST_node (temp);
                      pbi->mem_set_list_len--;
                    }
                  else
                    prev = temp;
                  temp = next;
                }
            }

          /* If the memory reference had embedded side effects
             (autoincrement address modes), then we may need to kill
             some entries on the memory set list.  */
4028
          if (insn)
4029
            for_each_rtx (&PATTERN (insn), invalidate_mems_from_autoinc, pbi);
4030
        }
4031
 
4032
#ifdef AUTO_INC_DEC
4033
      if (flags & PROP_AUTOINC)
4034
        find_auto_inc (pbi, x, insn);
4035
#endif
4036
      break;
4037
 
4038
    case SUBREG:
4039
#ifdef CANNOT_CHANGE_MODE_CLASS
4040
      if (flags & PROP_REG_INFO)
4041
        record_subregs_of_mode (x);
4042
#endif
4043
 
4044
      /* While we're here, optimize this case.  */
4045
      x = SUBREG_REG (x);
4046
      if (!REG_P (x))
4047
        goto retry;
4048
      /* Fall through.  */
4049
 
4050
    case REG:
4051
      /* See a register other than being set => mark it as needed.  */
      mark_used_reg (pbi, x, cond, insn);
      return;

    case SET:
      {
        rtx dest = SET_DEST (x);
        int i;
        bool ret = false;

        if (GET_CODE (dest) == PARALLEL)
          for (i = 0; i < XVECLEN (dest, 0); i++)
            ret |= mark_used_dest_regs (pbi, XVECEXP (dest, 0, i), cond, insn);
        else
          ret = mark_used_dest_regs (pbi, dest, cond, insn);

        if (ret)
          {
            mark_used_regs (pbi, SET_SRC (x), cond, insn);
            return;
          }
      }
      break;

    case ASM_OPERANDS:
    case UNSPEC_VOLATILE:
    case TRAP_IF:
    case ASM_INPUT:
      {
        /* Traditional and volatile asm instructions must be considered to use
           and clobber all hard registers, all pseudo-registers and all of
           memory.  So must TRAP_IF and UNSPEC_VOLATILE operations.

           Consider for instance a volatile asm that changes the fpu rounding
           mode.  An insn should not be moved across this even if it only uses
           pseudo-regs because it might give an incorrectly rounded result.

           ?!? Unfortunately, marking all hard registers as live causes massive
           problems for the register allocator and marking all pseudos as live
           creates mountains of uninitialized variable warnings.

           So for now, just clear the memory set list and mark any regs
           we can find in ASM_OPERANDS as used.  */
        if (code != ASM_OPERANDS || MEM_VOLATILE_P (x))
          {
            free_EXPR_LIST_list (&pbi->mem_set_list);
            pbi->mem_set_list_len = 0;
          }

        /* For all ASM_OPERANDS, we must traverse the vector of input operands.
           We cannot just fall through here since then we would be confused
           by the ASM_INPUT rtx inside ASM_OPERANDS, which does not indicate
           a traditional asm unlike its normal usage.  */
        if (code == ASM_OPERANDS)
          {
            int j;

            for (j = 0; j < ASM_OPERANDS_INPUT_LENGTH (x); j++)
              mark_used_regs (pbi, ASM_OPERANDS_INPUT (x, j), cond, insn);
          }
        break;
      }

    case COND_EXEC:
      gcc_assert (!cond);

      mark_used_regs (pbi, COND_EXEC_TEST (x), NULL_RTX, insn);

      cond = COND_EXEC_TEST (x);
      x = COND_EXEC_CODE (x);
      goto retry;

    default:
      break;
    }

  /* Recursively scan the operands of this expression.  */

  {
    const char * const fmt = GET_RTX_FORMAT (code);
    int i;

    for (i = GET_RTX_LENGTH (code) - 1; i >= 0; i--)
      {
        if (fmt[i] == 'e')
          {
            /* Tail recursive case: save a function call level.  */
            if (i == 0)
              {
                x = XEXP (x, 0);
                goto retry;
              }
            mark_used_regs (pbi, XEXP (x, i), cond, insn);
          }
        else if (fmt[i] == 'E')
          {
            int j;
            for (j = 0; j < XVECLEN (x, i); j++)
              mark_used_regs (pbi, XVECEXP (x, i, j), cond, insn);
          }
      }
  }
}

#ifdef AUTO_INC_DEC

static int
try_pre_increment_1 (struct propagate_block_info *pbi, rtx insn)
{
  /* Find the next use of this reg.  If in same basic block,
     make it do pre-increment or pre-decrement if appropriate.  */
  rtx x = single_set (insn);
  HOST_WIDE_INT amount = ((GET_CODE (SET_SRC (x)) == PLUS ? 1 : -1)
                          * INTVAL (XEXP (SET_SRC (x), 1)));
  int regno = REGNO (SET_DEST (x));
  rtx y = pbi->reg_next_use[regno];
  if (y != 0
      && SET_DEST (x) != stack_pointer_rtx
      && BLOCK_NUM (y) == BLOCK_NUM (insn)
      /* Don't do this if the reg dies, or gets set in y; a standard addressing
         mode would be better.  */
      && ! dead_or_set_p (y, SET_DEST (x))
      && try_pre_increment (y, SET_DEST (x), amount))
    {
      /* We have found a suitable auto-increment and already changed
         insn Y to do it.  So flush this increment instruction.  */
      propagate_block_delete_insn (insn);

      /* Count a reference to this reg for the increment insn we are
         deleting.  When a reg is incremented, spilling it is worse,
         so we want to make that less likely.  */
      if (regno >= FIRST_PSEUDO_REGISTER)
        {
          REG_FREQ (regno) += REG_FREQ_FROM_BB (pbi->bb);
          REG_N_SETS (regno)++;
        }

      /* Flush any remembered memories depending on the value of
         the incremented register.  */
      invalidate_mems_from_set (pbi, SET_DEST (x));

      return 1;
    }
  return 0;
}

/* Try to change INSN so that it does pre-increment or pre-decrement
   addressing on register REG in order to add AMOUNT to REG.
   AMOUNT is negative for pre-decrement.
   Returns 1 if the change could be made.
   This checks the validity of the modified INSN in all respects.  */

static int
try_pre_increment (rtx insn, rtx reg, HOST_WIDE_INT amount)
{
  rtx use;

  /* Nonzero if we can try to make a pre-increment or pre-decrement.
     For example, addl $4,r1; movl (r1),... can become movl +(r1),...  */
  int pre_ok = 0;
  /* Nonzero if we can try to make a post-increment or post-decrement.
     For example, addl $4,r1; movl -4(r1),... can become movl (r1)+,...
     It is possible for both PRE_OK and POST_OK to be nonzero if the machine
     supports both pre-inc and post-inc, or both pre-dec and post-dec.  */
  int post_ok = 0;

  /* Nonzero if the opportunity actually requires post-inc or post-dec.  */
  int do_post = 0;

  /* From the sign of increment, see which possibilities are conceivable
     on this target machine.  */
  if (HAVE_PRE_INCREMENT && amount > 0)
    pre_ok = 1;
  if (HAVE_POST_INCREMENT && amount > 0)
    post_ok = 1;

  if (HAVE_PRE_DECREMENT && amount < 0)
    pre_ok = 1;
  if (HAVE_POST_DECREMENT && amount < 0)
    post_ok = 1;

  if (! (pre_ok || post_ok))
    return 0;

  /* It is not safe to add a side effect to a jump insn
     because if the incremented register is spilled and must be reloaded
     there would be no way to store the incremented value back in memory.  */

  if (JUMP_P (insn))
    return 0;

  use = 0;
  if (pre_ok)
    use = find_use_as_address (PATTERN (insn), reg, 0);
  if (post_ok && (use == 0 || use == (rtx) (size_t) 1))
    {
      use = find_use_as_address (PATTERN (insn), reg, -amount);
      do_post = 1;
    }

  if (use == 0 || use == (rtx) (size_t) 1)
    return 0;

  if (GET_MODE_SIZE (GET_MODE (use)) != (amount > 0 ? amount : - amount))
    return 0;

  /* See if this combination of instruction and addressing mode exists.  */
  if (! validate_change (insn, &XEXP (use, 0),
                         gen_rtx_fmt_e (amount > 0
                                        ? (do_post ? POST_INC : PRE_INC)
                                        : (do_post ? POST_DEC : PRE_DEC),
                                        Pmode, reg), 0))
    return 0;

  /* Record that this insn now has an implicit side effect on REG.  */
  REG_NOTES (insn) = alloc_EXPR_LIST (REG_INC, reg, REG_NOTES (insn));
  return 1;
}

#endif /* AUTO_INC_DEC */

/* Find the place in the rtx X where REG is used as a memory address.
   Return the MEM rtx that so uses it.
   If PLUSCONST is nonzero, search instead for a memory address equivalent to
   (plus REG (const_int PLUSCONST)).

   If such an address does not appear, return 0.
   If REG appears more than once, or is used other than in such an address,
   return (rtx) 1.  */

rtx
find_use_as_address (rtx x, rtx reg, HOST_WIDE_INT plusconst)
{
  enum rtx_code code = GET_CODE (x);
  const char * const fmt = GET_RTX_FORMAT (code);
  int i;
  rtx value = 0;
  rtx tem;

  if (code == MEM && XEXP (x, 0) == reg && plusconst == 0)
    return x;

  if (code == MEM && GET_CODE (XEXP (x, 0)) == PLUS
      && XEXP (XEXP (x, 0), 0) == reg
      && GET_CODE (XEXP (XEXP (x, 0), 1)) == CONST_INT
      && INTVAL (XEXP (XEXP (x, 0), 1)) == plusconst)
    return x;

  if (code == SIGN_EXTRACT || code == ZERO_EXTRACT)
    {
      /* If REG occurs inside a MEM used in a bit-field reference,
         that is unacceptable.  */
      if (find_use_as_address (XEXP (x, 0), reg, 0) != 0)
        return (rtx) (size_t) 1;
    }

  if (x == reg)
    return (rtx) (size_t) 1;

  for (i = GET_RTX_LENGTH (code) - 1; i >= 0; i--)
    {
      if (fmt[i] == 'e')
        {
          tem = find_use_as_address (XEXP (x, i), reg, plusconst);
          if (value == 0)
            value = tem;
          else if (tem != 0)
            return (rtx) (size_t) 1;
        }
      else if (fmt[i] == 'E')
        {
          int j;
          for (j = XVECLEN (x, i) - 1; j >= 0; j--)
            {
              tem = find_use_as_address (XVECEXP (x, i, j), reg, plusconst);
              if (value == 0)
                value = tem;
              else if (tem != 0)
                return (rtx) (size_t) 1;
            }
        }
    }

  return value;
}

/* Write information about registers and basic blocks into FILE.
   This is part of making a debugging dump.  */

void
dump_regset (regset r, FILE *outf)
{
  unsigned i;
  reg_set_iterator rsi;

  if (r == NULL)
    {
      fputs (" (nil)", outf);
      return;
    }

  EXECUTE_IF_SET_IN_REG_SET (r, 0, i, rsi)
    {
      fprintf (outf, " %d", i);
      if (i < FIRST_PSEUDO_REGISTER)
        fprintf (outf, " [%s]",
                 reg_names[i]);
    }
}

/* Print a human-readable representation of R on the standard error
   stream.  This function is designed to be used from within the
   debugger.  */

void
debug_regset (regset r)
{
  dump_regset (r, stderr);
  putc ('\n', stderr);
}

/* Recompute register set/reference counts immediately prior to register
   allocation.

   This avoids problems with set/reference counts changing to/from values
   which have special meanings to the register allocators.

   Additionally, the reference counts are the primary component used by the
   register allocators to prioritize pseudos for allocation to hard regs.
   More accurate reference counts generally lead to better register allocation.

   It might be worthwhile to update REG_LIVE_LENGTH, REG_BASIC_BLOCK and
   possibly other information which is used by the register allocators.  */

void
recompute_reg_usage (void)
{
  allocate_reg_life_data ();
  /* distribute_notes in the combiner fails to convert some of the
     REG_UNUSED notes to REG_DEAD notes.  This causes CHECK_DEAD_NOTES
     in sched1 to die.  To solve this, update the death notes here.  */
  update_life_info (NULL, UPDATE_LIFE_LOCAL, PROP_REG_INFO | PROP_DEATH_NOTES);

  if (dump_file)
    dump_flow_info (dump_file);
}

struct tree_opt_pass pass_recompute_reg_usage =
{
  "life2",                              /* name */
  NULL,                                 /* gate */
  recompute_reg_usage,                  /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  0,                                    /* tv_id */
  0,                                    /* properties_required */
  0,                                    /* properties_provided */
  0,                                    /* properties_destroyed */
  0,                                    /* todo_flags_start */
  TODO_dump_func,                       /* todo_flags_finish */
  'f'                                   /* letter */
};

/* Optionally removes all the REG_DEAD and REG_UNUSED notes from a set of
   blocks.  If BLOCKS is NULL, assume the universal set.  Returns a count
   of the number of registers that died.  */

int
count_or_remove_death_notes (sbitmap blocks, int kill)
{
  int count = 0;
  unsigned int i = 0;
  basic_block bb;

  /* This used to be a loop over all the blocks with a membership test
     inside the loop.  That can be amazingly expensive on a large CFG
     when only a small number of bits are set in BLOCKS (for example,
     the calls from the scheduler typically have very few bits set).

     For extra credit, someone should convert BLOCKS to a bitmap rather
     than an sbitmap.  */
  if (blocks)
    {
      sbitmap_iterator sbi;

      EXECUTE_IF_SET_IN_SBITMAP (blocks, 0, i, sbi)
        {
          count += count_or_remove_death_notes_bb (BASIC_BLOCK (i), kill);
        }
    }
  else
    {
      FOR_EACH_BB (bb)
        {
          count += count_or_remove_death_notes_bb (bb, kill);
        }
    }

  return count;
}

/* Optionally removes all the REG_DEAD and REG_UNUSED notes from basic
   block BB.  Returns a count of the number of registers that died.  */

static int
count_or_remove_death_notes_bb (basic_block bb, int kill)
{
  int count = 0;
  rtx insn;

  for (insn = BB_HEAD (bb); ; insn = NEXT_INSN (insn))
    {
      if (INSN_P (insn))
        {
          rtx *pprev = &REG_NOTES (insn);
          rtx link = *pprev;

          while (link)
            {
              switch (REG_NOTE_KIND (link))
                {
                case REG_DEAD:
                  if (REG_P (XEXP (link, 0)))
                    {
                      rtx reg = XEXP (link, 0);
                      int n;

                      if (REGNO (reg) >= FIRST_PSEUDO_REGISTER)
                        n = 1;
                      else
                        n = hard_regno_nregs[REGNO (reg)][GET_MODE (reg)];
                      count += n;
                    }

                  /* Fall through.  */

                case REG_UNUSED:
                  if (kill)
                    {
                      rtx next = XEXP (link, 1);
                      free_EXPR_LIST_node (link);
                      *pprev = link = next;
                      break;
                    }
                  /* Fall through.  */

                default:
                  pprev = &XEXP (link, 1);
                  link = *pprev;
                  break;
                }
            }
        }

      if (insn == BB_END (bb))
        break;
    }

  return count;
}

/* Clear the LOG_LINKS fields of insns in the selected blocks, or of the
   whole chain if BLOCKS is NULL.  */

static void
clear_log_links (sbitmap blocks)
{
  rtx insn;

  if (!blocks)
    {
      for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
        if (INSN_P (insn))
          free_INSN_LIST_list (&LOG_LINKS (insn));
    }
  else
    {
      unsigned int i = 0;
      sbitmap_iterator sbi;

      EXECUTE_IF_SET_IN_SBITMAP (blocks, 0, i, sbi)
        {
          basic_block bb = BASIC_BLOCK (i);

          for (insn = BB_HEAD (bb); insn != NEXT_INSN (BB_END (bb));
               insn = NEXT_INSN (insn))
            if (INSN_P (insn))
              free_INSN_LIST_list (&LOG_LINKS (insn));
        }
    }
}

/* Given a register bitmap, turn on the bits in a HARD_REG_SET that
   correspond to the hard registers, if any, set in that map.  This
   could be done far more efficiently by having all sorts of special-cases
   with moving single words, but probably isn't worth the trouble.  */

void
reg_set_to_hard_reg_set (HARD_REG_SET *to, bitmap from)
{
  unsigned i;
  bitmap_iterator bi;

  EXECUTE_IF_SET_IN_BITMAP (from, 0, i, bi)
    {
      if (i >= FIRST_PSEUDO_REGISTER)
        return;
      SET_HARD_REG_BIT (*to, i);
    }
}


static bool
gate_remove_death_notes (void)
{
  return flag_profile_values;
}

static void
rest_of_handle_remove_death_notes (void)
{
  count_or_remove_death_notes (NULL, 1);
}

struct tree_opt_pass pass_remove_death_notes =
{
  "ednotes",                            /* name */
  gate_remove_death_notes,              /* gate */
  rest_of_handle_remove_death_notes,    /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  0,                                    /* tv_id */
  0,                                    /* properties_required */
  0,                                    /* properties_provided */
  0,                                    /* properties_destroyed */
  0,                                    /* todo_flags_start */
  0,                                    /* todo_flags_finish */
};

/* Perform life analysis.  */
static void
rest_of_handle_life (void)
{
  regclass_init ();

  life_analysis (dump_file, PROP_FINAL);
  if (optimize)
    cleanup_cfg (CLEANUP_EXPENSIVE | CLEANUP_UPDATE_LIFE | CLEANUP_LOG_LINKS
                 | (flag_thread_jumps ? CLEANUP_THREADING : 0));

  if (extra_warnings)
    {
      setjmp_vars_warning (DECL_INITIAL (current_function_decl));
      setjmp_args_warning ();
    }

  if (optimize)
    {
      if (initialize_uninitialized_subregs ())
        {
          /* Insns were inserted, and possibly pseudos created, so
             things might look a bit different.  */
          allocate_reg_life_data ();
          update_life_info (NULL, UPDATE_LIFE_GLOBAL_RM_NOTES,
                            PROP_LOG_LINKS | PROP_REG_INFO | PROP_DEATH_NOTES);
        }
    }

  no_new_pseudos = 1;
}

struct tree_opt_pass pass_life =
{
  "life1",                              /* name */
  NULL,                                 /* gate */
  rest_of_handle_life,                  /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  TV_FLOW,                              /* tv_id */
  0,                                    /* properties_required */
  0,                                    /* properties_provided */
  0,                                    /* properties_destroyed */
  TODO_verify_flow,                     /* todo_flags_start */
  TODO_dump_func |
  TODO_ggc_collect,                     /* todo_flags_finish */
  'f'                                   /* letter */
};

static void
rest_of_handle_flow2 (void)
{
  /* If optimizing, then go ahead and split insns now.  */
#ifndef STACK_REGS
  if (optimize > 0)
#endif
    split_all_insns (0);

  if (flag_branch_target_load_optimize)
    branch_target_load_optimize (epilogue_completed);

  if (optimize)
    cleanup_cfg (CLEANUP_EXPENSIVE);

  /* On some machines, the prologue and epilogue code, or parts thereof,
     can be represented as RTL.  Doing so lets us schedule insns between
     it and the rest of the code and also allows delayed branch
     scheduling to operate in the epilogue.  */
  thread_prologue_and_epilogue_insns (get_insns ());
  epilogue_completed = 1;
  flow2_completed = 1;
}

struct tree_opt_pass pass_flow2 =
{
  "flow2",                              /* name */
  NULL,                                 /* gate */
  rest_of_handle_flow2,                 /* execute */
  NULL,                                 /* sub */
  NULL,                                 /* next */
  0,                                    /* static_pass_number */
  TV_FLOW2,                             /* tv_id */
  0,                                    /* properties_required */
  0,                                    /* properties_provided */
  0,                                    /* properties_destroyed */
  TODO_verify_flow,                     /* todo_flags_start */
  TODO_dump_func |
  TODO_ggc_collect,                     /* todo_flags_finish */
  'w'                                   /* letter */
};
